FADEC = complex
Recently, Mxsmanic posted:
Neil Gould writes:
One gets plenty of clues that something is going awry prior to this
happening.
No, one does not.
And, you know this because...?
The whole purpose of automation is to mask
information that contains such clues, in order to reduce the workload
for the pilot.
Well, Jose... this meets *my* criteria for "utter nonsense". Does it still
qualify as less so to you?
It appears that you are describing another form of pilot error. If
one believes that they can set an autopilot and then take a nap,
*that* is the problem, not the behavior of the autopilot.
A lot of commercial pilots do that. Long trips can get pretty boring.
What people may or may not do does not reassign the responsibility for
problems that their actions may create.
Given that so few accidents can be charged to the failure of these
devices, it may be reaching to claim that some unreasonable level
of danger is presented by their use.
A lot of accidents have occurred when automated systems allowed crews
to lose their situational awareness.
Autopilots are particularly implicated in this respect, perhaps
because they've been around so long and work so well.
The basis for this notion is...?
Neil