When Air France 447 crashed into the ocean off the coast of Brazil, the cause was human diagnostic failure. At the time of the crash, every important mechanical and electronic system onboard was functioning normally. And every primary flight instrument was screaming the same thing: the aircraft’s nose is too high. It requires more power to maintain level flight than the engines are capable of delivering.

The solution was simple: Push the stick forward. Lower the nose.

That solution wasn’t implemented. Because the humans at the controls failed to diagnose what the machine was telling them. Diagnosis had to wait for the forensic investigation.

My late wife, Sophia, died in the same way. Despite all the machines’ information, no accurate human diagnosis was made. Not until the post-mortem forensic investigation.

Also known as her autopsy.

If there is a commonality between the two failures to diagnose, that commonality is that there was simply too much information. Not too little, but too much. Resulting in a classic failure of human cognition: an inductive failure to see the proverbial forest for the proverbial trees.

Any conclusion about the state of that Airbus A330 should have come from the following facts: A) The aircraft was losing altitude, as every altimeter on board agreed, including the radar altimeter in the aircraft’s final moments. B) The nose was high, as every attitude indicator professed. C) The plane’s airspeed was low*. D) The engines were operating normally.

That deductive conclusion: the airplane’s angle of attack is too high to sustain level flight within the performance envelope of the aircraft. Deductive conclusions are logical ones. Logical conclusions are things that machines are good at and men are sometimes ill-equipped to make, such as when they are under stress. Machines don’t feel stress.

Why the aircraft’s computers were not programmed to realize this deductive conclusion themselves, blank the confusing cockpit displays, and simply blare “push the stick forward!” is a mystery. But for reasons having to do with our fear of the rise of the machines, the computers are instead programmed to cede all authority and simply blare the electronic version of “We’re outta here!” at the first sign of trouble.

*There’s a wrinkle. A rhetorical sin of omission was committed when I said that the plane’s instruments accurately indicated the airspeed was too low. The event that set all the other events in motion was actually temporary icing of one of the airspeed probes and a short but important failure of the airspeed sensing system. It’s that one transient first sign of trouble that sent the computers packing.

When the system swiftly came back online the computers had already abandoned ship and could no longer offer any help.

That left the three humans in the cockpit with a morass of warning lights, confusing indications, too much information, too little help, and too much stress. Emotional intelligence took over from there, kicking in both fight and flee responses in an environment totally unsuited to either.

Unable to flee, the men instead fought -- quibbling among themselves all the way to their marine oblivion.

For Sophia, who died of cardiac amyloidosis, it was much the same. The machines performed well. The EKG said everything was normal, except that her voltages were kind of low. As if there were something getting between her heart and the machine, attenuating the signals.

The echocardiogram said everything was normal, except that her ejection fraction was low for her age. As if there were something preventing her heart from fully filling.

The MRIs said everything was normal, except that her heart appeared a bit stiff after contracting, as if there were some compromise by a foreign body in the muscle.

She had a heart catheterization. She looked clean as a whistle inside, and all her oxygen levels were normal.

Blood labs? Totally normal. Pulmonary function? Normal. Blood pressure? A bit low, but not that low. Everything else, normal, except that she just kept getting weaker and weaker for no clear reason, just as AF 447 kept descending towards the ocean, for no clear reason. No reason, at least, to the stressed humans trying to stop both.

Now there were a couple of pieces of contradictory information, which only magnified the stress. The first was that she was completely outside the “normal” demographic for what is already a very rare disease. She was young, she was healthy, she was (part) Asian, and she was a woman.

Still, one of her cardiologists suggested a biopsy of her heart muscle. To check for something he called “amyloidosis.” When I asked what that was, he vaguely waved it off as a “nebulous protein” condition. When I asked what the treatment was, he replied: “oh, there are some things we can do.” Nebulous didn’t sound scary to this layman. Neither did protein. Whatever.

But between that conversation and her biopsy a couple weeks later, I found out what amyloidosis was and it scared me to death. I didn’t share my findings with Sophia because by then I had convinced myself that her having that condition was a statistical impossibility. I decided it was better to ignore what the machines were telling us because they were only saying one thing: the ocean is coming up, fast.

By the way, the doctor who performed the biopsy (not the same as the one who suggested it in the first place) felt the same way — no way did she have amyloidosis — and when the lab test incorrectly came back negative, we all breathed a sigh of relief.

But we were wrong. For a lot of the reasons that the pilots of flight 447 were wrong: we got a lot of information that told us everything was OK and we ignored the information that said “except for …”

We did a lot of inductive reasoning when the evidence supported a deductive conclusion. We didn’t listen to the machines when they told us what we needed to hear, and listened only when they told us what we wanted to hear.

This is not to criticize one form of reasoning over another. It is rather to point out that we technologists need to recognize not only the limits of our machinery but also its strengths, particularly in mechanically deducing a conclusion from a set of facts.

And as designers of systems we are bound to recognize the human tendency towards inductive reasoning, which is, after all, a form of creativity born of experience. And as much as medical diagnosis is straightforward and deductive, it is also a creative, inductive, and artistic process. Our machines have a long way to go in helping us with the former, which can free us for the latter.