1988 - Iran Air Flight 655, an Iranian civilian passenger flight, is shot down by surface-to-air missiles fired from the USS Vincennes, killing all 290 people on board. The commanding officer believed that the Iranian Airbus was actually an enemy fighter – a belief developed in a high-pressure situation with complicated, confusing and contradictory information.
Fast-forward eight years to 1996. Eight climbers die on Mt. Everest when a snowstorm catches them near the summit. Having spent too much time and energy on the ascent, the group failed to realise they could not safely make the return trip. The two expedition leaders, overwrought and overconfident in their skills, let that state bias their judgments and risk assessments.
How could these two fatal yet seemingly unrelated incidents be the consequence of the same thing?
It is simple: these and literally millions of other bad decisions come down to one fatal flaw – cognitive bias.
Cognitive biases operate not only in life-and-death disasters but in day-to-day reasoning and decision-making – and they cost lives.
So what is cognitive bias? It is an error in thinking or judgment, where inferences about a situation are drawn in an illogical fashion: individuals create their own "subjective reality" from their perceptions. Neuroscientists tell us that cognitive bias originates in the primitive, emotional part of our brain known as the "limbic system".
Our limbic system can be seen as a bit of a "frenemy". For those of us who are not teenage girls, or don't live with one, a frenemy is somebody – or in this case something – that we can rely upon as a friend in some situations but that is an enemy in others. A "fair-weather friend".
The limbic system of our brain wires itself from birth to chemically reward or punish us for our responses to certain external stimuli – mainly perceived threats that demand quick, proven reactions, faster than more deliberate thinking would allow.
It's from this part of the brain that we get our instinctive "fight, flight or freeze" responses. Before you have time to think "oh, that's a big brown snake I'm about to step on", the limbic system has processed the potential threat, bypassed the rational brain and given you the immediate response for "snake" – whether that be fight, flight or freeze (personally I am a big fan of flight!).
All very fast and potentially life-saving stuff, but it can also be an absolute disaster when that same limbic system gets triggered and involved in what should be more "rational" responses and decisions to solve complex problems.
The limbic system cannot discriminate between the snake and other 'threatening' situations, such as someone aggravating us on the football field, being overloaded with information and tasks, or being tired and emotional under pressure.
That fast, intuitive part of the brain that is designed to save our life in some situations – and mostly does a great job – can also take shortcuts under certain circumstances and taint complex problem-solving with responses that are too emotional and simplistic.
And that is "cognitive bias": emotional, ill-thought-out responses that can override the rational brain, cause us to ignore important facts and, in the heat of the moment, favour our ingrained biases – no matter how obvious it might seem later that it was a really stupid thing to do.
This all happens subconsciously, and so fast, that it takes a lot to beat it. It is the fatal flaw in humans.
So it would seem that mastering cognitive biases could save us both literally and metaphorically. Organisations such as the military, the medical profession and aviation use training, teamwork and discipline to beat it.
When task and information overload hit, ideally systems and checklists are in place, along with a balance of team members with varying thinking and behavioural styles, to help catch cognitive bias before it impacts our decision-making.
Humans will always react more emotionally when under pressure; it's the speed with which individuals and teams can catch themselves doing it that matters.
When it takes over without being corrected, well that’s when the wrong aircraft gets shot down, or people try to summit Everest even when it should be obvious that it is in fact time to head back down to base camp.
Psychologists liken it to a mahout riding an elephant, or a jockey riding a horse. The rider might be smarter and more rational, but if the stronger, more intuitive beast takes control in the heat of the moment, it can be hard to rein it back in.
The fatal flaw of cognitive bias underpins so many airline disasters, survival failures, oil spills, and relationship and business failures.
Where teams or leaders have a better handle on how to rise above it, then the fatal flaw can be beaten.
Often called “human error”, it’s why some of the best-trained airline pilots can fly perfectly good aircraft into the ground.
So can we overcome this flaw? In short, yes. Just as there are many examples where cognitive bias has cost lives, there are also many where this "fatal flaw" was averted.
In 2010 a Qantas Airbus A380 suffered a catastrophic engine failure, with hundreds of lives at risk. The problem had no precedent, so the existing manuals and checklists were useless.
If this complex problem was not solved by the flight deck crew, and quickly, all would perish.
Luckily the pilot had years of military and civil flying experience, as well as a great crew on board. They also employed a system nicknamed G.R.A.D.E. – a tool Qantas pilots are taught in order to remove emotion and bias from emergency decision-making. It stands for Gather, Review, Analyse, Decide and then Evaluate. Pilots are trained to use this system to solve the problem with logic rather than bias and emotion. They are also taught to work as a team to help spot and contain one another's biases under pressure, before they impact decision-making.
Another example is the so-called "Miracle on the Hudson", where Captain Sully – played by Tom Hanks in the movie – managed to stay calm and fall back on similar training to land a stricken aircraft on the Hudson River. Had he not remained calm and removed bias from his critical decision-making, it could have ended very badly indeed.
All too often the disaster is not due to a computer glitch, a malfunctioning part or the weather, but to that little flesh-and-blood "frenemy" that resides right up here between our own ears.
There's an expression I once heard about how people deal with pressure when they are about to succumb to this "fatal flaw": "You won't rise to the occasion; you'll drop to your level of training."
So knowing your own strengths and weaknesses, having some great training in how to deal with emergencies, and then working in a good team can help avert this “fatal flaw”.