Deadly set: how too much focus causes mistakes.

Aviation psychologist David Beaty on the phenomenon of ‘Set’ (1991):

‘Set’ is a survival characteristic we have inherited. The human brain evolved to help individuals live and survive circumstances very different from our own. It predisposes us to select our focus on that part of the picture paramount at the time – a vision often so totally focused that it ignores the rest of the environment.

Once something is identified […] it takes on a reality of its own and sticks in the mind like a burr which is difficult to dislodge. […] The mind becomes tunnelled on a particular course of action. Add to that the ingredient of fatigue and it is not difficult to see that a ‘set’ as hard as concrete can result. Furthermore, ‘set’ is infectious. There is a follow-my-leader syndrome. So it is easy to see why most aircraft accidents are caused by ‘silly’ mistakes in the approach and landing phase.

[…] ‘Set’ has been a factor in many aircraft accidents. [In a case in 1972 over Florida], the crew of a Tristar were not sure that their undercarriage was down. The accident sequence was begun by a burned-out light bulb in the system which is designed to show that the undercarriage is down and locked. […] the crew examined every possible way of finding the trouble. The flight engineer crawled down into the nose, while the captain and the first officer tried every combination of switches and circuit-breakers. […] the three members of crew did not notice that the autopilot had become disengaged and the aircraft was sinking […] eventually crashing into the Everglades.

Because they had become preoccupied with an unsafe landing-gear indication, they failed to monitor the critical altimeter readings. Ironically, the air traffic controller noticed on his radar that the aircraft was losing height, but instead of pointing this out simply asked diplomatically “How are things coming along there?”.

The crew, still obsessed with their landing-gear problem, assumed he referred to that, for they could think of nothing else, and replied seconds before the crash, “Everything is all right!”

– Paraphrased from The Naked Pilot by David Beaty (Ch.6)

Aircraft accidents make grisly reading, but they are one of the most accurately documented and analysed areas of collaborative behaviour in humans. It’s vital for us to understand how and why we make mistakes – not just in safety-critical systems but in all walks of life. When I read the passage above, I see parallels with so many of the mistakes I make on a daily basis at work and at home. I can see myself in every role: the captain, the flight engineer, the first officer, the air traffic controller. You should too.

It’s so disappointing to see how the field of UX has latched onto psychological research findings recently. Take cognitive biases, for example: instead of seeing our own weaknesses in them, we’ve decided to use them as tools of manipulation against the people we design for. “Only 5 items left; sale ends today; untick this box if you don’t want to receive emails, but tick the next one if you do…” This sucks.

Intelligence analysis is another field, like safety-critical systems design, where bad decisions cost lives. If you pick up a textbook on intelligence analysis, the chapter on cognitive biases will be written from the perspective of the analyst’s own reasoning abilities. Understanding your weaknesses helps you avoid making mistakes in the future.

This is the way we should be thinking about thinking. Let’s analyse ourselves for a change.