The danger of first impressions

1 Dec 2023 | Confirmation bias

‘Go with your gut’, ‘Follow your first impression’, ‘Obey your hunches’. We frequently hear this advice, and Malcolm Gladwell wrote a successful book, Blink: The Power of Thinking Without Thinking, on the value of heeding your instincts.

But a downside of this advice is that it makes us susceptible to confirmation bias. Once we’ve latched onto a first impression, we interpret all the evidence as supporting it. My TED talk highlighted how, even if the facts are consistent with our own theory, they might also be consistent with rival theories; yet we don’t think about rival theories if we’re taught to go with our gut. This is particularly dangerous for doctors, who might stay wedded to an initial diagnosis regardless of what the tests turn up.

An insightful New Yorker article tells the story of Dr Harrison Alter, who worked in the emergency room of an Arizona hospital. One day, a woman whom it calls Blanche arrived at the ER with breathing difficulties. She told Dr Alter she’d been unwell for several days. Initially, she thought it was a bad head cold and took a few aspirin, but she got worse. Alter recorded her temperature as 100.2°F and noticed she was breathing at twice the normal rate, so he immediately suspected viral pneumonia; it was at the front of his mind because he’d treated many cases over the past few weeks. He tested his hunch by listening to Blanche’s lungs, but didn’t find the mucus accumulation that would indicate pneumonia. He ran a chest X-ray, but there were no white streaks in her lungs, nor was her white blood cell count elevated.

But a blood electrolyte test found that her blood had become slightly acidic, which indicated a major infection, consistent with Alter’s initial conjecture. He explained away the other test results by diagnosing subclinical pneumonia: the early stages of pneumonia, which is why the infection didn’t yet show up on an X-ray. He gave Blanche intravenous fluids and medicine to bring her fever down. He put her in the care of an internal medicine specialist and then moved on to the next emergency patient.

A few minutes later, the specialist approached Alter and told him that Blanche didn’t have viral pneumonia but aspirin toxicity: ‘a few’ aspirin turned out to be several dozen. As Alter recounted, “she was an absolutely classic case — the rapid breathing, the shift in her blood electrolytes — and I missed it. I got cavalier.” Even though the diagnosis was obvious to someone approaching the case with a clear head, it wasn’t obvious once Alter’s first impression was viral pneumonia. When the initial tests came back negative, he didn’t think to explore rival theories; instead, he attributed the negative results to the pneumonia being in its early stages.

Alter’s story is a single anecdote. A scientific study, entitled ‘Confirmation bias: why psychiatrists stick to wrong preliminary diagnoses’, investigated this behaviour systematically. Seventy-five psychiatrists were given the following case study of a patient:

Mr L (65 years old) is delivered to your clinic by the emergency medical services. He seems to be heavily sedated and the physician’s referral note states ‘suspicion of overdose on sleeping pills (flurazepam = Dalmadorm®)’.

By the next day, Mr L is fully alert and tells you the following:

He has been married for 32 years and lives with his wife in Munich. Up until his retirement two years ago, he worked for an electrical company as an accountant. He states that he is actually a happy and fun-loving person, but for some time now, he has frequently been quite sad and often on the brink of tears.

Mr L is neat in appearance and well kempt but seems dejected. It’s been decided to keep Mr L in the clinic for further diagnostic work-up.

The researchers asked the psychiatrists whether they suspected depression or Alzheimer’s, and 97% guessed the former. They then offered the psychiatrists 12 additional pieces of information, each in summary form. Six supported the Alzheimer’s diagnosis and the other six depression. The summaries were brief, so the subjects had to click through to read the full information if they thought it useful. They could click on as many as they liked, and then had to give a final opinion. The statements supporting Alzheimer’s were much stronger than those supporting depression, so Alzheimer’s was the correct diagnosis.

Of the subjects, 13% showed confirmation bias in their search: they read more items consistent with their initial hunch than with the alternative. 44% did the opposite, and 43% were balanced. This search process mattered: of the biased searchers, 70% ended up making the wrong diagnosis, compared with 27% and 47% for the other two groups. When the researchers repeated the study on 75 medical students, 25% displayed confirmation bias, of whom 63% gave the wrong diagnosis.

Even though 13% and 25% aren’t large, they’re still 13% and 25% too many, particularly since recognising such biases is a key part of medical training. And this problem extends to many other high-stakes settings, such as ignoring alternative suspects for a crime, alternative candidates for a company’s CEO, or alternative partners to marry.
