‘Go with your gut’, ‘Follow your first impression’, ‘Obey your hunches’. We frequently hear this advice, and Malcolm Gladwell wrote a successful book, Blink: The Power of Thinking Without Thinking, on the value of heeding your instincts.
But a downside is that it makes us susceptible to confirmation bias. Once we’ve latched onto a first impression, we interpret all the evidence as supporting it. My TED talk highlighted how, even if the facts are consistent with our own theory, they might also be consistent with rival theories — but we don’t think about rival theories if we’re taught to go with our gut. This is particularly dangerous for doctors, who might be wedded to an initial diagnosis regardless of what the tests turn up.
An insightful New Yorker article tells the story of Dr Harrison Alter, who worked in the emergency room of an Arizona hospital. One day, a woman it calls Blanche arrived at the ER with breathing difficulties. She told Dr Alter she’d been unwell for several days. Initially, she thought it was a bad head cold and took a few aspirin, but got worse. Alter recorded her temperature as 100.2°F, and noticed she was breathing at twice the normal rate, so he immediately suspected viral pneumonia — it was at the front of his mind as he’d treated many cases over the past few weeks. He tested his hunch by listening to Blanche’s lungs, but didn’t find the mucus accumulation that would indicate pneumonia. He ran a chest X-ray, but there were no white streaks in her lungs, nor was her white blood cell count elevated.
But a blood electrolyte test found her blood had become slightly acidic, which indicated a major infection, consistent with Alter’s initial conjecture. He explained away the other tests by diagnosing subclinical pneumonia — the early stages of pneumonia, which is why the infection didn’t yet show up on an X-ray. He gave Blanche intravenous fluids and medicine to bring her fever down. He put her in the care of an internal medicine specialist and then moved on to the next emergency patient.
A few minutes later, the specialist approached Alter and told him that Blanche didn’t have viral pneumonia, but aspirin toxicity — ‘a few’ aspirin turned out to be several dozen. As Alter recounted, ‘She was an absolutely classic case — the rapid breathing, the shift in her blood electrolytes — and I missed it. I got cavalier.’ Even though the diagnosis was obvious with a clear head, it wasn’t obvious once Alter’s first impression was viral pneumonia. When the initial tests came back negative, he didn’t think to explore rival theories; instead, he attributed the negative results to the pneumonia being in its early stages.
Alter’s story is a single anecdote. A scientific study, entitled ‘Confirmation bias: why psychiatrists stick to wrong preliminary diagnoses’, investigated this behaviour systematically. Seventy-five psychiatrists were given the following case study of a patient:
Mr L (65 years old) is delivered to your clinic by the emergency medical services. He seems to be heavily sedated and the physician’s referral note states ‘suspicion of overdose on sleeping pills (flurazepam = Dalmadorm®)’.
By the next day, Mr L is fully alert and tells you the following:
He has been married for 32 years and lives with his wife in Munich. Up until his retirement two years ago, he worked for an electrical company as an accountant. He states that he is actually a happy and fun-loving person, but for some time now, he has frequently been quite sad and often on the brink of tears.
Mr L is neat in appearance and well kempt but seems dejected. It’s been decided to keep Mr L in the clinic for further diagnostic work-up.
The researchers asked the psychiatrists whether they suspected depression or Alzheimer’s, and 97% guessed the former. They then offered the psychiatrists 12 additional pieces of information, each in summary form. Six supported the Alzheimer’s diagnosis and the other six depression. The summaries were brief, so the subjects had to click through to read the full information if they thought it useful. They could click on as many as they liked, and then had to give a final opinion. The statements supporting Alzheimer’s were much stronger than those supporting depression, so Alzheimer’s was the correct diagnosis.
Of the subjects, 13% showed confirmation bias in their search, reading more items consistent with their initial hunch than with the alternative; 44% did the opposite, and 43% were balanced. The search process mattered: 70% of the confirmatory searchers ended up making the wrong diagnosis, compared with 27% and 47% for the disconfirmatory and balanced groups respectively. When the researchers repeated the study on 75 medical students, 25% displayed confirmation bias, of whom 63% gave the wrong diagnosis.
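The arithmetic implied by these percentages can be made concrete. The sketch below uses only the figures reported in the text (75 psychiatrists, the three group shares, and the error rate for each group, matched to the groups in the order the text lists them); the head-counts are rounded, since 13% of 75 is not a whole number.

```python
# Back-of-the-envelope illustration of the psychiatrist study's numbers.
# Shares and error rates are from the text; the pairing of error rates
# to the second and third groups follows the order the text lists them.

n = 75  # psychiatrists in the study

groups = {
    "confirmatory search": (0.13, 0.70),    # (share of sample, wrong-diagnosis rate)
    "disconfirmatory search": (0.44, 0.27),
    "balanced search": (0.43, 0.47),
}

total_wrong = 0.0
for name, (share, error_rate) in groups.items():
    members = n * share
    wrong = members * error_rate
    total_wrong += wrong
    print(f"{name}: ~{members:.0f} psychiatrists, ~{wrong:.0f} wrong diagnoses")

# Overall implied error rate across all 75 subjects
print(f"overall implied error rate: {total_wrong / n:.0%}")
```

Running this shows the overall implied error rate is roughly 41% — the confirmatory group is small, but its 70% error rate makes it disproportionately costly.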
Even though 13% and 25% aren’t large, that’s still 13% and 25% too many, particularly since guarding against bias is a key part of medical training. And this problem extends to many other high-stakes settings, such as ignoring alternative suspects in a criminal investigation, alternative candidates for a company’s CEO, or alternative partners to marry.