The perils of cognitive biases

Cognitive biases are pervasive human instincts. They are essentially thinking shortcuts, or heuristics, which we use all the time to make quick judgments, and they are especially likely to crop up when we make decisions under time pressure. According to the psychologist Daniel Kahneman, heuristics and biases are products of our subconscious cognitive processes. In his bestseller, Thinking, Fast and Slow, Kahneman labels this intuitive form of thinking System 1, as opposed to the slow, deliberate, conscious thinking he calls System 2. The major weaknesses of the judgments that arise from System 1 thinking are neglect of uncertainty and suppression of doubt, two albatrosses that characterise biases and therefore pose clear dangers to medical decision-making.


There is an endless list of cognitive biases that may imperil our decision-making, as illustrated by Rolf Dobelli who, in his book The Art of Thinking Clearly, reviewed 99 biases that may impair our thinking. Other books and articles have also explored the specific hazard of cognitive biases in medical practice, and I have found several of these most useful.


From these sources we come to appreciate the heuristics, or cognitive shortcuts, that have the greatest influence on medical decision-making. They are:

Confirmation

This is the tendency to seek out only evidence that confirms our decisions and judgments, and to ignore any evidence that contradicts them. To illustrate confirmation bias, Leonard Mlodinow, in his book The Drunkard’s Walk: How Randomness Rules Our Lives, cited the philosopher Francis Bacon, who aptly captured this bias when he said: ‘the human understanding, once it has adopted an opinion, collects any instances that confirm it, and though the contrary instances may be more numerous and more weighty, it either does not notice them or else rejects them, in order that his opinion will remain unshaken’. It is clear therefore that, to avoid falling victim to this common bias, we must actively seek evidence to disprove our conclusions before acting on them.

Availability

This is the tendency for our opinions and decisions to be influenced by the ease with which we recall similar previous examples. This is particularly likely when the previous experiences were recent, or when they made a strong impression on us. The danger of availability is that the easily recalled examples may in reality have very little in common with the current problem. A recent patient, or a recent case report, for example, may have a strong influence on how a doctor approaches the next patient he or she assesses. To avoid this bias, it is therefore necessary to pause and check that one’s opinion is not being inappropriately swayed by mentally available examples.

Representativeness

This bias, also called pattern recognition, is the tendency for the prominent features of a problem to sway decision-making at the expense of less salient features. Whilst pattern recognition is an effective technique by which experts make judgments within their fields, it may inappropriately impair decision-making when the condition under consideration is rare. This bias often manifests as base-rate neglect: the tendency to make judgments whilst ignoring the frequency of the disease in the population. A rare disease may therefore take priority over disorders which are far more frequent in the population, as the worked example after the list below makes concrete. Doctors are therefore often warned of this bias with aphorisms such as:

  • Common things occur commonly
  • When you hear hoof beats, don’t think zebras
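
To see just how heavily the base rate weighs, here is a minimal sketch of the underlying Bayesian arithmetic, written in Python. All of the prevalence, sensitivity, and specificity figures below are invented for illustration only, not drawn from any clinical source; they treat a clinical pattern match as if it were a diagnostic test:

```python
# A minimal sketch of why base rates matter, using Bayes' theorem.
# All figures are illustrative assumptions, not clinical data.

def posterior(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Probability of disease given a positive finding (Bayes' theorem)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A presentation that fits a rare disease (1 in 10,000) almost perfectly...
rare = posterior(prevalence=1 / 10_000, sensitivity=0.99, specificity=0.95)

# ...versus the same presentation loosely matching a common disorder (1 in 50).
common = posterior(prevalence=1 / 50, sensitivity=0.80, specificity=0.95)

print(f"Rare disease, despite the near-perfect fit: {rare:.1%}")   # ~0.2%
print(f"Common disorder, despite the looser fit: {common:.1%}")    # ~24.6%
```

Even with a near-perfect pattern match, the rarity of the disease leaves it a far less likely explanation than the common disorder. That gap is the base rate which representativeness tempts us to ignore.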

Diagnosis momentum

This is the tendency for the diagnostic label that was first considered to become stuck and thereby hinder the consideration of alternative options. Croskerry illustrates how the label, initially just a possibility, ‘gathers increasing momentum’ and transforms into a definite diagnosis, in the process unwittingly excluding the consideration of more likely possibilities. This is unfortunately a major cause of diagnostic error, as we shall see in future blog posts. The opposite of diagnosis momentum, or cognitive fixation, is thematic vagabonding. Sidney Dekker discussed this in his book The Field Guide to Understanding ‘Human Error’, where he referred to the bias as butterfly mind: the tendency to jump from judgment to judgment as new information becomes available.

Hindsight

Hindsight itself is a useful strategy when the understanding of how a previous error occurred enables the institution of safeguards to prevent its recurrence. For example, in his book Judgment and Decision Making: Psychological Perspectives, David Hardman explains that hindsight helps to guide future behaviour by making sense of past events. Hindsight bias may however make us imagine that the characteristics of past events were clearer than they actually were at the time they were evolving. Woods and Cook, writing in the Handbook of Applied Cognition, also show that hindsight oversimplifies past events, for example by treating multidimensional phenomena as unidimensional. In these ways, hindsight bias may deceptively strengthen the confidence we have in our decisions and hinder the consideration of alternative options. As we shall see in future blog posts, hindsight may also enable the easy attribution of blame after patient safety incidents.

Framing

Framing is the tendency for the context in which a problem is presented to sway judgment, often at the expense of other aspects of the problem. A patient’s gender or ethnicity, for example, may lead to the consideration of specific diagnoses, even when other diagnoses better explain the presentation.

Anchoring 

Anchoring is the tendency for the first information we gather to determine the judgment we finally make. Whilst this may be right in some cases, the heuristic often works to the detriment of important information that emerges subsequently.

Other cognitive biases

There are, of course, several other cognitive biases relevant to diagnostic error, and these include aggregate bias, ascertainment bias, commission bias, feedback sanction, fundamental attribution error, gambler’s fallacy, gender bias, multiple alternatives bias, omission bias, order effects, outcome bias, overconfidence bias, playing the odds (frequency gambling), posterior probability error, premature closure, the primacy effect, the recency effect, search satisficing, Sutton’s slip, sunk costs, triage cueing, unpacking principle, vertical line failure, visceral bias, and Yin-Yang out.


In the next blog post we will take a look at three related subconscious cognitive phenomena that also influence our judgment and decision-making, and these are schemas, stereotypes, and the Einstellung effect.
