As we saw in the previous blog post, titled the problem of thinking, thinking is hard and is associated with several pitfalls. It is therefore not surprising that cognitive errors exact a heavy toll on patient safety. For example, in his article Clinical cognition and diagnostic error: applications of a dual process model of reasoning, Pat Croskerry showed that more than 75% of diagnostic errors can be attributed to ‘physician thinking failure’. Ian Scott, in his paper titled Errors in clinical reasoning: causes and remedial strategies, pointed out that autopsies identify missed diagnoses in 10-20% of hospital admissions. There is therefore a strong imperative to mitigate the cost of healthcare-related thinking errors.
The 1st step in improving our thinking, as highlighted by Croskerry and Nimmo in their paper titled Better clinical decision making and reducing diagnostic error, is to ‘think about, and understand, our thinking processes’, because this helps to ‘improve our decision making, including diagnosis’. The importance of this simple strategy is underscored by Croskerry in another paper titled Cognitive forcing strategies in clinical decisionmaking, where he argued that most doctors are aware neither of their own thinking processes nor of the biases that threaten their thinking.
The 2nd step in improving our thinking is to implement what Croskerry termed cognitive debiasing strategies. Exploring these in his paper titled The importance of cognitive errors in diagnosis and strategies to minimize them, he said these strategies help to entrench self-monitoring in the decision-making process, and in this way they reduce the errors that arise from faulty thinking. The strategies he discussed include the following:
- Metacognition – stepping back to reflect on one’s thinking process and how specific cognitive biases affect it
- Reduced reliance on memory
- The use of cognitive aids such as mnemonics, clinical practice guidelines, and algorithms
- Using mental rehearsal or ‘cognitive walkthrough’ strategies
- Forced consideration of alternatives, such as by generating and assessing a differential diagnosis
Nimmo and Croskerry also suggested the following strategies to overcome thinking errors:
- Routinely challenging our judgments, such as by seeking out disconfirming evidence against them
- Reframing the problem in different ways to open different perspectives of understanding
- Using narrative, metaphor, and analogies ‘to make connections between different aspects of the problem’.
- The use of ‘worst case scenario’ thinking as ‘a reference point for perspective’
In a striking approach to eliminating thinking errors, Mark Graber referred to the ‘ten commandments’ of reducing cognitive errors in his paper titled Educational strategies to reduce diagnostic error: can you teach this stuff? He attributed these to Leo Leonidas; the commandments capture some of the themes mentioned earlier and add a few more. The 10 commandments are:
- Thou shalt reflect on how you think and decide
- Thou shalt not rely on your memory when making critical decisions
- Thou shalt make your working environment information-friendly by using the latest wireless technology
- Thou shalt consider other possibilities even though you are sure of your first diagnosis
- Thou shalt know Bayesian probability and the epidemiology of the diseases in your differential diagnosis
- Thou shalt mentally rehearse common and serious conditions that you expect to see in your specialty
- Thou shalt ask yourself whether you, or a specialist, should make the final decision, after considering the patient’s values and wishes
- Thou shalt take time to decide and not be pressured by anyone
- Thou shalt create accountability procedures and follow up for decisions made
- Thou shalt record your patient’s problems and decisions in relational database software for review and improvement
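The fifth commandment, on Bayesian probability, deserves a brief illustration of why it matters. The sketch below uses hypothetical test characteristics (the numbers are illustrative, not taken from Graber’s paper) to show how disease prevalence tempers the weight of a positive test result:

```python
def posterior_probability(prevalence, sensitivity, specificity):
    """Probability of disease given a positive test, by Bayes' theorem."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A test with 90% sensitivity and 95% specificity, applied to a disease
# with 1% prevalence, yields a surprisingly modest post-test probability:
print(round(posterior_probability(0.01, 0.90, 0.95), 3))  # → 0.154
```

Even a fairly accurate test, applied to a rare condition, leaves the diagnosis more likely wrong than right – which is why the commandment pairs Bayesian probability with knowledge of disease epidemiology.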
The 3rd step in overcoming thinking failures is to learn to detect and avoid the errors that can wrongly sway our conclusions when we make judgments. The major threat here is faulty logic. Examples of such logical fallacies are:
- The conjunction fallacy: wrongly judging the co-occurrence of two events to be more probable than either event occurring alone
- The Texas sharpshooter fallacy: wrongly assuming that random events occurring together constitute a pattern
- The gambler’s fallacy: wrongly assuming that the outcomes of past independent events alter the probability of future ones, such as expecting tails to be ‘due’ after a run of heads
- The sunk cost fallacy: persisting with a course of action, even when it is clearly failing, simply because one has already invested heavily in it
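The conjunction fallacy in particular has a simple mathematical basis: the probability of two events occurring together can never exceed the probability of either event alone. A minimal sketch, using made-up probabilities for two hypothetical findings, makes this concrete:

```python
# Conjunction rule: P(A and B) <= P(A), always.
# The probabilities below are invented purely for illustration.
p_chest_pain = 0.30                         # P(A)
p_raised_troponin = 0.10                    # P(B)
p_both = p_chest_pain * p_raised_troponin   # P(A and B), assuming independence

# The conjunction is necessarily the least probable of the three:
assert p_both <= p_chest_pain and p_both <= p_raised_troponin
print(round(p_both, 2))  # → 0.03
```

Intuition often rates the richer, more specific story as the more likely one; the arithmetic shows why that intuition is always wrong.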
The 4th step in preventing cognitive failure is to stop our thinking from being unwittingly swayed by factors that are extraneous to the issue at hand. This is important when weighing up the strength of the evidence and the arguments that go into our decision-making. Fallacious appeals that can wrongly influence our judgments include:
- The authority of the person making the argument: argumentum ad verecundiam
- The verbosity with which the argument is made: argumentum verbosum
- The frequency with which the argument is repeated: argumentum ad nauseam
- The tradition on which the argument rests: argumentum ad antiquitatem
- The novelty of the argument: argumentum ad novitatem
- The personality of the person making the argument: argumentum ad hominem
We have now concluded our review of the cognitive foundations of human error. In the next blog post, we will begin our exploration of the behavioural foundations of human error.