Just as safety failures can arise from individual healthcare episodes, they can also take root at the organisational level, where the stakes are higher and the consequences more profound. In his book The Field Guide to Understanding ‘Human Error’, Sidney Dekker described the typical trajectory of organisational failure: it often starts as a gradual drift from safe practice, passes through a stage in which the organisation manages to get the job done with limited resources, and culminates in a situation where it stops applying the rules altogether. Dekker noted that this drift into failure is characterised by compensating behaviours which are initially conformist but then become frankly aberrant, setting a progressively divergent course that departs from acceptable practice and evolves into a ‘normalisation of deviance’.
There are many factors that drive organisational failure and trigger catastrophic safety disasters. For example, Roger Buehler and colleagues, writing in the book Heuristics and Biases: The Psychology of Intuitive Judgment, pointed to the planning fallacy as a major cause. Illustrating with such failed projects as Eurofighter, the Sydney Opera House, and Denver Airport, they described how the planning process can be overridden by false optimism, and how this beguiles managers into making faulty planning predictions. They showed that at the heart of the planning fallacy is the tendency of planners to underestimate the different ways the future may evolve; this arises when planners neglect scenario thinking, that is, when they fail to consider the alternative ways a plan may unfold. In his own exploration of the drivers of organisational failure, Dekker identified other factors such as:
- The existence of goal conflicts
- A high drive for cost-effectiveness
- A mismatch between rules and the practical work
- Using previous successful deviations as proof of safety
- A poor safety culture which does not enable leaders to hear bad news
The Healthcare Commission Investigation into failings at Mid Staffordshire NHS Foundation Trust also discovered a wide range of causes of organisational safety failure which included:
- Weak leadership
- Absence of an open culture
- Pressure to meet targets
- Insulation of management from ‘the reality of poor care’
- Absence of clear protocols and pathways
- Inadequate staff numbers
- Poor staff training and development
- Poor supervision of junior staff
- Poor team interpersonal relationships
- Inadequate patient monitoring
- Poor audit processes
- Poor record keeping
There are three major approaches to mitigating the risk of organisational safety failure. The first addresses the planning errors which drive organisational failure, and one method of doing this, advocated by Gary Klein in his book Sources of Power: How People Make Decisions, is the deployment of crystal ball strategies. These methods, which include the building of decision scenarios and the application of the premortem, or prospective hindsight, hinge on imagining that the proposed plan has failed, and then using this as a trigger to look for the loopholes in the plan that gave rise to this hypothetical failure. Further measures to mitigate the influence of the planning fallacy are highlighted by Roger Buehler and colleagues, and these include taking an outside, neutral perspective, and using recall-relevance manipulation to encourage the linking of past experience with current plans.
The second approach to preventing organisational failure, as explored by Sidney Dekker, is the improvement of the safety climate of organisations. Dekker recommended the following ways of doing so:
- Learning from the failure of other organisations
- Not relying on past successes as guarantee against future failure
- Regular discussions around safety
- Bringing in fresh perspectives such as from other industries
- Instituting a devil’s advocate in the system
- Metamonitoring – monitoring how the organisation monitors safety
The third approach to reducing the threat of organisational safety failure is to improve organisational structures, and this is discussed by James Surowiecki in his book The Wisdom of Crowds: Why the Many Are Smarter Than the Few. Surowiecki argued that the fostering of diversity in organisations is fundamental to this approach because it ‘increases the number of possible options’ available to the organisation, it ‘allows the group to conceptualize problems in novel ways’, and it ‘limits the development of extreme views in the group’. However, for diversity to achieve these goals, Surowiecki stressed that leaders must structure discussions in such a way that they encourage helpful debate and ensure that everyone has a chance to speak. He particularly warned against behaviours that stifle dissenting views, such as excessive deference to status and a hierarchical order in which people speak. He also cautioned against the inappropriate influence of the more talkative members of the group, and of powerful advocates of specific agendas. Surowiecki further suggested that minority views be included in reports to discourage the emergence of an illusion of consensus, and that ‘decisions about local problems should be made, as much as possible, by people close to the problem’.
We have now concluded our review of the systemic foundations of human error. In the next phase, we will begin an exploration of the foundations of the management of patient safety, and we will start with an introduction to patient safety incidents.