Medicine is a late entrant into the world of human factors. Long before the scale of human error in healthcare became evident, its impact in other industries had been glaring. And whilst human error manifests differently in medicine, the human factors driving all accidents, and the measures required to remediate them, are similar. For this reason, the starting point of most patient safety discourses is non-medical. We will therefore review the landmark disasters that have influenced the medical approach to human factors, and assess the key lessons these catastrophes teach us. Aviation will feature prominently because, more than any other industry, it has provided the human error template which medicine has tried to adapt; this is understandable given the emotional resonance of every air traffic disaster. Apart from aviation, the other industries that bear the scars of instructive disasters are nuclear power, manufacturing, and maritime. Let us then see what lessons we can learn from nine high-profile disasters that blighted these four industries.
Aviation disasters
The Tenerife airport disaster – Recognised as “the deadliest plane crash ever”, this accident occurred on the 27th of March 1977 when two Boeing 747s collided at Los Rodeos airport on the island of Tenerife, killing 583 people. A perfect example of communication error between pilots and air traffic control, it was precipitated by the captain of the KLM aircraft, who attempted to take off under the mistaken impression that he had been cleared to do so. Unfortunately, the Pan Am jumbo jet was still taxiing on the runway at the time. Whilst the catastrophe was eventually blamed on the KLM captain’s decision to take off, investigations into the accident discovered that “nearly a dozen mistakes and coincidences had to line up with dismaying precision in order for the disaster to happen“. The coincidences included severe weather conditions characterised by dense fog, and air traffic congestion caused by the closure of a nearby airport. The errors, on the other hand, included a deadly power gradient that inhibited the cockpit crew from challenging their captain’s wrong decisions. This crash has gone into the annals of human factors as the “watershed moment” in aviation safety because it became the trigger for implementing safer cockpit procedures and for standardising aviation communication systems. It also led to the institution of crew resource management, a pilot safety training scheme from which medicine has borrowed heavily.
Korean Air Cargo flight 8509 crash – Shortly after taking off from London for Milan on the 22nd of December 1999, Korean Air Cargo flight 8509 crashed when it suffered instrument failure. The instrument had malfunctioned during the preceding flight, but the engineers who attempted to repair it did not have the appropriate manual; they therefore failed to replace the defective unit. Contributing to the accident was a communication breakdown in the cockpit: the captain did not communicate his thoughts to the first officer, and the first officer did not challenge the captain when he failed to take the remedial measures that may have averted the accident. This communication fiasco was subsequently attributed to a power gradient between the two officers, possibly arising from a culturally driven deference to age and authority ingrained in the hierarchical structure of Korean society. The result was that neither the first officer nor the more experienced flight engineer could exert the assertiveness the situation required. An investigation into the disaster recommended that Korean Air should revise its crew training programme and transform its organisational culture “to promote a more free atmosphere between the captain and the first officer“.
Japan Airlines flight 123 disaster – With a death toll of 520, the Japan Airlines flight 123 accident has been depicted as “the deadliest single-aircraft accident” in aviation history. Travelling from Tokyo to Osaka in Japan on the 12th of August 1985, the Boeing 747 suffered sudden cabin decompression 12 minutes after take-off and subsequently crashed into a mountain. The cause of the accident was traced to the inadequate repair by technicians, seven years earlier, of tailstrike damage sustained when the aircraft’s tail struck the runway on landing. This is a lesson in the power of remote human error to cause devastating havoc years later.
Nuclear power disasters
The Chernobyl disaster – This was the explosion of reactor number 4 at the Chernobyl nuclear power station, in then Soviet Ukraine, on the 26th of April 1986. Considered “the worst nuclear disaster in history“, it was the culmination of several human errors and systemic transgressions during a poorly designed safety test. Reactor design flaws, inadequate instruction guidelines, poor operational and management safety culture, and inadequately trained personnel all contributed to the calamity. The result was an uncontrolled chain reaction in the reactor core which caused a catastrophic steam explosion and a partial meltdown. The radioactive material that was discharged required the evacuation of residents from the contaminated area, around which there remains an exclusion zone of more than 4,000 square kilometres.
The Three Mile Island disaster – This was a nuclear accident that developed on the 28th of March 1979 at the Three Mile Island nuclear power station in Pennsylvania in the United States. Acknowledged as “the most significant accident in U.S. commercial nuclear power plant history“, it was primarily the result of mechanical failure, a stuck-open relief valve, which led to a partial meltdown of the plant’s reactor number 2. Several other factors, however, contributed to the disaster, including poorly designed and ambiguous human-machine interfaces, inadequately trained personnel, and poor communication between regulatory agencies. Although the health consequences were negligible, this accident led to the introduction of better regulatory practices, and of human factors engineering, to the nuclear power industry.
Manufacturing disasters
The Bhopal gas disaster – This tragedy occurred when about 30 tonnes of a highly toxic gas, methyl isocyanate, escaped from the Union Carbide pesticide plant in Bhopal, India, on the night of the 2nd of December 1984. With more than 500,000 people injured and a toll of more than 15,000 deaths, it is considered the world’s worst industrial disaster. Investigations revealed that “at least 10 violations of the standard procedures of both the parent corporation and its Indian-run subsidiary led to the calamity”. Amongst the contributory factors were inadequate safety systems, poorly maintained equipment, misinterpretation of tank pressure readings, and poor communication.
The Minamata mercury poisoning disaster – This tragedy unfolded when an entire Japanese town fell victim to mercury poisoning. It initially manifested as a mysterious illness, and it took time for investigators to establish that the disease was caused by the consumption of seafood contaminated by methylmercury which a chemical factory was discharging into Minamata Bay. The calamity evolved between 1932 and 1968, and manifested as Minamata disease – an epidemic disorder which presented with neurological impairments such as neuropathy and ataxia. The long evolution of the tragedy was partly the result of the refusal of the factory’s owner, the Chisso Corporation, to acknowledge the problem and institute remedial measures. Rather than face up to its responsibility, the company placed profit over human life and deliberately obstructed the research that was set up to establish the cause of the disease.
Maritime disasters
The Exxon Valdez oil spill – This was a massive oil spill that occurred when the oil tanker Exxon Valdez struck Bligh Reef off the coast of Alaska on the 24th of March 1989. Considered one of the worst oil spills in terms of damage to the environment, the accident resulted in the discharge of about 11 million gallons of crude oil, affecting more than 2,000 kilometres of coastline. Second only to the Deepwater Horizon oil spill in volume spilt in American waters, the accident “killed hundreds of thousands of seabirds, otters, seals and whales“. Amongst the contributory factors to this environmental disaster were a fatigued and overworked crew, inadequately maintained anti-collision radar equipment, and inefficient supervision by the parent company. An attempt was made to make the captain a scapegoat with claims that he was inebriated and away from his post at the time; these accusations, however, were never substantiated. In the fallout of this disaster, an oil pollution act was signed into law in the United States; this mandated double-hulled designs for oil tankers and increased the penalties for oil spills.
The Piper Alpha accident – This disaster unfolded on the 6th of July 1988 on the Piper Alpha oil production platform in the North Sea, off the coast of Aberdeen in Scotland. The accident started when a condensate pump, whose pressure safety valve had been removed for maintenance, was restarted; the gas that leaked out caught fire, and the subsequent explosions weakened and collapsed the structure of the oil platform. Described as “the world’s biggest offshore oil disaster“, the fire took three weeks to put out, and the disaster caused more than 160 deaths. The inquiry that investigated the accident identified several contributory factors, including inadequate maintenance procedures, suboptimal risk assessments, ineffective communication between crew shifts, and poor implementation of the design changes made to accommodate the processing of gas on the platform. The accident led to tighter safety regulation through the enactment of the Offshore Safety Act.
It is clear that a common thread of human and system errors links all these high-profile disasters – from communication mistakes to leadership failings, from poor training to overwork. The tendencies to ignore warning signs and to scapegoat innocent workers are also evident in these stories. Most striking are the interplay of multiple contributory factors in most of the accidents, and the influence of both remote and proximal causes. To learn more, you may want to check out the following links:
We will soon take a deep dive to scrutinise how and why we frequently get it wrong in healthcare, but we will first review the distant and recent history of Medicine to see how its patient safety record stands up to scrutiny.