[Image: Space shuttle launch]

Challenging the establishment

The importance of listening to warnings and observing safety specifications to the letter - many people's lives may depend on it. Would you be the one to cut corners?

What's the connection between a space shuttle explosion, a plane crash in Afghanistan and outbreaks of food poisoning in Scotland and Wales? A common thread links these fatal disasters and many more besides. It's the 'normalisation of deviance'.

On the wintry morning of 28 January 1986, Nasa's space shuttle Challenger was launched from a pad gnarled with icicles, and began its ascent into an exceptionally cold Florida sky. The right-side solid rocket booster immediately sprang a leak, stemming from one of the 'O'-shaped rubber rings that sealed the joins between each booster's cylindrical segments. A wayward jet of flame began to scorch the side of the huge liquid fuel tank, to which the boosters and Challenger itself were attached. Just 73 seconds after lift-off, the vehicle exploded, killing all seven astronauts.

Richard Feynman, a veteran of the 1940s Manhattan atomic bomb project and one of the most celebrated physicists of the 20th century, was invited to join the official inquiry into Challenger's loss. Independent-minded and sceptical, he cleared his own path through the verbiage. With the TV cameras running, he dipped a piece of rubber into a glass of iced water and showed how it hardened when cold. 'I believe that has some significance for our problem,' he said.

Feynman demonstrated that gaps between the O-rings and the booster segments were bound to develop in icy weather. It seemed to him that plenty of people within Nasa and its contractor companies had feared that this could happen, yet the space agency had done nothing to prevent it. Why?

There was more to the shuttle explosion than hardware failures alone. Diane Vaughan, professor of sociology and public affairs at Columbia University, New York, spent nine years collating data for her landmark study of the shuttle accident. 'The Challenger Launch Decision: Risky Technology, Culture, and Deviance at Nasa' was published to widespread acclaim on the tenth anniversary of the explosion. The common perception is that Nasa managers must have been reckless to launch a shuttle on such an unexpectedly cold day. Somewhat to her surprise, Vaughan found that this was not so. 'After looking more closely at the data, it turned out that the managers had not violated rules at all, but had actually conformed to all Nasa's requirements. In their view, they were obeying the correct engineering and organisational principles.'

Officials responsible for the launch had acted in good faith. And yet, the system had failed. Worse still, the formal investigation found that O-rings had suffered partial burn-throughs on at least five previous shuttle missions. As Vaughan explains, Nasa 'repeatedly observed the problem with no consequence and reached the point where flying with a known flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified when they saw what they had done.'

Some 16 years later, similar institutional misjudgements led to the destruction of a second shuttle. Returning from a microgravity science expedition on 1 February 2003, Columbia disintegrated during re-entry at the end of an otherwise flawless 16-day mission. The entire crew of seven were lost. A suitcase-sized chunk of insulation from the external fuel tank had peeled off during launch and slammed into the front of Columbia's left wing, making a small but ultimately lethal hole in the heat-resistant panels on the leading edge. The re-entry of a shuttle is usually one of its safest and best-understood procedures. Columbia's loss in mid-air was not the cause of the disaster so much as the final symptom.

Once again, Nasa had missed warning signs that seemed obvious in retrospect. Back in December 1988, the crew of shuttle Atlantis were flying a secret mission for the US Department of Defense. Commander Robert Gibson and his crew were infuriated, after touchdown, when they found that hundreds of thermal insulation tiles on their craft had been damaged by insulation impacts during lift-off and one tile on the vulnerable underside was missing completely.

Vaughan was invited to join the Columbia Accident Investigation Board (CAIB), where she found that 'the origins of the two shuttle accidents were identical. In one case, they flew with O-ring problems that they considered were not a risk to flight safety, and the more they flew, the more they demonstrated that the problem had no consequences. For Columbia, they flew with foam debris that hit the heat shield of the orbiter wing. The more they observed such debris hits, the more they considered that they had no safety consequences.'

According to Vaughan, wherever you have 'a long incubation period filled with early warning signs that are misinterpreted or ignored,' there is the potential for organisations to be taken by surprise, while the outside world will always be shocked, in retrospect, by what seem like obvious failings. 'Normalisation of deviance means that people within an organisation become so accustomed to a deviant behaviour that they don't consider it as deviant, despite the fact that they exceed their own safety rules by a wide margin. People outside see the situation as deviant whereas the people inside get accustomed to it.'

Not listening to warnings

The loss of an RAF Nimrod MR2 aircraft in Afghanistan in September 2006, along with its 14 occupants, was attributable to the same pattern. Fuel leaking onto a hot pipe caused a mid-air explosion. Concerns about the pipe's proximity to the fuel tanks had been raised repeatedly for many years before the accident, but senior officials in Whitehall and the RAF, and in the contractor companies responsible for building and servicing the aircraft, had done little to address the problems.

Specialist aviation lawyer Charles Haddon-Cave conducted an inquiry, the results of which were published in October last year. He discovered 'a widespread assumption by those involved that the Nimrod was safe because it had successfully flown for 30 years.' Officials responsible 'suffered from a belief that, because a risk had been successfully avoided in the past, it was unlikely to transpire in the future.' Just like Nasa, they relied on 'past success as a substitute for sound engineering practices, such as testing to understand why systems are not performing in accordance with requirements and specifications.'

Similar problems occur in all walks of life. For instance, during the winter of 1996, 21 people in Wishaw, Scotland, died after eating meat contaminated with E. coli bacteria, all sourced from the same butcher. Professor Hugh Pennington, at that time Chair of Bacteriology at the University of Aberdeen, led an inquiry into the circumstances. In 2005, an E. coli outbreak in south Wales affected 150 people, mostly children, with one case proving fatal. Once again, Pennington was called to investigate. The fact that health inspection warnings had cropped up repeatedly, yet weren't acted upon, reminded him eerily of the shuttle accidents and Vaughan's assessment of the causes. 'The normalisation of deviance is not peculiar to aviation or space travel,' he found.

Passing the buck

The safety concerns of junior staffers within an organisation are often brushed aside by senior figures more focused on what they mistakenly see as grander priorities, such as strategy, politics, funding and, of course, the security of their own positions at the top of the hierarchy. In the wake of Challenger, Nasa replaced or retrained particular individuals, but Richard Feynman certainly wasn't convinced that much had changed within the organisation itself. He wrote subsequently, 'I had the definite impression that senior managers were allowing errors that the shuttle wasn't designed to cope with, while junior engineers were screaming for help and being ignored.' Even when these human factors are identified, official investigations into disasters usually provoke unsatisfactory adjustments to surface procedures rather than solutions to underlying causes.

Vaughan cautions against 'the tendency of corporate or public agency administrators to blame individuals' while leaving organisational failures in place, so that 'the next person to take the job will just experience the same pressures.' Haddon-Cave quotes the CAIB's warning against organisations that 'aim to fix technical problems and replace or retrain the individuals responsible. Such corrections lead to a misguided and potentially disastrous belief that underlying problems have been solved.'

The lesson from these miserable incidents is summed up by avionics engineer Dennis Gilliam, formerly of the TRW company and now working in the private space sector. Having spent a decade testing missile systems on rocket sleds at the White Sands rocket range in New Mexico, he insists that the solution to normalisation of deviance is clear. 'If a system is doing something that it says in the handbook that it shouldn't be doing, then consider yourself warned and fix it. When people build a system, they say at the outset what the operating range should be. And they do so for a reason, not on a whim.'

The biggest problem, Gilliam suggests, is that 'many managers come onto the scene years, or even decades, after a system has been designed, and long after the organisation has forgotten exactly why certain rules were set in place to start with.'

As time goes by, new managers see a record of zero accidents and assume that they can formulate their own rules about safety. Then reality catches up, and the deviations from the original specifications lead toward catastrophe. The meat does get poisoned, the rockets really do explode and people really do die.
