Agreeing to fail

Does 'groupthink' cause management disasters, or are other factors at play?

In his book espousing the value of getting people to put their collective heads together, 'The Wisdom of Crowds', writer James Surowiecki argued that small teams had to be on their guard: "Homogeneous groups, particularly small ones, are often victims of what the psychologist Irving Janis called 'groupthink' ... Janis argued that when decision makers are too much alike - in worldview and mind-set - they easily fall prey to groupthink."

Janis's work in the early 1970s was based on a series of poor foreign-policy decisions made by the US government, including the events leading up to the Japanese attack on Pearl Harbor and the disastrous attempt to invade Cuba at the Bay of Pigs.

Janis drew heavily on the Bay of Pigs fiasco in his 1972 book 'Victims of Groupthink'. On 17 April 1961, around 1,400 Cuban exiles helped by the US military landed on the coast of Cuba at the Bay of Pigs. But within three days, the invaders were dead or captured by Cuban troops.

President John F Kennedy approved the invasion after taking advice from a group of experts. The team made a series of assumptions that proved to be false. Among them was the belief that the landings would trigger an armed uprising among the Cuban population. The Bay of Pigs landing site also offered little opportunity to retreat to a more defensible position in the mountains.

Janis argued that the group of experts did not consider alternative viewpoints on the invasion and fought too hard for consensus.

"One of the real dangers that small groups face is emphasising consensus over dissent," Surowiecki argued. "Homogeneous groups become cohesive more easily than diverse groups, and as they become more cohesive they also become more dependent on the group, more insulated from outside opinion, and therefore more convinced that the group's judgment on important issues must be right."

But is groupthink the true cause of these management-inspired catastrophes? Later academics have argued that Janis's groupthink explanation only goes so far in explaining how management teams can get it so badly wrong.

Prospect theory

Researchers such as Glen Whyte of the University of Toronto have considered how managers frame decisions. Using an idea called prospect theory, they have argued that many disastrous decisions come about because of the way managers perceive risk and reward. Prospect theory frames decisions in two domains: gains and losses. In the domain of losses, where the available choices are all likely to lead to some degree of loss, individuals tend to pick the risky option, and groups amplify that tendency.
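
To make the effect concrete, here is a minimal numerical sketch in Python. It assumes the value function from Kahneman and Tversky's formulation of prospect theory, with commonly cited parameter estimates; the monetary amounts are invented for illustration and none of the figures come from Whyte's paper.

# A minimal sketch of prospect theory's value function.
# The parameters (alpha, beta, lam) are commonly quoted estimates and the
# amounts below are invented for illustration, not taken from Whyte's paper.

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of an outcome x; negative x is a loss."""
    if x >= 0:
        return x ** alpha               # diminishing sensitivity to gains
    return -lam * ((-x) ** beta)        # convex and steeper in losses

# Option A: accept a certain loss of 750.
certain_loss = prospect_value(-750)

# Option B: a 75% chance of losing 1,000 and a 25% chance of losing nothing,
# i.e. the same expected monetary loss of 750 as option A.
gamble = 0.75 * prospect_value(-1000) + 0.25 * prospect_value(0)

print(round(certain_loss, 1))   # about -762.5
print(round(gamble, 1))         # about -736.6: the gamble feels less painful

Because the value function is convex in the domain of losses, the risky option scores as less painful than a sure loss of the same expected size; that individual pull towards risk is what Whyte argues group pressure then amplifies. (The sketch uses only the value function; full prospect theory also reweights probabilities, but the loss-domain effect shows up even without that refinement.)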

Whyte wrote in his 1989 paper 'Groupthink reconsidered' in The Academy of Management Review: "In group decision making, pressures for uniformity will militate toward a choice that is consistent with the initial risky preferences of a majority of members."

On its own, this effect is manageable, although groups tend to endorse risky decisions that an individual manager acting alone might avoid. The problem gets worse when you look at how managers frame the situations they face.

According to Whyte: "Decisions that lead to fiascos are most naturally framed, whether appropriately or not, as a choice between two or more unattractive options. One option typically involves the immediate recognition of the permanence of an aversive state of affairs. The other option or options entail potentially an even worse situation combined with the possibility that the state of affairs may be avoided."

You can look at the choices as akin to a gambler's last roll of the dice. The gambler is already sitting on heavy losses but, rather than accept them and call it a night, they make one last-ditch attempt to win it all back or lose everything.

Whyte argued that it was possible to take Janis's examples and use prospect theory to help explain them: "Evidence ... suggests that for each of the fiascos discussed by Janis, the frame adopted by decision-makers led them to perceive their decision as between a certain loss and potentially greater losses."

Whyte added: "The decision to launch the [space shuttle] Challenger can also be described as a choice in the domain of losses. To delay the launch an additional time entailed certain unfortunate consequences for the shuttle programme. Those consequences could possibly have been avoided by the decision to launch, although such a choice entailed additional risks and additional potential losses."

Referring to an earlier paper by Daniel Kahneman and Amos Tversky, who together introduced prospect theory in the 1970s, Whyte claimed: "It becomes increasingly clear that 'the adoption of a decision frame is an ethically significant act'."

More recently, Chun Wei Choo, also of the University of Toronto, has argued that analysis of management disasters has to go beyond the perceptions of the people involved in the decisions. "While human error often precipitates an accident or crisis in an organisation, focusing on human error alone misses the systemic context in which accidents occur and can happen again in the future," Choo argued in the MIT Sloan Management Review in 2005.

"Can organisational disasters be foreseen?" Choo asked. "The surprising answer is yes. [But] signals are not seen as warnings because they are seen as consistent with organisational beliefs and aspirations."

The tendency to ignore warnings is particularly pertinent when it comes to commercial disasters, where questionable ways of doing business ultimately bring a company down. Taking Enron as an example, Choo wrote: "When warning signals about these questionable methods began to appear, board members were not worried because they saw these practices as part of the way of doing business at Enron."

The situation is not helped by the way in which managers tend to analyse information. Work by Kahneman in the first half of this decade pointed to problems of information bias among executives. Managers tend to prefer information that confirms their actions and abilities, to feel overconfident about their own judgment, and to be unrealistically optimistic. However, these may be the same traits that suit them to life as executives, since more pessimistic types may not take the risks that could make them successful.

Managers can see setbacks as temporary and get in too deep, Choo argued: "Although past decisions are sunk costs that are irrecoverable, they still weigh heavily on the consciousness of executives, often because of a reluctance to admit errors to themselves or to others. If facts challenge the viability of a project, executives often find reasons to discredit the information. If the information is ambiguous, they may select favourable facts that support the project."

The Icarus paradox

Success can make the fall harder. In 1990, Danny Miller coined the term Icarus Paradox to describe the way that businesses often fail after flying high. Allen Amason of the University of Georgia and Ann Mooney of the Stevens Institute of Technology updated the work in the journal Strategic Organization last year. They quizzed managers at a variety of US companies to work out whether the Icarus Paradox had a behavioural basis.

Amason and Mooney wrote: "As performance grows stronger and more consistent, the cycle becomes self-reinforcing. Managers attribute strong performance to good strategy and good decisions ... There is no great surprise when good performance is realised, only consternation when it is lost.

"It is against this backdrop that new strategic issues are identified and framed ... Managers in firms with strong past performance are likely to frame arising strategic issues in terms of their potential for loss rather than the potential for gain."

Again, the way in which managers frame their decisions proves instrumental. The problem is that they can become hemmed in by the fear of loss: "Perceptions of threat produce constricted and less comprehensive decision processes, [yielding] dysfunctions," claimed Amason and Mooney.

Could organisations avoid these problems? Choo drew on the experiences of high-reliability organisations, where safety is so important that it informs the entire culture. Air-traffic control is one example of this kind of activity.

Choo observed that these organisations have five information priorities: "They are preoccupied with the possibility of failure and so encourage error reporting, analyse experiences of near-misses and resist complacency. They seek a complete and nuanced picture of any difficult situation. They are attentive to operations at the front line, so that they notice anomalies early while they are still tractable and can be isolated. They develop capabilities to detect, contain and bounce back from errors, creating a commitment to resilience. They push decision-making authority to people with the most expertise, regardless of their rank."

Choo put it another way by borrowing the phrase that Andrew Grove, the former head of chipmaker Intel, used as the title of his book: 'Only the Paranoid Survive'.
