17.1

INTRODUCTION

Analytic pitfalls (see chapter 14) tend to penetrate analysis and decision-making relatively stealthily. Suffering from an ethnocentric or status quo bias is not a conscious choice. Wishful thinking, following the herd, over-reliance on previous judgment and falling back on conventional wisdom may reflect flaws in critical thinking, exhibited by either analysts or decision-makers, that unintentionally distort an analysis.

This is where social biases differ from analytic pitfalls. Social biases relate to intentional behaviour exhibited by either analysts or decision-makers. When this occurs, the person adapts his behaviour to a perceived socially desirable norm, even when doing so distorts the conclusions of a piece of analysis. And, as you might imagine, that can wreak havoc with the quality of an organization’s decision-making.1

Below we’ll take a look at three common social bias phenomena, each of which was briefly mentioned earlier in this book.

The first bias is that of selectively cherry picking data from a data set for political purposes. Cherry picking may happen at the desk of the analyst, elsewhere in the strategic analysis function or in the corner office of the CEO.

The second social bias is ‘yesmanship’. Yesmanship in its most common form is saying what the audience wants to hear, regardless of what you as an analyst have discovered through your research and know to be true.

The third bias is groupthink.2 This is in fact an umbrella term for several factors that may or may not occur together in group dynamics. Groupthink is a negative phenomenon. It may lead to flawed analysis of a data set, which can result in less than optimal decision-making. We noted in an earlier chapter that groupthink can pop up in M&A deliberations. The M&A sector is a major employer of strategic analysis resources, so analysts working in that space should certainly be on the lookout for creeping groupthink.

Cherry picking, yesmanship and groupthink are just some of the factors that can lead to sub-optimal outcomes, even when well-rounded data sets are available to support decision-making.

The research on groupthink shows that there is no inevitable causal link between this phenomenon, wrong executive choices and poor outcomes (Janis, 1982c). An executive who finds herself surrounded by groupthinkers, and the murky waters they engender, may still succeed in making the right decisions (Janis, 1982d). However, once an organizational dynamic becomes infected with groupthink, executives generally underestimate the risks entailed in executing their chosen strategies.

17.2

CHERRY-PICKING FROM A DATASET

Cherry picking is an extreme form of premature closing (see chapter 12). In cherry picking, the analysis starts, or all but starts, with a pre-cooked analytical conclusion. In 2002 the US government gradually moved towards a decision that a war with Iraq was necessary.

In the aftermath of 9/11, any evidence that pointed to a new threat to the US homeland was understandably treated extremely seriously. Without this context, the so-called Iraqi aluminium tube intelligence failure of 2001/2002 cannot be understood. Australian intelligence discovered that the Saddam Hussein regime was procuring a large batch of a particular type of high-grade aluminium tubes, to be shipped into Iraq (Albright, 2003)3; (Clark, 2007d). Under the Iraq military technology ban program that existed at that time, the country was not allowed to procure these tubes, even though they had civilian uses, as they also had known military applications, including use in short-range ground-to-ground rockets. From a US perspective, the tubes shouldn’t have been destined for Iraq; that much was clear.

The CIA, however, believed the tubes could serve yet another use. They could be used as rotors in ultracentrifuges, allowing the Hussein regime to separate natural uranium into a weapons-grade part (a fraction enriched in the isotope ²³⁵U) and waste (a fraction depleted in ²³⁵U). These tubes could be a missing link in a possible Iraqi program to acquire its own nuclear weapons. Saddam having WMDs was a no-go for the George W. Bush administration. The tubes were confiscated at the Jordanian-Iraqi border. All evidence (size, material specifications, coating) pointed to the tubes being suited for conventional rocket applications. Using them for ultracentrifuges required considerable, and rather improbable, extrapolation of the facts. In layman’s terms: it required a lot of imagination to conclude that Iraq was planning to use them for nukes.

The CIA, however, had already accused Iraq of trying to develop nukes and stuck to its prematurely closed position. A balanced, verifiable view was apparently not what the administration needed. National Security Advisor Condoleezza Rice, in a CNN interview on 8 September 2002, put it as follows, even while admitting that there was uncertainty over how long it would take for the Iraqi government to have a working weapon:

“We don’t want the smoking gun to be a mushroom cloud.”

Once the decision-making moved beyond the point of no return for starting the Iraq War, any data that questioned the substantiation for that decision were vigorously smeared, as this quote from a senior Pentagon official shows (Grey, 2016a):

“It wasn’t intelligence, it was propaganda […] They’d take a little bit of intelligence, cherry-pick it, make it sound much more exciting, usually by taking it out of context, often by juxtaposition of two pieces of information that don’t belong together.”

Reliable data showed that the tubes were in fact useless for uranium enrichment, but the disinformation became so overwhelming that those findings were ignored (Pollack, 2004). The more intelligence data confirm chosen policies, the more acceptable decision-makers seem to find them, and the more those data are used in policy design and execution (Treverton, 2008). Inversion is the ultimate risk: the data are framed as leading evidence for a decision, when in fact the process started with the decision and scrambled backward for the facts. This example helps illustrate how dramatically a highly politicized dynamic can skew the decision-making process, and how serious the consequences can be. And it can happen in business as easily as it did in military intelligence, with facts and figures cherry-picked to sell completely erroneous conclusions.

CHERRY PICKING OCCURS IN BOTH GOVERNMENT AND BUSINESS

Think of an executive who may similarly have rallied relevant stakeholders in a large company to support a major capital expenditure. The similarities with the Iraq War story are clear. Once the mental commitment to the investment is there, any evidence that changes in the market could negatively affect the business case is suspect. Some business leaders subsequently tend to discredit this type of new data. The leader has passed the mental point of no return – anything that could halt the investment approval is to be firmly rejected.

This means that for the new data to be accepted, they now have to substantiate the business case the executive is championing. In any other case, the correctness of the data, the source or the sender – or all of them at the same time – must be questionable. When all other arguments fail, the data may be dismissed as a plant: an intentional deception by an adversary. Fortunately – or so the company line goes – the deception was unmasked just in time, allowing for full implementation of the planned strategy. In competitive corporate cultures such deception could even originate from a colleague who’s after the same limited corporate investment budget for his own pet project.

This bias, however, does not only apply to the business leader. The filtering of data leading to questionable but undebatable conclusions may also occur in the analyst’s mind. In strategic analysis, correctly informing decision-makers about what will happen next adds the most value. This indeed turns an analyst into a special breed of forecaster, making the quote below applicable both to the sweep of strategic analysis in general and to the phenomenon of cherry picking we’re examining in this section (Silver, 2013c):

“A forecaster should almost never ignore data […] Ignoring data is often a tip-off that the forecaster is overconfident […] that she is interested in showing off rather than trying to be accurate.”

Managers, as I discussed in chapter 16, have been selected for their confidence, and in most cases rightfully so. The analyst, on the other hand, was chosen for her objectivity (among other attributes, of course). It goes without saying that objectivity does not go hand-in-hand with a proclivity for cherry picking.
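To make the mechanism concrete, the sketch below shows – with entirely made-up numbers – how starting from a desired conclusion and keeping only the supporting observations flips the verdict a data set delivers. It is a minimal illustration of the bias, not a model of any real analysis.

```python
import random

random.seed(42)

# Hypothetical data set: 40 quarterly growth observations for a market,
# centred near zero -- the full data carry no clear growth story.
observations = [random.gauss(mu=0.0, sigma=2.0) for _ in range(40)]

def mean(xs):
    return sum(xs) / len(xs)

# Objective analysis: use every observation.
full_view = mean(observations)

# Cherry picking: start from the desired conclusion ("the market grows")
# and keep only the observations that support it.
cherry_picked = [x for x in observations if x > 0]
picked_view = mean(cherry_picked)

print(f"All {len(observations)} observations: mean growth {full_view:+.2f}%")
print(f"Cherry-picked {len(cherry_picked)} observations: mean growth {picked_view:+.2f}%")
# The second line will always look better, because the conclusion was
# fixed before the data were 'analyzed'.
```

Silver’s warning above is visible in the last line: the cherry-picked mean always flatters the pre-cooked conclusion, because the data that contradict it never made it into the calculation.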

17.3

YESMANSHIP

The social bias of yesmanship can be triggered by the desires, real or merely imagined by the analyst, of a hedgehog decision-maker (Lovallo, 2010a). In chapter 16 we got a glimpse of the overconfident business leader. It is the type of leader who rarely requires strategic analysis on his competitors. He already knows them all. Moreover, he certainly knows them better than his strategic analysis function could ever know them. (This is to some extent actually true out in the real world… but to some extent it is true only in his own mind.)

Yesmanship starts with a fundamental given: an asymmetric power relationship between a decision-maker and his analysis staff. In the world of public policy, a particular paradigm, first postulated by Lowenthal, applies (Yarhi-Milo, 2014b):

“Policymakers can exist and function without the intelligence community, but the opposite is not true.”

What is true in public policy equally applies in business. An analyst may not be blamed too much for being tempted to reason that whoever provides their bread and cheese determines the tune to which they dance – especially in a difficult labour market. Given the asymmetry of the relationship, it is the self-confidence of the decision-maker – and, by implication, how tolerant he is of opinions other than his own – that determines the freedom of speech enjoyed by the analyst.

Some leaders may simply not appreciate or allow that freedom. They may prefer their staff to express opinions resembling their own. That is where the negative impact of yesmanship raises its ugly head (Surowiecki, 2005b):

“One of the things that get in the way of the exchange of real information (…) is a deep-rooted hostility on the part of bosses to opposition from subordinates.”

Yesmanship is typically a phenomenon that intensifies when management is under stress. During the second half of the Johnson administration, the US – despite continuously increasing its military commitment – got no nearer to achieving its political and military objectives in the Vietnam War. American political leadership came under increasing pressure to show results. Expenditures on the war went through the roof, body bags came back in ever-larger numbers, public approval for the war took a nosedive and there was no sign of success on the horizon. In this atmosphere, as illustrated in the long quote below, yesmanship became the rule in the White House (Halberstam, 1992b):

“[US President Lyndon B. Johnson] was in an office isolated from reality, with concentric circles of advisers who often isolated rather than informed, who tended to soften bad judgments and harsh analyses, knowing that the President was already bearing too many burdens, that he could not accept more weight, that it would upset him, and also knowing that if they became too identified with negative views, ideas and information, their own access would diminish (a classic example of the two problems would be Bob McNamara [US Secretary of Defense] telling Arthur Goldberg midway through the escalation, when Goldberg raised a negative point to him, that it was certainly a good point but would he please not raise it with the President, it would only upset him).”

As discussed earlier in this book, McNamara’s behaviour as US Defense Secretary, and earlier, as president of Ford Motor Company, reinforces my view that the yesmanship bias is equally present in business and wartime government. The following passage makes direct reference to its business-government ubiquity (Halberstam, 1992c):

“[McNamara] believed in what he did, and thus the morality of it was assured, and everything else fell into place. It was alright to lie and dissemble for the right causes. It was part of the service, loyalty to the President, not to the nation, not to colleagues, it was a very special bureaucratic-corporate definition of integrity; you could do almost anything you wanted as long as it served your superior.”

The behaviour of the president and his advisers is all too human. A stressed executive who is about to lose control of events is as difficult a customer as you can find. I believe these McNamara anecdotes describe a dynamic that can be evidenced in any culture, although some are probably more vulnerable to yesmanship than others. Leaders with a larger-than-life profile who inspire such acquiescence may simply be more common in some cultures than in others.

Yesmanship clearly runs counter to the professional values of an analyst: it is impossible to reconcile it with integrity and objectivity. Sir Winston Churchill links yesmanship to inevitable failure in a characteristic statement that the conscientious analyst may consider tacking to her office door (Jones, 1978a):

“The temptation to tell a Chief in a great position the things he most likes to hear is the commonest explanation of mistaken policy. Thus the outlook of the leader on whose decisions fateful events depend is usually far more sanguine than the brutal facts admit.”

Yesmanship is part of a broader set of social biases in strategic analysis and intelligence that we touched on earlier – ‘politicization’ (Treverton, 2008). Politicization may have different causes, including those mentioned below. Cherry picking (as discussed above) and groupthink (discussed below) fall under the politicization umbrella:

Direct pressure:

Decision-makers push for conclusions they like to see, regardless of whether the data set and the analysis support such conclusions. Giving in as an analyst or business function to such pressure is the most open form of yesmanship.

The house line:

The strategic analysis function (head) has a particular view on a topic and refuses to distribute other conclusions, even when the data set suggests that doing so would be justified. The origin of the ‘house line’ may be a pitfall resembling conventional wisdom.

Question asking:

The smoothest form of politicization starts with a tailored brief, delivered by a business leader to the strategic analysis function, that ensures fitting and ‘useful’ conclusions will follow from the requested work.

Military literature gives other examples of yesmanship.1 Below are some additional military examples in which the analyst may see parallels with the business environment.

THE ANALYST’S CHALLENGE IS TO POLITELY WITHSTAND DIRECT PRESSURE FOR YESMANSHIP

German military intelligence during the Second World War had little choice but to fall back on yesmanship, if only to save its own analysts’ skins (Macintyre, 2010e). Over the course of a war that was gradually being lost, German military intelligence developed a vulnerable position under the paranoid Nazi political leadership. Bad news was not welcome. Knowing this, British intelligence started to exploit Berlin’s hunger for good news as a basis for deception: generating disinformation that German intelligence wanted to believe. The confirmatory bias that this triggered helped guarantee that the bogus intelligence was passed up the chain of command, which proved most damaging for the Axis powers.

And the Germans weren’t alone in demanding slavish yesmanship. American hero General Douglas MacArthur by the end of his career no longer tolerated divergent opinions.

At the beginning of the Korean War in 1950, his relationship with his intelligence organization was described as follows, and I believe this quote to be a great illustration of yesmanship (Halberstam, 2009c):

“[…] MacArthur’s key intelligence chief [was] a man dedicated to the proposition that there were no Chinese in Korea, and that they were not going to come in, at least not in numbers large enough to matter. That was what his commander believed, and MacArthur’s was the kind of headquarters where the G-2’s (military intelligence’s) job was first and foremost to prove that the commander was always right. […]
There was an arrogance to Willoughby [MacArthur’s intelligence chief] that was completely different from the uncertainty – the cautiousness – you associate with good intelligence men. It was as if he was always right, had always been right. […] Worse, you couldn’t challenge him. Because he always made it clear that he spoke for MacArthur and if you challenged him you were challenging MacArthur. And that obviously wasn’t allowed.”

The consequences of US military intelligence leadership’s consistent rejection of reports on the buildup of Chinese forces in Korea were dire. Having underestimated its adversary, America faced difficulties during the Korean War that tainted its superpower reputation.

However, direct pressure to tell executives what they want to hear is not limited to military intelligence. Richard ‘Dick’ Fuld, the former (and last) CEO of Lehman Brothers – the investment bank whose failure catalyzed the 2008 global economic crisis – also had a reputation for demanding yesmanship (McDonald, 2009):

“There were mind-blowing tales of Dick Fuld’s temper, secondhand accounts of his rages and threats. It was like hearing the life story of some caged lion.”

It would be impossible for a strategic analysis function to prevent yesmanship in such a corporation. There is an inevitable temptation to serve characters like Fuld what they are thought to want to hear. The temptation is only mildly reduced by the awareness that characters like Fuld may, in response to biased strategic analysis, take less than optimal decisions, with all the risk that involves. Yet however strong the direct pressure may at times be, the incompatibility of yesmanship and proper analysis is obvious.

Choosing objectivity, however difficult that may be and no matter how much courage it may take, is the better option. The quote below, relating to a senior CIA manager, puts a fine point on this (George, 2008b):

“It was never difficult to respond to political pressure by saying that the CIA supported the president best when it provided the best and most comprehensive analysis possible.”

There is much to be said for being incorruptible: as an analyst and as a strategic analysis function, you are obligated to do your best to withstand pressure from anyone who aims to interfere with the objective outcome of an analysis. It may not be easy, but given the analyst’s raison d’être, there is no alternative.

17.4

GROUPTHINK

The best collective decisions are the product of disagreement and contest, not of consensus and compromise (Surowiecki, 2005c). When group-based consensus and compromise appear, the quality of decision-making is at risk, no matter the quality of the input data supplied to the group. Groupthink is appropriately defined as ‘collectively uncritical thinking’ (Janis, 1982c). The size of the collective matters here: this is an instance where small is not beautiful (Surowiecki, 2005c):

“Small groups can exacerbate our tendency to prefer the illusion of certainty to the reality of doubt.”

“Juries deliver verdicts each individual juror would disapprove of.”

Fortunately, the phenomenon of groupthink has been analyzed in quite some detail. In the classic study of groupthink, several US government policy fiascoes that resulted from this phenomenon were analyzed vis-à-vis various political group decision processes that had successful outcomes. The central theme of the analysis is (Janis, 1982d; 1982f):

“The more amiability and esprit de corps among the members of a policy-making in-group, the greater is the danger that independent critical thinking will be replaced by groupthink, which is likely to result in irrational and dehumanizing actions directed against out-groups. […]
Members of any small cohesive group tend to maintain esprit de corps by unconsciously developing a number of shared illusions and related norms that interfere with critical thinking and reality testing.”

The groupthink research from the 1980s has since been reviewed and criticized. One criticism is that the research was not sufficiently based on a randomized trial of cases (Sunstein, 2015c). In other words: the outcome may suffer from the availability bias discussed in section 12.10. Because we have only seen white swans, we prematurely conclude that all swans are white.

Despite methodological criticism of the 1980s study, a 2015 meta-analysis of subsequent research into group decision-making revalidates the 1980s research’s worrying conclusions. More recent research adds an extra dimension. When individuals, be they analysts or business leaders, enter a group decision-making process with a bias, the group decision tends to be more biased than the individual members were before deliberation began. In other words, group deliberations amplify the biases (Sunstein, 2015d):

“The larger point is that with group discussion, individual errors are often propagated and amplified, rather than eliminated. When individuals show a high degree of bias, groups are likely to be more biased, not less biased, than their median or average member. Here, then, is a major reason that groups fail.”
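One way to build intuition for this amplification effect is a toy simulation of group polarization. The sketch below is not Sunstein’s model – it simply assumes a polarization factor greater than one, meaning discussion pushes members beyond the group’s initial lean; every parameter is invented for illustration.

```python
import random

random.seed(7)

TRUE_VALUE = 100.0   # the quantity being estimated (e.g., market size)
ANCHOR = 100.0       # shared neutral reference point (e.g., last year's figure)
BIAS = 8.0           # each member's individual upward bias
NOISE = 5.0          # individual estimation noise
POLARIZATION = 1.6   # k > 1: discussion pushes the group beyond its initial lean

members = 7

# Pre-discussion: every member is somewhat biased upward.
pre = [TRUE_VALUE + BIAS + random.gauss(0, NOISE) for _ in range(members)]
pre_mean = sum(pre) / members

# Toy model of group polarization: during discussion, arguments supporting
# the prevailing lean dominate, so each member ends up past the initial
# group mean, in the direction the group already leaned.
post = [ANCHOR + POLARIZATION * (pre_mean - ANCHOR) + random.gauss(0, NOISE / 2)
        for _ in range(members)]
post_mean = sum(post) / members

print(f"Average individual bias before discussion: {pre_mean - TRUE_VALUE:+.1f}")
print(f"Group bias after discussion:               {post_mean - TRUE_VALUE:+.1f}")
# With k > 1 the group's collective error exceeds the average member's
# initial error: deliberation has amplified, not cancelled, the bias.
```

Under these assumptions the group ends up markedly further from the true value than its average member started out – which is precisely the failure mode Sunstein describes.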

These observations should send a loud and clear warning signal to us as strategic analysts. Given that the 2015 conclusion reconfirms the 1980s study on the risks of groupthink to decision-making in groups, I have decided to ignore the methodological concerns regarding the 1980s study. After all, as I’ve said earlier, I believe that all models are wrong, but some models are useful (Box, 2005). To assist in our understanding of groupthink dynamics I therefore continue to rely on the models and schemes of the 1980s research. I believe those are still sufficiently useful to our chapter’s purpose to share them below in more detail. Admittedly, I may be deceived by the quality of the 1980s’ narrative, but what a good narrative it is.

The relevance of groupthink to strategic analysis in business is significant. Strategic analysis stands for integrity and objectivity, resulting from neutral and independent critical thinking. As corporate custodians of critical thinking, analysts who encounter the social bias of groupthink may well face group pressure. Group dynamics may pose a challenge to analysts when, for example, a cohesive group of executives collectively (sometimes implicitly) agrees to a particular view regarding a competitor. Getting these executives as a group to accept a fact-based and rational, yet different, outlook on the competitor in question may be a daunting but, possibly to some readers, familiar task. This is no new phenomenon.

Paradoxically enough, business leaders normally do not even exhibit groupthink to favour their personal agenda. Resistance to change is what an analyst bearing news will often face, even when the change is for the better (Reger, 1994):

“Even beneficial change is often resisted by loyal [group] members who sincerely want what is best for the organization.”

Since analysts often encounter the challenge of presenting change, and often have to present their findings to groups of decision-makers, it seems justified to discuss groupthink in more detail. Janis developed a generalized dissection of the conditions that lead to, and the symptoms and consequences of, groupthink. This is reproduced in its entirety in diagram 17.1 (Janis, 1982g).


DIAGRAM 17.1 THEORETICAL ANALYSIS OF GROUPTHINK

ANTECEDENT CONDITION FOR GROUPTHINK: AN EXTERNAL THREAT TO A HOMOGENEOUS GROUP

Upon analyzing different policy-making cases, Janis identified three antecedent conditions that may lead to groupthink. The most important condition, labeled ‘A’, is that the group is intrinsically cohesive. Non-cohesive groups do not develop groupthink. Decision-making in non-cohesive groups may lead to an even lower probability of a preferred outcome, but that is beside the point and outside the scope of this discussion.

The second antecedent condition, labeled ‘B1’, relates to structural faults of an organization. Faults include the group not having established fundamentally sound working processes, such as a serious review of multiple options prior to decision-making. That is not to say that no working processes existed in groups that succumbed to groupthink. In two of the groupthink cases described by Janis,4 sticking to decision-making protocol was put before improving the quality of decision-making. Evidence for this, for example, was denying external (i.e., out-group) experts an opportunity to participate in meetings. In other words, self-censorship within the group led to a misuse of existing working processes to ensure group cohesion. In contrast to this, in the example that Janis provides of a group that did not develop groupthink, meetings were held that resembled brainstorming (Janis, 1982h). Even when the President of the United States attended the meetings, no formal agendas were used.

Other structural faults relate to the composition of the group. Overly homogeneous groups that intentionally do not allow consideration of multiple views are more susceptible to groupthink than less homogeneous groups.

The third antecedent condition, labeled ‘B2’, is that the group must face a provocative situational context. When a cohesive, overly homogeneous, insulated group is exposed to high levels of stress caused by external factors, options to resolve the issue are few. When in such a situation the group’s leader offers a clear solution, the probability of groupthink becomes significant. In three of the cases described by Janis,5 the perception of time pressure by the decision-makers was a stress-enhancing factor. Similarly, the need for strict secrecy was used in two cases6 as an excuse to prevent competing hypotheses from being tested. Secrecy requirements may thus be both a cause of groupthink and an instrument or excuse to protect a cohesive group against challenging outside views.

The homogeneous, insulated members of the in-group may start seeking concurrence – obtaining strength from the idea that they are at least in this together and that together they can work miracles.

SYMPTOMS OF GROUPTHINK INCLUDE THE ILLUSION OF INVULNERABILITY

Symptoms of groupthink are listed in box C. Rather than seeking multiple options, the in-group may develop shared illusions of invulnerability and moral righteousness – as opposed to the vilified out-group, which is stereotyped based on collective (and normally erroneous) rationalizations. Whereas moral righteousness probably relates more to a government context than a business environment, illusions of invulnerability apply to business just as well. A characteristic illusion of invulnerability is wishful thinking, which we discussed in chapter 14. In three of the cases described by Janis, wishful thinking became dominant – partly because accepting the reality of the situation had become too painful.7 Stereotypes and ethnocentric biases are close neighbors. In all four of the cases related to US foreign policy failures described by Janis, ethnocentric biases mattered. Due to the closed-mindedness of the in-group, these ethnocentric biases could be challenged neither by reviewing solid neutral data nor by inviting external experts.8

Once a group has developed its convenient set of stereotypes, members may even start to suppress each other’s expressions of doubt in the chosen option. In trying times, revisiting ideas, options and choices represents heresy.

THE KEY SYMPTOM OF GROUPTHINK IS THAT ALTERNATIVES ARE NOT PROPERLY WEIGHED

As a result of groupthink, decision-making will no longer be sound. Box D represents symptoms of defective decision-making. These include the incomplete surveying of alternatives and objectives, a lack of interest in new data and a biased view of new data that come in unsolicited. The bias of premature closing comes to mind here. New data are unwelcome or challenged to ensure the shared illusions are not compromised by new facts. Finally, box E represents the result of groupthink: a low probability of a favoured outcome of the decision-making. As it becomes more pronounced, groupthink is a phenomenon in which multiple biases may concurrently play a role. As we’ve repeatedly shown, groupthink is as likely to occur in a business environment as it is in the military and other branches of government. A 2014 study demonstrated that an overabundance of workplace cohesion is a potent catalyst for groupthink (Palmquist, 2015).

GROUPS MAKE BETTER DECISIONS THAN INDIVIDUALS

Prior to reflecting on the interface between groupthink in decision-making and the strategic analysis profession, a few remarks need to be made. The first is that Janis is anything but subtle when he starts his groupthink book. Janis opens with a well-known quote from Friedrich Nietzsche (Janis, 1982i):

“Madness is the exception in individuals but the rule in groups.”

Janis, however, also points to the unmistakable fact that groups tend to make much better decisions than individuals. This implies that abandoning group decision-making to prevent groupthink is not an option (Janis, 1982d). That would be like calling on Beelzebub to drive out the devil; hardly a comforting thought for those hoping to improve fact-based decision-making. Second, it makes sense to review the susceptibility of individuals to groupthink. Although Janis exclusively researched US-based cases in detail, he also points out that groupthink is not likely a culture-linked phenomenon to which only US-based business leaders are susceptible (Janis, 1982j). Third, the publication of Janis’ book in 1982 did nothing to suppress the existence of groupthink.9 Awareness of groupthink, like awareness of psychology, is no guarantee of people acting differently because of such awareness (Janis, 1982k):

“[…] none is immune to groupthink. Even individuals who are generally high in self-esteem and low in dependency and submissiveness are quite capable of being caught up from time to time in the group madness that produces the symptoms of groupthink.”

In particular, people with a need for affiliation prefer not to be rejected by group members. This is probably most true in trying times. In such cases, these people prefer to concur with a group position. They may develop into ‘mindguards’, even when they personally have doubts about the decisions at hand. I don’t want to jump to conclusions, and I admit that I have not done or read solid scientific research to prove this point, but groupthink versus sound decision-making seems to compare well with the proverbial hedgehog versus fox personalities in forecasting, as discussed in chapter 11.

When the antecedent conditions described above are present, a group may gradually swing towards groupthink, and in doing so display hedgehog-like behaviour. Like a hedgehog in its most defensive, rolled-up stance, the groupthink-infected collective aims to present the smallest possible surface per unit volume to the outside world. The group dogmatically closes its mind, takes on a defensive veil of secrecy and self-censorship and endeavours to obtain its information only from like-minded sources. In the comfortable womb of mutual in-group admiration, the group cherishes the negative stereotypes heaped on the out-group, knowing that they are so much better than the others. Janis uses the word ‘doctrines’ (Janis, 1982l), whereas Silver, talking about hedgehogs, refers to ‘ideologies’ (Silver, 2013e).

XENOCENTRIC GROUP PROCESSES MAY PREVENT GROUPTHINK

As foxes outsmart hedgehogs, good group processes may prevent groupthink tendencies from occurring in cohesive groups. Such processes, when operating properly, allow groups to remain the best fora for sound decision-making. Janis proposes major criteria for sound decision-making10 as well as three of what he calls tentative inferences to counteract groupthink tendencies (Janis, 1982m):

The group leader should actively encourage all group participants to openly voice their doubts and objections. This must be reinforced by the leader’s acceptance of criticism of their own judgments. That acceptance must be genuine, and voicing criticism must be safe for team members rather than career-damaging. If the team members doubt this, the leader will see them begin soft-pedaling their disagreements or not voicing them at all.

The group leader should not prematurely shut down policy or strategy discussion by letting their personal views be known. Briefings to a strategy-developing team should be neutral and limited to the scope of the problem and the availability of resources.

An organization may consider setting up multiple, parallel groups acting under different leaders but focusing on a similar assignment. As a result, truly independent courses of action may be developed, which may later be competitively tested as hypothetical options.

Janis’ work strongly underpins one of my core beliefs. Sound decision-making requires unbiased, broadly sourced intelligence that is developed with a xenocentric view. When commenting on decision-making by the Kennedy administration at the time of the Cuban Missile Crisis, Janis says this about one of the key drivers for the peaceful resolution of the most threatening episode of the Cold War (Janis, 1982o):

“The wording of the President’s letter [that defused the crisis] clearly conveyed the empathic view of the members of the Executive Committee toward the Russian leaders, reflecting their efforts to project themselves into the role of their counterparts in Moscow […]. Had the President and the Executive Committee thought about the enemy leaders in the usual stereotyped way, without considering how they would react if the roles were reversed, the necessary restraint probably would not have been achieved.
[…] Robert Kennedy [the President’s brother and Attorney General at that time] said ‘A final lesson of the Cuban missile crisis is the importance of placing ourselves in the other country’s shoes.’”

RECOGNIZING AND HANDLING GROUPTHINK IN M&A PROJECTS

Having reviewed these additional government examples, let’s once again turn our attention to the world of business. Table 17.1 provides an overview that correlates the antecedent conditions for groupthink in Janis’ model with business strategy practice. The table elucidates the potential applicability of the groupthink theory to an M&A team that may be supported by a strategic analysis function. In the case of M&A projects, the antecedent conditions are strongly present.

As a result, symptoms of groupthink can be quite apparent in M&A project teams. Table 17.2 relates the symptoms of the groupthink theory to the same M&A team example, which again may be supported by a strategic analysis function, justifying this in-depth discussion. This table shows that, in line with the antecedent conditions being present across the board, symptoms of groupthink may develop in M&A teams.

Table 17.3 relates the symptoms of defective decision-making due to groupthink to the roles and responsibilities of a strategic analysis function and an M&A team. As in the case of the antecedent conditions and the symptoms of groupthink itself, table 17.3 also shows that symptoms of defective decision-making are commonly observed in M&A processes.

Customers of strategic analysis, such as M&A teams, may be prone to groupthink tendencies, as shown in tables 17.1-17.3. The M&A team is only used as an example. This is not to suggest that M&A teams’ susceptibility to groupthink is much higher than that of other groups of corporate decision-makers. Similar tendencies may be observed in teams preparing major capital expenditures, those gearing up for new product launches, in management teams or on executive boards.

(SELECTED) ANTECEDENT CONDITIONS

TENTATIVE ASSESSMENT OF SUSCEPTIBILITY
TO GROUPTHINK IN A BUSINESS CONTEXT

A

Decision-makers constitute a cohesive group.

Effective business leadership teams tend to be cohesive.11 The cohesion factor may increase when the leader has the longest tenure in the team and has handpicked his team members.

B-1

Structural organizational faults.

Groupthink antecedent conditions are worked out below for a particular business setting – an M&A project team – because for such a team all the antecedent fault conditions matter. What is true for an M&A project team applies just as well to a strategy project team preparing a plan for a major capital expenditure or a major new product launch.

1. Insulation and secrecy.

Secrecy is particularly relevant in M&A projects, where share-price-related information is available. To minimize risks of leaking and of insider trading, information flows are limited to as small a group as possible. This may induce an inevitable, strong ‘in-group’ feeling among the team members – the happy few to whom the information is entrusted.

2. Lack of tradition of impartial leadership.

M&A teams do not tend to start with a neutral briefing; they set out to win a deal, and they know the project principal and top executives want that deal.

3. Lack of norms for methodological procedures.

M&A teams are formed for the occasion and may not have established processes in place. Time pressure, especially when the target company is offered in an auction process, is such that no time will be available to agree on methodological procedures for sourcing and weighing all the strategic data necessary to do a proper valuation and take a genuinely balanced look at the rewards and risks of the potential transaction. Risk areas are overestimation of future market growth and market value – both of which relate directly to strategic analysis work – and underestimation of post-merger integration issues and value risks.

4. Social and ideological homogeneity of the group.

M&A teams may also be relatively homogeneous, being overpopulated by staff with financial and legal backgrounds and often a similar educational/social background – Ivy League MBAs who started their careers in dogmatic top-end services firms (consultancy, law, investment banking), most of them working 70-hour-plus work weeks in their early 30s. Moreover, most members of the team probably share the same overriding motivation: we want this deal to happen to obtain our success fee or bonus.

B-2

Provocative situational context.

1. High stress – no alternative to the leader’s initial idea.

In an M&A project team, high stress due to external threats (i.e., other bidders) is inevitable. M&A teams may develop tunnel vision, becoming singularly focused on ‘if we don’t win this deal we will never be able to enter this market’. Such a vision implicitly subscribes to the antecedent condition proposed by Janis that there is no alternative to what the leader proposes: buy that company!

2. Low self-esteem.

In an M&A project team, low self-esteem is probably uncommon. Where it does occur, a cause for collective low self-esteem may be that one or more previous deals were lost to adversaries, whilst the pressure from top management to grow the business through acquisitions grows ever stronger.

TABLE 17.1 GROUPTHINK ANTECEDENT CONDITIONS APPLY TO M&A PROJECTS

C
(SELECTED)
SYMPTOMS

TENTATIVE ASSESSMENT OF SUSCEPTIBILITY
TO GROUPTHINK IN A BUSINESS CONTEXT

Overestimation of the group

1. Illusion of invulnerability

Gordon Gekko in the movie Wall Street was a caricature of an M&A professional who believed he could win anything and everything. A book like Barbarians at the Gate, describing the fight for control of RJR Nabisco in 1988, may highlight an extreme case (Burrough, 1991). Still, the illusion of invulnerability may infect M&A teams and their principals. This may result in M&A transactions closing at too high a price, which possibly explains why M&A has a modest success rate in corporate value creation at large. This is not to suggest that groupthink (alone) causes M&A teams and processes to realize less than optimal outcomes; proving that would require dedicated academic research. It does, however, suggest that the sorts of people who become successful in M&A tend to display higher than average self-confidence. A group of such players may develop an illusion of invulnerability.

Closed-mindedness

1. Collective rationalizations

M&A teams may develop collective rationalizations of the type: “The target operates in a sustainably strong market with tremendous potential” or “This currency will no longer devalue due to the maturing of this economy, so the weighted average cost of capital can safely be lowered” or “This is the only feasible option”.

2. Stereotypes of out-groups

In corporate M&A, two out-groups play a role. The first consists of those business leaders who see one of their peers command massive resources for a deal. In so doing this peer reduces the M&A budget available for other (read: their) deals. To avoid internal competition for resources, these peers are excluded from the in-group information on the deal at hand.

The second group consists of the interlopers who are to be beaten in the bidding process.

Pressure toward uniformity

1. Self-censorship

An M&A team may at times doubt the wisdom of a high valuation, but will not easily call off a deal because of such doubts when it feels the principal badly wants the deal.

2. Illusion of unanimity

The M&A team may have heated but useful internal disputes at times, but in the presence of the principal will (understandably) always speak with one voice.

3. Pressure on dissenters

An M&A team may have dissenters, but in the presence of the principal it is not common practice for a dissenter to loudly voice why a deal should not be pursued or why a valuation is excessive. Once ‘deal fever’ sets in, doubt is out.

TABLE 17.2 GROUPTHINK SYMPTOMS APPLIED TO M&A PROJECTS

D
SYMPTOMS

TENTATIVE ASSESSMENT OF SUSCEPTIBILITY
TO GROUPTHINK IN A BUSINESS CONTEXT

1. Incomplete survey of alternatives

M&A teams will always, and possibly rightly, blame corporate strategy departments for insufficient guidance on what to buy and why. Ideally, a target has indeed been identified and, based on thorough criteria-based screening, has been selected by corporate strategy. Still, even when an individual target has been singled out, multiple alternatives are possible in a transaction. These may at times not be completely surveyed, even though M&A teams normally do cover multiple alternatives when it comes to term sheets and deal structures.

2. Failure to examine risks of preferred choice

The M&A team is responsible neither for the market estimates used in the business case nor for post-merger integration. The risks involved in getting things wrong in either area may indeed be underestimated.

3. Failure to reappraise initially rejected alternatives

Once rationalizations such as ‘this is the ideal target’ have become embedded in an M&A team, the tunnel vision of groupthink may develop. In such cases, alternatives are overlooked. When in the process the going gets tough (or expensive or both), the earlier alternatives may no longer be revisited.

4. Poor data search

In the ideal case, the M&A team depends on the quality and quantity of its external (market) data intelligence. This is the role of strategic analysis in M&A. When, due to excessive secrecy, the M&A team decides to collect market information itself – such as through the back office of an investment bank involved – the threat of poor data search is just around the corner.

5. Selective bias in processing information at hand

Even when M&A teams fully depend on strategy analysts for their data supply and analysis, the outcome of the analysis may, due to inherent yesmanship, still be skewed to achieve a particular desired outcome. The increase in a target’s valuation when the target’s anticipated key market growth is raised by only 100 basis points is often remarkable – the sketch after this table illustrates the mechanism – and may just make the difference between getting or not getting board approval for a deal.

TABLE 17.3 GROUPTHINK-BASED DEFECTIVE DECISION-MAKING APPLIED TO M&A PROJECTS
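To see why a 100 basis point tweak matters so much, consider a stylized perpetuity-growth (Gordon) valuation. The sketch below is a deliberately simplified model with invented inputs – not any M&A team’s actual methodology – but it captures the sensitivity.

```python
def perpetuity_value(fcf: float, wacc: float, growth: float) -> float:
    """Gordon-growth (perpetuity) valuation: V = FCF / (WACC - g)."""
    assert wacc > growth, "WACC must exceed the growth rate"
    return fcf / (wacc - growth)

FCF = 100.0    # next year's free cash flow, in millions (made up)
WACC = 0.08    # weighted average cost of capital (made up)

base = perpetuity_value(FCF, WACC, growth=0.03)
bumped = perpetuity_value(FCF, WACC, growth=0.04)   # +100 bps growth

print(f"Value at 3% growth: {base:,.0f}m")
print(f"Value at 4% growth: {bumped:,.0f}m")
print(f"Uplift from +100 bps: {bumped / base - 1:.0%}")
# 100/(0.08-0.03) = 2,000m vs 100/(0.08-0.04) = 2,500m: a 25% uplift
# from a single 100 bps tweak to the growth assumption.
```

Under these assumed inputs, a quarter of the valuation appears or disappears on a single growth assumption that sits squarely in strategic analysis territory – exactly where yesmanship can do its quiet work.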

The analyst, once valued for her neutral contribution and subsequently integrated into, for example, an M&A team, may not remain immune to groupthink. This again is not new. Analysts at MI6 struggled to find an appropriate distance from a group of positive and encouraging executives who in wartime fought for a common cause with the intelligence staff (Jeffery, 2011b):

“It was constantly argued (with reason) that close coordination was highly desirable between the producers and consumers of intelligence. Only then could the intelligence agencies fully understand what was required and thus meet their customers’ requirements. But if the relationship were too close, and the understanding too complete, then there was a danger that the intelligence sought and provided might merely reflect the preconceived needs of the consumers.”

There is undisputed value in analysts and executives cherishing two-way data and intelligence traffic (Steinberg, 2008). Teamwork always beats the work of a lone wolf. Still, for the analyst working in a multi-disciplinary group, a delicate balance needs to be found between objectivity and distance on the one hand and trust and understanding on the other. This applies both to the relationships the analyst builds with individual executives and to those she builds with teams. An analyst who values her reputation for objectivity and integrity will intuitively manage this dilemma. That is not to say that analysts will always be in a position to correct social biases. That would be a naïve suggestion. Being well aware of what these biases look like – from antecedent conditions to outputs – may, however, assist analysts in proposing appropriate steps, as would outside counsel, to combat groupthink tendencies and/or forced yesmanship effectively.