Two
Conflict or: How organisations learn

‘It’s so damn complex. If you ever think you have the solution to this, you’re wrong and you’re dangerous.’

– H.R. McMaster

‘In the absence of guidance or orders, figure out what they should have been …’

– part of a sign on a command-post door in west Baghdad, commandeered by David Petraeus

1 ‘I watched them shoot my grandfather …’

On Saturday 19 November 2005, the weekend before Thanksgiving, a US marine ran into a family home about 150 miles outside Baghdad and began shooting children. By his own account, he ‘saw that children were in the room kneeling down. I don’t remember the exact number but only that it was a lot.’ He concluded that the children were hostile. ‘I am trained to shoot two shots to the chest and two to the head and I followed my training.’

The marine’s friend, Corporal Miguel Terrazas, a twenty-year-old from El Paso, was dead. A concealed bomb had blown his upper body apart. Two other marines had been wounded, and then a white Opel had approached the scene with five young Iraqi men in it – a possible threat. The young marines were shocked and under tremendous pressure.

What happened after the bomb exploded was pieced together by marine investigators and by journalists who questioned the marines’ account. The five Iraqi men were shot. One marine sergeant admitted that he had urinated on the head of one of the dead men, and claimed that they’d been shot while trying to surrender.

The marines then swept through the houses on the side of the road. Five-year-old Zainab Salem was killed. So was her sister Aisha, who was three. Five other members of the family were shot and killed. The only survivor was a thirteen-year-old girl who had been playing dead. A baby was killed in another house; a man in a wheelchair was shot nine times. Eman Waleed, who was nine, was sheltered with her eight-year-old brother by the bodies of adult relatives. ‘I watched them shoot my grandfather, first in the chest and then in the head,’ she told journalists. ‘Then they killed my granny.’ In total, twenty-four Iraqis died at the hands of the marines.

Almost as appalling as the killings in the town of Haditha was the fact that the sudden death of twenty-four civilians was accepted as routine. The battalion commander thought it was ‘very sad, very unfortunate’ but saw nothing worthy of investigating. His commander saw ‘nothing out of the ordinary, including the number of civilian dead’. The division commander agreed.

Haditha did not immediately damage the reputation of the US military in Iraq. Violent deaths were by then so common that they were no more noticed by most Iraqis than by marine officers. But Haditha was a symbol of the utter failure of the US strategy in Iraq. The US and their allies desperately needed the support of ordinary Iraqis, and they were failing to get it. Haditha was a symptom of the stress, frustration, fatigue and sheer isolation of the US occupying forces. The marines at Haditha saw their friend killed and had no adequate way to respond. Their tactics were failing and they had been given no effective strategy. The result was an atrocity.

2005 had been a dreadful year. 2006 turned out to be worse. On 22 February, the Golden Dome Mosque in Samarra was destroyed by a bomb – an act very roughly comparable to Catholics obliterating Westminster Abbey in London. It marked the beginning of a street-level civil war between the Shia Muslim majority whose holy site had been attacked, and who dominated the Iraqi government, and the Sunni Muslim minority who had been dominant under Saddam Hussein but who were being excluded from the post-Saddam order in Iraq. Some see the Samarra bombing as the trigger for the crisis; others argue that it was simply a mark of escalating tensions between Shia and Sunni. Car bombs became commonplace, but much of the violence was even simpler than that: one summer day in 2006, over fifty bodies were found in Baghdad alone. Each was handcuffed, blindfolded and shot. Shia militia would seize a Sunni man from a mixed area, take him to the edge of a Sunni district, shoot him in the back of the head, dump his body and drive off. Sunni insurgents also tried to clear mixed areas by picking off Shia one by one – first the barbers, then the estate agents, then the ice-sellers. A butcher was shot in the face in front of his customers; his adult son ran in and was shot too. His brother dashed in from the shop next door and met the same fate. Vast numbers of people fled the country, or moved from mixed to segregated areas, where they felt safer from casual violence.

Then there was Al Qaeda in Iraq (AQI), a vicious group of insurgents led by a Jordanian, Abu Musab al-Zarqawi, and pledging allegiance to Osama bin Laden’s network. AQI seized control of Iraqi towns one by one, humiliating tribal leaders – for instance, with a public beating – and if necessary assassinating them on the way to cowing the local population.

The US and allied response to the unfolding catastrophe was inept. The official policy remained that local police and army units were ready to stand up and be counted, but the official policy simply wasn’t working. Iraqi army units refused to move away from their local patches. The police in Baghdad were Shia-dominated and had no interest in stopping the violence. Under the guise of ‘pacification’, they would move into a Sunni area and confiscate weapons, and then withdraw, tipping off the local Shia militia that the Sunnis were now defenceless.

Iraq was falling apart, and allied casualties were rising alarmingly. It was clear to anyone on the ground that the country was sliding ever further from peace and good government. Failure looked almost inevitable. And the Haditha massacre – the shooting of young children and a man in a wheelchair – was not only a dreadful crime, but typified the isolation of the occupying forces from the people whose interests they were said to be serving. Strategies for dealing with insurgents such as AQI did exist, yet in 2005 and 2006 US forces seemed scarcely aware of them. The occupation of Iraq was failing beyond the worst nightmares of the Pentagon and the White House.

Yet by 2008, the situation in Iraq had turned around completely. AQI was in full retreat, and the number of attacks, American deaths and Iraqi deaths had fallen dramatically. The damage done by the ill-planned Iraq invasion cannot be undone, and the future of the country remains very uncertain. But it is undeniable that a fragile success was snatched from the jaws of utter failure. The lesson of how the US military did so is important, because it defies everything we want to believe about how any large organisation should deal with problems.

2 The ideal organisation

Take a look at any organisational chart in the world and you’ll see in a simple PowerPoint-friendly format the idealised view of how organisations make decisions. At the top, you have the leader: the CEO, the four-star General, the President. The leader is crucial: if he makes good decisions, all will be well. If he makes bad decisions, the organisation will suffer and may fail altogether.

And how should the leader make good decisions? That’s easy. First, he should take advantage of the fact that he’s in a position to see the big picture. The more technology he devotes to this task, the better he can see how everything fits together, enabling him to coordinate what’s happening on the ground, be it the check-out, the factory floor, or the front line. The leader should also be surrounded by a supportive team with a shared vision of where the organisation is going. And to ensure that the strategy is carried out effectively, reporting lines should be clear. Information should flow to the top and be analysed, and instructions should flow back down in response – otherwise nothing but muddle and chaos lie ahead.

But while this is how we instinctively think about how leadership works and how organisations should operate, it’s a dangerously misleading view. The problem is that no leader can make the right decision every time. Napoleon, perhaps the finest general in history, invaded Russia with half a million men and lost over 90 per cent of them to death and desertion. John F. Kennedy forced Khrushchev to back down during the Cuban missile crisis. Yet he will also be remembered for the Bay of Pigs fiasco, when he somehow persuaded himself both that 1400 US-trained Cuban exiles might defeat 200,000 troops and topple Fidel Castro, and that nobody would suspect that the US was involved. Mao Zedong was the greatest of all insurgent commanders, but a catastrophic peacetime leader whose blundering arrogance killed tens of millions of his own people. Winston Churchill offered fierce warnings about the rise of Hitler, and inspirational wartime leadership for the United Kingdom. But as the politician in charge of the British Navy in the First World War, Churchill forced through the disastrous Gallipoli campaign, which claimed the lives of tens of thousands of allied soldiers without any success. In war, politics and business, we face complex problems, and adversaries who have their own plans. It is simply impossible to be right every time. As the Prussian general Helmuth von Moltke put it, ‘No plan survives first contact with the enemy’. What matters is how quickly the leader is able to adapt.

If even the best leaders make mistakes, a good organisation will need to have some way to correct those mistakes. Let’s recall the features that make our idealised hierarchy an attractive machine for carrying out correct decisions: the refinement of information to produce a ‘big picture’; the power of a team all pulling in the same direction; and the clear responsibilities producing a proper flow of information up and down the chain of command. Every one of these assets can become a liability if the task of the organisation is to learn from mistakes. The big picture becomes a self-deluding propaganda poster, the unified team retreats into groupthink, and the chain of command becomes a hierarchy of wastebaskets, perfectly evolved to prevent feedback reaching the top. What works in reality is a far more unsightly, chaotic and rebellious organisation altogether.

3 Mr Rumsfeld’s ‘epiphany’

It is impossible to read a history of the Iraq war without concluding that the invasion was misconceived. What is more remarkable, though, is that the occupation was then conducted with staggering incompetence for years afterwards. How did the fiasco persist for so long?

A clue lies in a press conference given just after Thanksgiving in 2005 by the two most senior figures in the US defence establishment. Donald Rumsfeld, the defense secretary, stood side by side with the chairman of the Joint Chiefs of Staff, General Peter Pace. This was ten days after the Haditha massacre, but the subject of the briefing was the conduct of the war in general.

Several observers noticed something very odd about the press conference. Throughout it, Rumsfeld carefully avoided referring to ‘insurgents’. This was at a time when all three insurgencies – Sunni, Shia and Al Qaeda in Iraq – were on the rise. The quirk was so noticeable that a journalist asked the defense secretary why he was skirting around the word. Rumsfeld explained that he had enjoyed an ‘epiphany’ over Thanksgiving weekend. He’d realised that ‘this is a group of people who don’t merit the word “insurgency” ’.

General Pace couldn’t quite keep to his boss’s surreal script. At one point, he hesitated while describing the situation on the ground and sheepishly admitted, ‘I have to use the word “insurgent” because I can’t think of a better word right now.’ ‘ “Enemies of the legitimate Iraqi government”, how’s that?’ Rumsfeld interjected. When General Pace made the ‘insurgent’ slip later in the press conference, he offered a mock-rueful apology to Rumsfeld, to chuckles all round. General Pace also told a reporter that, ‘No armed force in the world goes to greater effort than your armed force to protect civilians.’ The facts about Haditha had hardly begun their slow crawl up the chain of command.

Rumsfeld’s Orwellian performance at a press conference would have been less remarkable had it been merely an isolated piece of bluster to the media, but it wasn’t. It had an impact on the day-to-day conduct of the war. It was becoming apparent that some kind of counterinsurgency strategy was needed, but that was hard to discuss without using the word ‘insurgent’. The fear of the ‘i-word’ had already trickled down through the military. One captain complained to the journalist George Packer about a general who visited his unit and announced, ‘This is not an insurgency.’ His unspoken response had been, ‘Well, if you could tell us what it is, that’d be awesome.’

Rumsfeld’s denial of reality also typified his refusal to take advice from men who understood the situation. One of the very first opportunities for feedback had come before the Iraq war even started. General Eric Shinseki had warned a Senate committee that several hundred thousand troops would be needed to deal with the aftermath of the invasion, two or three times the number Rumsfeld had allocated. General Shinseki was not only the Army’s chief of staff, but a former commander of peacekeeping forces in Bosnia. His comments, which later proved accurate, had been swiftly dismissed by Donald Rumsfeld’s deputy as ‘wildly off the mark’. Pentagon-watchers reported that General Shinseki had then been marginalised until his scheduled retirement a few months later.

A second opportunity for feedback came when Lt General John Abizaid had spoken to Rumsfeld and his third-in-command, Douglas Feith, six days into the war. Abizaid was the number two field commander in Iraq (he would later assume command of all US forces in the Middle East) and he was a man worth listening to. Of all the Army’s top brass, he was the authority on the Middle East. He had moved with his pregnant wife and toddler daughter to Iraq’s neighbour Jordan in 1978, living in humble accommodation in the capital, Amman. The family had embraced the local culture and Abizaid studied the Koran, witnessed the Jordanian response to the Iranian revolution, and travelled the country, earning the name ‘Abu Zaid’ from nomads. And after the first Gulf War, Abizaid had improvised a remarkable campaign in which he nudged Saddam Hussein’s army back from the Iraqi Kurds, preventing a massacre without a shot being fired. His commanding officer had called it ‘one of the greatest examples of military skill that I have ever seen’.

Twelve years on, the opening ‘shock and awe’ phase of the Iraq war had seemed to be going well. Abizaid, however, had plenty he wanted to discuss. Yet Rumsfeld didn’t detect an opportunity to learn something: he left the conference call after fifteen minutes with a cheery wave. So it had fallen to Feith to hear Abizaid’s opinion of how things were going. Abizaid had tried to share his profound disquiet. He knew from his previous Iraqi experience that ethnic and religious divisions ran deep, and was worried that the Pentagon had no plan for stabilising the country after Saddam’s inevitable fall. Abizaid had argued that the allies needed to win over tens of thousands of low- and mid-level employees of Saddam’s doomed Baath regime, including administrators, police and teachers. But Feith simply hadn’t been interested. He had interrupted Abizaid to declare, ‘The policy of the United States government is de-Baathification’ – the removal of all of Saddam’s party members, no matter how minor they were, and thus the removal of almost anyone in Iraq who knew anything about the functioning of the state. Abizaid had tried again, arguing that even the word was treacherous, laden with entirely misleading parallels with postwar Germany and ‘de-Nazification’.

Feith had responded with the tried-and-true debating tactics of a five-year-old. He simply repeated himself: ‘The policy of the United States government is de-Baathification’. General Abizaid’s concerns were subsequently justified in almost every detail.

It is only with hindsight that we know that Generals Shinseki and Abizaid were correct. Yet even when the war effort had fallen apart, Rumsfeld’s team continued to put their fingers in their ears. There was the case of Andy Krepinevich, a defence analyst who in September 2005 had written a sharp article in Foreign Affairs describing and arguing for a proper counterinsurgency strategy. Rumsfeld asked his advisers to have a word with Krepinevich, but when he was summoned for a breakfast briefing, rather than being asked for his advice, he was told he didn’t understand the situation on the ground. According to Krepinevich, one Rumsfeld aide even joked that they should abandon him on the deadly road to Baghdad airport. The aide in question denies making any threat, but the story hardly shows an eagerness to learn from outsiders.

It’s easy, and true, to blame the failures of the Iraq war on bad decisions at the top. But there was more going on than a simple failure of strategy. Strategic errors are common in war. This wasn’t just about going into Iraq with the wrong strategy. It was a failure – worse, a refusal – to adapt.

4 ‘A kind of family’

Drawing parallels between Vietnam and Iraq can be deceptive. Yet in one respect Vietnam and Iraq are eerie echoes of each other: in both cases, it was almost impossible for dissenting ideas, especially from the battlefield, to penetrate the war rooms at the Pentagon and the White House. The situation in Iraq changed only when dissenting ideas were given space to breathe; in Vietnam, they never were.

The authoritative study of decision-making as the US was sucked into Vietnam was published in 1997, and based on a PhD thesis which itself relied on newly declassified documents. Its author, H.R. McMaster, was so incensed by the failures of President Lyndon Johnson, his defense secretary Robert McNamara, and the generals on the Joint Chiefs of Staff, that he called his book Dereliction of Duty.

McMaster’s book shows clearly how the ideal hierarchy can backfire. Remember the three elements of the idealised, decisive hierarchy: a ‘big-picture’ view produced by the refined analysis of all available information; a united team all pulling in the same direction; and a strict chain of command. Johnson and McNamara managed to tick all those boxes, yet produce catastrophic results. The ‘big-picture’ information that could be summarised and analysed centrally wasn’t the information that turned out to matter. A loyal, unified team left no space for alternative perspectives. And the strict chain of command neatly suppressed bad news from further down the organisation before it reached Johnson. Donald Rumsfeld was later to repeat the same mistakes, and the turnaround in Iraq came only when the US military abandoned its chain of command, love of unanimity and its aspirations to make big-picture decisions.

Robert McNamara was famous for his love of quantitative analysis, which he perfected to such effect at the Ford Motor Company that he was appointed the first Ford president outside the Ford family – before, just a few weeks later, being poached by John F. Kennedy and made defense secretary. McNamara thought that with enough computers and enough Harvard MBAs, he could calculate the optimal strategy in war, far from the front lines. That project brought the US Army no joy in Vietnam, but its spirit continued to animate Donald Rumsfeld. Even more damaging, though, was McNamara’s management style.

H.R. McMaster shows that Lyndon Johnson and Robert McNamara were made for each other. Johnson, an insecure man with the Presidency thrust upon him by John F. Kennedy’s murder, was eager for reassurance and disliked debate. McNamara was the quintessential yes-man, soothing Johnson at every step and ruthlessly enforcing the President’s request to hear a single voice. Shortly after becoming President, and with the 1964 Presidential election looming closer, Johnson hosted a lunchtime discussion each Tuesday with three senior advisers, including McNamara. No military specialists were present, not even the chairman of the Joint Chiefs of Staff. McNamara and Johnson both distrusted the military – indeed, shortly after taking office, Johnson had sacked three military aides because ‘they get in my way’.

Johnson and his advisers saw Vietnam primarily as a political football that might stall or strengthen Johnson’s Presidential campaign. His three aides, who viewed themselves as ‘a kind of family’, were careful always to harmonise their advice before meeting Johnson, which was just the way he liked it. McNamara himself looked for ‘team players’, declaring that it was impossible for a government to operate effectively if departmental heads ‘express disagreement with decisions’ of the President. This was the idealised organisation at its worst. Loyalty wasn’t enough. Merely to ‘express disagreement’ was a threat.

A famous set of experiments by the psychologist Solomon Asch shows why the McNamara–Johnson doctrine of unanimous advice was so dangerous. The classic Asch experiment sat several young men around a table and showed them a pair of cards, one with a single line, and one with three lines of obviously different lengths, labelled A, B and C. The experimenter asked subjects to say which of the three lines was the same length as the single line on the other card. This was a trivially easy task, but there was a twist: all but one of the people sitting around the table were actors recruited by Asch. As they went round the table, each one called out the same answer – a wrong answer. By the time Asch turned to the real experimental subject, the poor man would be baffled. Frequently, he would fall in with the group, and later interviews revealed that this was often because he genuinely believed his eyes were deceiving him. As few as three actors were enough to create this effect.

Less famous but just as important is Asch’s follow-up experiment, in which one of the actors gave a different answer from the rest. Immediately, the pressure to conform was released. Experimental subjects who gave the wrong answer when outnumbered ten to one happily dissented and gave the right answer when outnumbered nine to two. Remarkably, it didn’t even matter if the fellow dissenter gave the right answer himself. As long as the answer was different from the group, that was sufficient to free Asch’s poor subjects from their socially-imposed cognitive straitjacket.

In a surreal variant, the psychologists Vernon Allen and John Levine ran a similar visual test with an elaborate pantomime in which one of the experimental participants had extravagantly thick glasses, specially manufactured by a local optometrist to look like bottle-bottoms. This Mr Magoo character – another actor – then started raising concerns with the experimenter. ‘Will the experiment require any distance vision? I have a lot of trouble seeing objects that are some distance away.’ After a series of set-pieces designed to fool the real subject into believing that Mr Magoo could hardly see his hand in front of his face, the experiment began and of course Magoo kept getting things wrong. Again, subjects found it very hard to disagree with a unanimous – and wrong – group verdict. Again, a single dissenting voice was enough to liberate the subjects. And, astonishingly, this liberation took place even if the fellow dissenter was just poor old Magoo yelling out completely the wrong answer.

An alternative perspective on the value of an alternative perspective comes from the complexity theorists Lu Hong and Scott Page. Their decision-makers are simple automatons inside a computer, undaunted by social pressure. Yet when Hong and Page run simulations in which their silicon agents are programmed to search for solutions, they find that a group of the very smartest agents isn’t as successful as a more diverse group of dumber agents. Even though ‘different’ often means ‘wrong’, trying something different has a value all of its own – a lesson Peter Palchinsky learned as he travelled the industrial hubs of Europe. Both because of the conformity effect Asch discovered, and because of the basic usefulness of hearing more ideas, better decisions emerge from a diverse group.
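The flavour of Hong and Page’s simulations is easy to capture in a few lines of code. The sketch below is a toy reconstruction, not their actual model – every parameter and name in it is an illustrative assumption. Each agent is nothing more than a small bundle of ‘step sizes’ for searching a rugged landscape of possible solutions, and a team works in relay: one agent climbs until it is stuck, then hands its best position to the next. Hong and Page’s finding was that, in set-ups of this kind, a randomly assembled and therefore more varied team tends to beat a hand-picked team of the strongest solo climbers.

```python
import random

random.seed(1)
N = 2000                                          # positions on a circular landscape
landscape = [random.random() for _ in range(N)]   # value of each candidate solution


def climb(start, heuristic):
    """Hill-climb from `start`, repeatedly trying the agent's step sizes
    and moving whenever a step leads to a better position."""
    pos = start
    improved = True
    while improved:
        improved = False
        for step in heuristic:
            nxt = (pos + step) % N
            if landscape[nxt] > landscape[pos]:
                pos, improved = nxt, True
    return pos


def team_search(team, start):
    """Relay search: each agent climbs until stuck, then passes the position
    on; stop when a full cycle of the team produces no improvement."""
    pos = start
    while True:
        new_pos = pos
        for heuristic in team:
            new_pos = climb(new_pos, heuristic)
        if new_pos == pos:
            return landscape[pos]
        pos = new_pos


def average_score(team, starts):
    return sum(team_search(team, s) for s in starts) / len(starts)


# Each agent knows three step sizes - its own narrow 'perspective' on the problem.
agents = [random.sample(range(1, 20), 3) for _ in range(60)]
starts = random.sample(range(N), 50)


def solo_score(heuristic):
    """How well an agent does climbing alone, averaged over many starting points."""
    return sum(landscape[climb(s, heuristic)] for s in starts) / len(starts)


best_team = sorted(agents, key=solo_score, reverse=True)[:9]   # the strongest soloists
diverse_team = random.sample(agents, 9)                        # picked at random

print('Team of best agents:   ', round(average_score(best_team, starts), 3))
print('Randomly diverse team: ', round(average_score(diverse_team, starts), 3))
```

The interesting design choice is the relay: an otherwise unremarkable agent with unusual step sizes can dislodge the group from a local peak on which the ‘smart’ agents, sharing similar heuristics, would all get stuck together.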

The doctrine of avoiding split advice, then, couldn’t have been more misguided. The last thing Lyndon Johnson needed was to be confronted with a unanimous view. He desperately needed to hear disagreement. Only then would he feel free to use his own judgement, and only then would he avoid the trap of considering too narrow a range of options. Even an incompetent adviser with a different perspective – the foreign-policy equivalent of Allen and Levine’s fake Mr Magoo – would probably have improved Johnson’s decision-making. But unanimity was what Johnson wanted, and McNamara made sure that he got it.

To add to the trouble, Johnson set up a clear, idealised chain of command and insisted that nobody stepped outside it. Rather than speaking directly to the Joint Chiefs of Staff (who, to Johnson’s discomfort, often disagreed with each other), he used the JCS chairman, and McNamara, to filter the news. Johnson probably didn’t realise how much was being hidden from him. H.R. McMaster’s book gives a telling example: when the Joint Chiefs of Staff commissioned a war game called SIGMA I in 1964, it largely predicted what later happened – a dismal and inexorable escalation into full-blown war. McNamara dismissed SIGMA I because his number-crunching analysts were producing a different conclusion. Johnson never saw the results of SIGMA I. This incident was typical of the abysmal communication between Johnson and his military advisers.

It would be tempting to blame McNamara alone for that – were it not for the fact that the Chiefs of Staff had tried to speak to Johnson through alternative, unofficial routes and the President had made it quite clear he wanted the military to talk to him ‘through the McNamara channel’. Johnson talked only to his political advisers, and his decisions gave him short-term political success and eventual military disaster. The idealised hierarchy backfired with a vengeance, the wrong decisions taken by a team all pulling in the wrong direction, and the chain of command serving as a perfect barrier to the upward flow of vital information. As H.R. McMaster concludes, between November 1963 and July 1964, Johnson ‘made the critical decisions that took the United States into war almost without realizing it’.

Forty years later, Donald Rumsfeld’s refusal to listen to dissenting advice was dooming the allied forces in Iraq. Yes, the strategy was bad, but what was truly unforgivable was that Rumsfeld was preventing it from getting better. H.R. McMaster’s book had documented a systematic failure to learn at the top of the US military establishment. Nothing, it seems, had changed.

5 The Tal Afar experiment

The US turnaround in Iraq had, in fact, begun months before the Haditha massacre and Donald Rumsfeld’s bizarre press conference – it was just that Rumsfeld didn’t know it.

The first glimmerings of success came in a place called Tal Afar, in the spring of 2005. Tal Afar is an ancient Iraqi city with a quarter of a million citizens, not far from the Syrian border. US forces had repeatedly driven insurgents out of Tal Afar, but each time the Americans withdrew, the insurgents returned. By the end of 2004, Tal Afar was a stronghold for Sunni extremists and a jewel in the crown of Abu Musab al-Zarqawi, the Jordanian terrorist who ran ‘Al Qaeda in Iraq’. Always a smugglers’ town, Tal Afar had become the destination of choice for foreign insurgents arriving from Syria, where they could be equipped, trained, and despatched against Shias, the US forces, and collaborators.

At this time, much of the US Army in Iraq was stationed in Forward Operating Bases, FOBs. Some of the FOBs were enormous, four miles or more along each side, with scheduled bus services to get soldiers around the base. FOBs offered soldiers some of the comforts of home, including Baskin Robbins ice cream, cinemas, swimming pools, and even stores where they could buy consumer electronics. The neat concrete symmetry of a FOB would have delighted many a modernist architect, and it made a certain amount of sense tactically, because FOBs in the middle of the desert were almost impregnable against a ragtag bunch of terrorists. Soldiers could be supplied more easily (the support staff were given the not entirely affectionate title, ‘Fobbits’), even if one captain was overheard commenting with black humour that their mission was ‘to guard the ice-cream trucks going north so that someone else can guard them there’. In other words, US strategy in Iraq had collapsed into ‘don’t get the soldiers killed’. And frankly, if not getting killed was the only strategic objective, it could have been accomplished better by moving the troops to Colorado or Texas.

‘Day-tripping like a tourist in hell’ was how one counterinsurgency expert described the armoured sorties from the FOBs. Operating from such isolation, US forces were able to do little more than sweep through cities such as Tal Afar, hoping to kill some bad guys. Not many of these sweeps backfired as badly as the Haditha massacre, but few produced any valuable results. The trouble was that the insurgents could vanish simply by dropping their weapons and walking into any crowd. The people of Tal Afar may have known the difference, but the American soldiers did not, and the people of Tal Afar were not about to tell them.

One American counterinsurgency guru, John Nagl, who served in Iraq in 2003 and 2004, quickly discovered how little cooperation he could expect. On his first day in Iraq, Major Nagl sent one of his captains down to the police station to befriend the local police. Seeing the American coming, the police deserted the building, leaping out of the rear windows and scurrying in all directions as though somebody had discovered a bomb in the basement. Assuming the young captain must have blundered, Nagl went there himself the next day and earned the same reaction. Nagl eventually got his wish for a joint patrol: a policeman walked a couple of yards ahead with Nagl’s rifle in his back. Despite all his expertise in counterinsurgency – Nagl has a doctorate from Oxford in the subject – it was only later that he figured out why the police wouldn’t cooperate.

So why didn’t locals help the American forces? The conventional wisdom was that the Americans were simply losing a popularity contest with the insurgents. Even experienced American commanders such as General Abizaid – who by then was responsible for all US forces in the Middle East – believed that the fundamental problem was that US forces were like an organ transplant that was being rejected. There could be no peace until the US forces withdrew, and perhaps not even then.

It took a while for the penny to drop: although some Iraqis did hate the Americans, most weren’t refusing to collaborate out of hatred. They were refusing to collaborate out of fear. Anyone who helped the American soldiers on one of their sweeps would be murdered when the soldiers withdrew. That was why Major Nagl could get ‘help’ only at gunpoint. It was why Iraqi teachers made excuses when American soldiers suggested that the Iraqi elementary schools set up pen-pal relationships with American elementary schools – it was too risky, however enthusiastically the Iraqi children might have penned their friendly letters. And it was why, as long as the Americans restricted themselves to temporary sweeps through Iraq’s cities, they would help no one and be helped by no one.

So Tal Afar remained an insurgent stronghold, and with Sunnis running the streets while Shia policemen sallied forth in murder squads at night, it was also a microcosm of Iraq’s growing civil war.

Into this mess walked the 3rd Armored Cavalry Regiment, 3500 men led by an officer we’ll call Colonel H. The Colonel is affable company: his short muscular frame and leathery bald pate might make him look thuggish, if that impression wasn’t continually undermined by a cheeky wit and an impish smile that keeps bursting out during conversation.

Col. H. had quite a reputation. He was a war hero, having captained American tanks in a celebrated battle during the Gulf War in 1991. But Col. H. also had a record as a thinker, and a courageous thinker at that. And as he prepared to turn the tide in Tal Afar, Col. H. was thinking that the US strategy in Iraq made no sense.

Victory in Tal Afar was going to require that Col. H.’s men adapt quickly. Before they had even left American soil, Col. H. had been training them, buying pocket histories of Iraq in bulk, instructing his men to behave more respectfully towards Iraqis, and role-playing difficult social interactions at a mocked-up checkpoint in Fort Carson, Colorado. His soldiers would pretend to deal with drunks, pregnant women, suspected suicide bombers, and then watch videos of the encounters and discuss how to learn from the mistakes they had made. ‘Every time you treat an Iraqi disrespectfully, you are working for the enemy,’ Col. H. told his men.

Arriving at Tal Afar, Col. H.’s regiment moved slowly into the city, securing it block by block. His lieutenants organised repeated, painstaking discussions with the local power-brokers. They tried to reconcile moderate Sunni nationalists with the Shia, to reform the Shia police force and make it representative of the entire city. They brought in a new mayor, a Baghdadi who didn’t even speak the local language, but at least he had no axe to grind. They established twenty-nine small outposts throughout the city: no ice cream or swimming pools, and indeed no hot water or regular cooked meals. But Col. H.’s men refused to cede those little bases, no matter how ruthlessly they were attacked.

For the more extreme end of Tal Afar’s warring factions, no act seemed too evil to contemplate. ‘In one case,’ recalls Col. H., ‘terrorists murdered a young boy in his hospital bed, booby-trapped the body, and when the family came to pick up the body they detonated the explosives to kill the father.’

Police recruits were murdered when someone walked into their midst with explosives strapped to her body. It wasn’t a willing suicide bomber but a mentally disabled thirteen-year-old girl, accompanied by a toddler whose hand she had been asked to hold as she walked towards the line of recruits.

For a few weeks, Col. H.’s men took heavy casualties in tough conditions. But then, an apparent miracle: the people of Tal Afar began to cooperate with the Americans and – slowly, reluctantly – to talk with each other. The more moderate among the warring factions put down their weapons. The true terrorists fled, or were killed or captured when local townsfolk turned them in. Few people, after all, really wish to harbour men who use disabled girls as bombs and toddlers as camouflage. ‘It happened with astonishing speed’, said Col. H., but the truth is that it happened the moment most people became convinced that the Americans weren’t going to abandon them to the revenge of Al Qaeda in Iraq.

It would be hard to exaggerate quite how far Col. H. stuck his neck out when he pacified Tal Afar. His strategy was little short of a rebellion against his own commanding officers, General Casey and General Abizaid. He apparently had little time for Donald Rumsfeld’s Orwellian epiphany, bluntly telling journalists that ‘militarily, you’ve got to call it an insurgency, because we have a counterinsurgency doctrine and theory that you want to access.’ He also short-circuited the chain of command, speaking freely to senior officers who were not his immediate superiors. Those immediate superiors gave him little backing. One of them warned him ‘to stop thinking strategically’ – that is, to shut his big mouth and stop thinking above his rank. When he asked for 800 men as reinforcements, he received no response at all, and later figured out that his request had never been passed up the chain of command. And later, according to one account, when General Casey was pinning a medal on Col. H.’s chest in recognition of his achievements at Tal Afar, he warned him that he was making too many enemies among his commanding officers – for his own sake, Col. H. needed to listen more and argue less.

Think back to the idealised organisation, and you see that Col. H. succeeded by violating every one of its principles. He ignored strategic direction from his superiors if he felt it was poor strategy. When the hierarchy suppressed his views, he communicated them by talking to journalists. He didn’t rely on ‘big-picture’ information, focusing instead on the specifics of the situation on the ground in Tal Afar and delegating authority to the junior officers who commanded his urban outposts.

Col. H. improvised one of the few successful responses to the Iraqi insurgency at great physical risk to himself and his men. (When I first spoke to him, he was recuperating after a hip replacement, the consequence of injuries sustained courtesy of a bomb in Iraq.) What is more amazing is that he did so by shrugging off the weight of every link in the chain of command above him. He paid a price for his courageous independence. Despite his early promise, a PhD in history, and his proven achievements both in Desert Storm and at Tal Afar, Col. H. was twice passed over for promotion to Brigadier-General – the junior general’s rank – first in 2006 and again in 2007. His superiors focused not on his performance, but on what they saw as a troublemaker’s attitude. As early retirement beckoned for Col. H., a growing band of counterinsurgency geeks began to grumble that this was no way for the Army to treat its most brilliant colonel.

It is a rare soldier – indeed a rare character altogether – who takes such risks with his own career. But there was a simple explanation: Col. H. was H.R. McMaster, the author of Dereliction of Duty, the definitive account of how faulty leadership from the President, the secretary of defense and the senior Army generals had led to disaster in Vietnam. He literally wrote the book on how an organisation can fail from the top down. And if he had any say in the matter, he wasn’t going to let the US Army defeat itself a second time.

6 ‘How to win the war in al Anbar by Cpt. Trav’

McMaster’s achievements in Tal Afar were a rare bright spot in a dismal year for the Americans in Iraq. But they were not the only bright spot. Several other commanders either imitated McMaster’s experiment or came to similar conclusions independently. The most important was Col. Sean MacFarland. MacFarland’s men started in Tal Afar, where they saw what McMaster had achieved. Then they were moved to the city of Ramadi in al Anbar province, 60 miles west of Baghdad.

MacFarland immediately grasped that the official strategy – keep out of harm’s way, train the Iraqi army, and then go home – was in desperate trouble. At a graduation ceremony for almost 1000 Iraqi soldiers, just before MacFarland had arrived, many ripped off their uniforms and deserted on the spot when they heard they were to be deployed outside al Anbar. The Iraqi forces officially assigned to support MacFarland had also mutinied. Ramadi wasn’t suffering from sectarian fighting as Tal Afar had, because Ramadi was largely Sunni. But just as in Tal Afar, Al Qaeda in Iraq (AQI) had moved in and was all but running the city. Locals were terrified of being seen anywhere near the Americans.

MacFarland learned from McMaster’s approach, despite a sceptical response from his superior officers, and he adapted it as necessary to deal with local circumstances. Through the summer of 2006, he pushed into Ramadi and gradually established eighteen small bases. AQI was immediately put on the defensive; rather than watching the front gates of a huge Forward Operating Base to learn when the next American patrol was to show up, AQI now had to cope with the fact that they were sharing Ramadi with their enemies. The response was violent, as AQI poured efforts into attacking the outposts, US convoys, and especially the sheikhs whom MacFarland was beginning to win round as allies. At the time the ferocity of the response was alarming; in retrospect, it was a sign of desperation. Emboldened by the solidity of the American presence on the ground, the local sheikhs turned against AQI, and within months, the terrorist organisation in Anbar province had collapsed.

No matter how determined Donald Rumsfeld was to learn nothing from the implosion of the US strategy, on the ground, US soldiers were adapting. Good advice was passed around like a girlie magazine among schoolboys. There was David Kilcullen’s ‘28 Articles: Fundamentals of Company-Level Counterinsurgency’, a spiky set of tips that Kilcullen said he wrote with the aid of a bottle of whisky and which was widely circulated by email. (Kilcullen, an Australian soldier and anthropologist hired by the Pentagon, evidently enjoyed his semi-detached status from the US Army and was even more of a maverick than McMaster. One of his notorious pronouncements: ‘If I were a Muslim, I’d probably be a jihadist’; another: ‘Just because you invade a country stupidly doesn’t mean you have to leave it stupidly.’)

It should be no surprise that soldiers on the front line were far quicker to seek out good advice, and far more eager to adapt, than their senior officers. ‘We willingly implement lessons learnt at the bottom end, because changing and adapting low-level tactics saves lives,’ one British general told me with an air of resignation. ‘But we rarely adapt and implement lessons learned at the top end.’

Another famous piece of bottom-up advice was ‘How to win the war in al Anbar by Cpt. Trav’ – an eighteen-slide PowerPoint presentation which conveys more insight than the top brass picked up in the first three years of the occupation, using stick figures and explanations that would suit an eight-year-old. (‘On the right is an insurgent. He is bad. On the left is an Iraqi Man, who is not an insurgent but who is scared of them … There’s Joe and Mohammed! They don’t know if these are good Iraqis or bad Iraqis. What to do?’) ‘Cpt. Trav’ is a witty counterinsurgency mentor, but he also – like McMaster and Kilcullen – displays a streak of sedition. One slide shows one of the sheikhs, leaders of the local people ‘for approximately 14,000 years’, coping with the rules that cut them out of government – courtesy of the incompetently led US civilian authorities in Iraq, or as Cpt. Trav puts it, ‘25 year olds from Texas, and Paul Bremer’.

Cpt. Trav was Captain Travis Patriquin, one of Sean MacFarland’s men, a young Arabic-speaking special forces officer who befriended the sheikhs of al Anbar. Like all good kids’ stories, there’s a happy ending to Cpt. Trav’s tale: ‘The Sheik brings more Sheiks, more sheiks bring more men. Joe realizes that if he’d done this three years ago, maybe his wife would be happier, and he’d have been home more … Joe grows a moustache, because he realizes that Iraqis like people with moustaches and have a hard time trusting people without one.’

Captain Patriquin, of course, sported his own moustache. But there was to be no happy ending for him. He was killed by a roadside bomb three weeks before Christmas 2006, leaving behind his wife and three young children. At his memorial service, the local sheikhs turned out in force.

7 ‘It’s my job to run the division, and it’s your job to critique me’

The conventional story of how the US military recovered from a near-impossible situation in Iraq is a simple one. The problem was that the US had a bad strategy and bad leaders: President Bush and Donald Rumsfeld. The solution came when President Bush – with a little nudge when the voters gave his party a kicking in the 2006 elections – replaced Rumsfeld with Robert Gates, and Robert Gates appointed General David Petraeus to replace General Casey. Good leaders replace bad leaders; good strategy replaces bad strategy; problem solved.

This is not only the story we tell ourselves about Iraq, but the story we tell ourselves about how change happens: that the solution to any problem is a new leader with a new strategy, whether it’s the new coach of a football team, the new chief executive of a failing business, or a new president. The truth, both in Iraq and more widely, is more subtle and far more interesting.

General Petraeus didn’t invent the successful strategy while out for one of his eight-mile runs, and then hand out the orders as though promulgating the Ten Commandments. He did something far rarer and more difficult: he looked further down the ranks, and outside the armed forces entirely, searching for people who had already solved parts of the problem that the US forces were facing.

It’s not that David Petraeus was an empty vessel for the ideas of others. He commanded American forces in Mosul, the largest city in northern Iraq, in 2003. Like McMaster, he ignored much of what he was being ordered to do by his superiors – in particular, when the order came to sack anyone associated with Saddam Hussein’s Baath party, he dodged it, leaving the newly elected governor of Mosul, a Baathist, in his post. Petraeus then figured out a legal fudge that gave him the authority to open the border with Syria – ignoring the State Department’s attempts to freeze out the Syrians. (The joke was that under Petraeus, the 101st Airborne was the only division in the US military with its own foreign policy.) Then he shrugged off the objections of the US civilian authorities in Baghdad by raising prices for locally grown wheat. Petraeus figured that a free-market approach might sound attractive, but that his own price floor would create supporters, because farmers would be better off than they had been under Saddam Hussein.

General Petraeus was the only divisional commander to run a successful campaign in the first year of the war. He was rewarded for his success – and his borderline insubordination – by being passed over for the combat appointment he craved, and instead handed first the job of training the Iraqi police, and then a backwater job: training and education at Fort Leavenworth, 7000 miles from Iraq. It was like Peter Palchinsky being assigned to a consultant’s role in Siberia, and the pedigree was unpromising. Petraeus’s predecessor at Leavenworth had been sent there, apparently as a punishment, after guilelessly remarking to a reporter that the US had been caught by surprise during the invasion of Iraq.

But Petraeus realised that from Fort Leavenworth, he had the opportunity to influence American strategy in the most profound way possible: from the bottom up. He set himself the task of rewriting Army doctrine on counterinsurgency. Such doctrine rewrites were usually non-events, merely writing down whatever tactics the Army had adopted. On rare occasions, though, they transformed the Army, with soldiers in the field reading the new doctrine and changing the way they thought and acted.

Petraeus was determined that this would be one of those doctrine rewrites that mattered. And he realised what Donald Rumsfeld, Robert McNamara and President Johnson did not: that the right decisions are more likely when they emerge from a clash of very different perspectives. Petraeus had already been a high-ranking evangelist for David Kilcullen’s ‘28 Articles’. Now he asked the loud-mouthed Kilcullen to join him at a conference in Fort Leavenworth to help develop the Army’s counterinsurgency doctrine. He also invited a British officer, Brigadier Nigel Aylwin-Foster, who had excoriated the American Army, accusing it of a cultural insensitivity bordering on institutional racism. (The Guardian commented that ‘What is startling is the severity of his comments – and the decision by Military Review, a US army magazine, to publish them.’ But Military Review was the Fort Leavenworth magazine – under the control of General Petraeus.) There was John Nagl, who’d learned the counterinsurgency trade in Oxford and then Baghdad, and Kalev Sepp, another counterinsurgency expert who was an outspoken critic of the US strategy. Petraeus didn’t just seek out internal dissidents, but also officials from the State Department and the CIA, journalists, academics, and even human rights advocates. After opening the conference, Petraeus made a point of sitting next to Sarah Sewall, the director of a human rights centre at Harvard. One of the journalists at the conference commented that he had never seen such an open transfer of ideas in any institution.

H.R. McMaster – Colonel H. – was still in Tal Afar as the doctrine began to be drafted, but Petraeus’s team sought his advice via email. ‘H.R. was conducting counterinsurgency in Tal Afar and we used Tal Afar as a case study in real time,’ says John Nagl. ‘So we’re writing the Tal Afar case study and emailing it to him and he’s “Wikipedia-ing” it. Correcting it as we go along. And he’s also saying, “Car bomb, gotta go.” ’ While Rumsfeld had closed his eyes to what was going on in the front line, Petraeus managed to get a ringside seat from 7000 miles away.

This openness to new ideas might have seemed surprising. General Petraeus had a reputation for arrogance, as well as having much to be arrogant about. Petraeus famously described his experience in Mosul as ‘a combination of being the president and the pope’, and one colleague told the journalist Thomas Ricks that ‘David Petraeus is the best general in the U.S. Army, bar none. He also isn’t half as good as he thinks he is.’

But Petraeus received an education in the importance of feedback back in 1981, when as a lowly captain he was offered a job as an aide to Major General Jack Galvin. Galvin told Petraeus that the most important part of the job was to criticise his boss: ‘It’s my job to run the division, and it’s your job to critique me.’ Petraeus protested but Galvin insisted, so each month the young captain would leave a report card in his boss’s in-tray. It was a vital lesson for an officer unwilling to admit mistakes. Galvin himself had learned the hard way about the importance of feedback: a Vietnam vet, he had been relieved of his first assignment after refusing his commander’s instruction to inflate an enemy body count. Later, Galvin was asked to be one of the writers of a confidential – and, it turned out, explosive – history of the US involvement in Vietnam. It was leaked to the New York Times and became known as the Pentagon Papers. Galvin was a man who understood that organisations which ignore internal criticism soon make dreadful errors, and he made sure that Petraeus learned that lesson.

Jack Galvin also taught Petraeus that it is not enough to tolerate dissent: sometimes you have to demand it. Galvin ordered Petraeus to speak frankly to him despite Petraeus’s reluctance to criticise a superior officer. It was exactly the right example to set, because there are many instances of leaders failing to get a frank discussion going despite being far more open to disagreement than Donald Rumsfeld or Lyndon Johnson.

The classic example is the Bay of Pigs disaster, which required an extraordinary level of self-delusion on the part of President Kennedy. Irving Janis’s classic analysis of the Bay of Pigs and other foreign policy fiascos, Victims of Groupthink, explains that a strong team – a ‘kind of family’ – can quickly fall into the habit of reinforcing each other’s prejudices out of simple team spirit and a desire to bolster the group. Janis details the way in which John F. Kennedy fooled himself into thinking that he was gathering a range of views and critical comments. All the while his team of advisers were unconsciously giving each other a false sense of infallibility. Later, during the Cuban Missile Crisis, Kennedy was far more aggressive about demanding alternative options, exhaustively exploring risks, and breaking up his advisory groups to ensure that they didn’t become too comfortable. It was a lesson that David Petraeus – another historian – had grasped.

Once Petraeus had a robust, usable doctrine, properly tested by a range of contrasting views, he launched his own guerrilla campaign to get the US Army to pay attention to it. The media-savvy Petraeus had already scored a coup when he appeared on the cover of Newsweek under the caption ‘Can this man save Iraq?’ Newsweek reckoned that Petraeus was ‘the closest thing to an exit strategy the United States now has’. Rumsfeld had been incandescent: when he passed through Dublin Airport, an aide ran ahead of him rearranging the airport magazine racks so that he wouldn’t have to face a reminder of his own insurgent general.

The diversity of opinions that had helped produce the manual became Petraeus’s main weapon in disseminating the ideas. The heavy-hitting journalists who’d been invited along were impressed by the doctrine – and perhaps just a little flattered at their involvement – and were happy to write about it. The human rights expert Sarah Sewall wrote a foreword to the counterinsurgency manual FM 3-24. John Nagl appeared on chat shows such as Charlie Rose and even Jon Stewart’s Daily Show. The manual was reviewed in the New York Times Book Review and made front-page news in most of the quality newspapers. It was posted on the internet and downloaded more than 1.5 million times in the first month, having already been open to comments from the ‘six hundred thousand editors’ of the Army and Marine Corps. As the new book was circulated at the front line, it mattered less and less what Donald Rumsfeld thought about whether there was, or was not, an insurgency.

While all this was happening, Petraeus was also one of several high-ranking officers trying to change the strategy from the top down. Several generals, some active and some retired, bypassed the traditional chain of command in Washington to lobby for a new approach to the war. H.R. McMaster was in Washington, too – Petraeus had recommended that he be appointed to a panel of colonels reviewing the US strategy in Iraq.

In Vietnam, Lyndon Johnson’s insistence that all information flow through approved channels doomed America to disaster. In Iraq, the Army discovered that if the official hierarchy was on a disastrous course, it was vital to bypass it in order to adapt. Petraeus himself was using the media as a way of talking to everyone from the greenest private to the commander in chief. Others used their influence to whisper in the ear of the President himself. It wasn’t that the hierarchy was always useless, simply that it got in the way of change when change was needed. By the time President Bush and the new defense secretary, Robert Gates, decided to put General Petraeus in command in Iraq, an internal revolution at every level of the US Army had already profoundly changed its direction.

For an organisation that needs to quickly correct its own mistakes, the org. chart can be the worst possible road map.

8 Drawing the wrong lessons from history

There was a dramatic improvement both in US military strategy and in the situation for ordinary Iraqis between 2006 – the nadir of the occupation – and 2008 or 2009, and we’ve seen that a surprising amount of trial and error was involved. It wasn’t simply a matter of replacing one general with another, or even one defense secretary with another, but of learning from hard experience on the ground, and comparing the successful approaches pioneered by David Petraeus in Mosul, H.R. McMaster in Tal Afar, and Sean MacFarland in Ramadi, with awful failures elsewhere. The US Army stumbled its way to a successful strategy.

But was such a painful process of experimentation really necessary? Certainly, the learning process could have been quicker – if H.R. McMaster had been promoted, David Petraeus not banished to Fort Leavenworth, and Donald Rumsfeld more willing to listen to the warnings he was receiving. But could the US military have skipped the ‘mistakes were made’ part of the war entirely and figured out a better strategy from the start?

That was the view of John Nagl, the historian of counterinsurgency who fought in Baghdad and was on the team Petraeus assembled to write the counterinsurgency doctrine, when I suggested that the US military had solved its problem in Iraq through trial and error.

‘We weren’t just trying stuff at random,’ he objected, and pointed to the need to learn lessons from history, as any good historian would. H.R. McMaster and David Petraeus both had history doctorates, too. But while nobody would suggest that pure random experimentation is a good idea, history is also an imperfect guide. A few minutes later, Nagl all but admitted as much when he reflected on the actions of General Abizaid.

‘Abizaid drew the wrong lessons from Lebanon in 1983,’ Nagl explained. ‘Abizaid was convinced that Western forces were a foreign presence that inspired the creation of antibodies in Arab societies. And therefore his conclusion from that was that we need to hand over responsibility for Iraq as soon as we possibly can.’ The result of that lesson was the ‘draw down’ strategy which left the Iraqi army and police underprepared as US troops withdrew to the FOBs, their concrete cocoons in the desert. It was a serious mistake.

But this example simply highlights the fact that it is impossible to know in advance what the correct strategy will be. Remember that General Abizaid, who a few months after the start of the war had been given command of all US forces in the Middle East and Central Asia, was an expert on the region. He had lived in Jordan and performed a brilliant peacekeeping role in the aftermath of the first Gulf War. He was a sensitive and intelligent man who had correctly warned that de-Baathification would lead to disaster. If you were looking for one man with the experience and track record to set the right course in Iraq, you could hardly have done better than John Abizaid. If he, of all men, drew the wrong lesson from history, drawing the right lesson cannot be a simple process. That is what Philip Tetlock’s study of expert judgement revealed. And that is why trial and error will always be a part of how any organisation solves a complex, ever-shifting problem.

Another example of history’s uncertain guidance came from the first Gulf War in 1990–1. Desert Storm was an overwhelming defeat for Saddam Hussein’s army: one day it was one of the largest armies in the world; four days later, it wasn’t even the largest army in Iraq. Most American military strategists saw this as a vindication of their main strategic pillars: a technology-driven war with lots of air support and, above all, overwhelming force. In reality it was a sign that change was coming: the victory was so crushing that no foe would ever use open battlefield tactics against the US Army again. But was this really so obvious in advance?

Even had the basic US strategy been correct after the invasion, local adaptation would have been necessary. The nature of the problem kept changing as the insurgents changed their methods. Tactics that had worked yesterday were a liability today. Nagl, again, discovered this when putting his Oxford doctorate in the history of counterinsurgency to the test. Iraq was full of surprises. If he tried to respond to a tip-off about someone planting roadside bombs, it wasn’t so easy simply to go and arrest the suspect. Iraq has no addresses: no street names, signs or house numbers. The informant couldn’t be seen with soldiers, and if Nagl disguised himself and drove past in an unmarked car, he would lose his rights under the Geneva Convention. These local difficulties weren’t easy to anticipate in the Pentagon, even if the secretary of defense had tried to do so. Some degree of local adaptation was always going to be needed.

The lesson of the Iraq war was that the US Army should have had much better systems for adapting a failing strategy, and should have paid far more attention to successful local experiments. But perhaps there is a broader lesson, too. Donald Rumsfeld was by no means alone in believing he knew better than the soldiers on the ground. His mistakes have been made by many leaders before – in the military, politics and business.

9 ‘It was hard enough teaching computers to play chess’

As a seventeen-year-old boy, I was surely the perfect audience for ‘Stormin’ Norman’ Schwarzkopf’s no-nonsense briefings during the Gulf War. I remember distinctly the foggy grey aerial images of Iraqi buildings, perspective shifting as the camera moved with the stealth fighter that carried it. Cross-hairs fixed on a bridge or a bunker, giving the viewer a couple of seconds’ warning before the target was obliterated by a laser-guided bomb. As the camera struggled to adjust, there was a screen-wiping blaze of white, then black. I stood in front of the common-room television at school with my classmates, and we were unanimous: precision bombing was cool.

Almost twenty years later, I sat in the late spring sunshine in the courtyard of London’s Royal Academy, listening to Andrew Mackay, a British general who had served in Iraq, and been one of the UK’s most successful commanders in Afghanistan, explain what the images had been supposed to advertise. Allied forces would have superb, real-time information about potential targets – ideally, ‘information dominance’, in which the allies had also destroyed the enemy’s computers, telephone lines and radar. Not only that, but the information would be fed into supercomputers, capable of centralising and processing all the data, which could be distilled into usable form so that a three- or four-star general could perceive the entire theatre of war and adjust tactics and strategy on the fly. The computer could even calculate the likely impact of different strategies, including second-order and third-order knock-on effects. Using ‘effects-based operations’ or EBO, the general could choose a precise tactical strike, knowing that it would disrupt enemy logistics, manoeuvring, perhaps even morale, in a predictable way. It was the third pillar of the ideal organisation, the ‘big picture’, Robert McNamara’s analytical fantasy from Vietnam made reality: a vision of war in which information was so rich and ubiquitous that it could deliver an optimal strategy to a single, all-powerful decision maker.

General Mackay cuts a tall, commanding figure, but this is offset by white hair and eyebrows, soft features, and a magpie fascination with different ideas. He put down his coffee and pointed over my shoulder. ‘So, using effects-based operations, a computer might calculate that destroying that pot plant behind you would achieve precisely the desired strategic outcome. We would launch a missile from fifty miles away, accurate to within a couple of feet, and destroy the pot plant.’ Wow. Suddenly I was recalling those Stormin’ Norman briefings, but with an extra eighteen years of technological sophistication.

Mackay picked up his coffee. ‘The only trouble is, it was hard enough teaching computers to play chess. And chess only has sixty-four squares and thirty-two pieces.’

With a healthy dose of scepticism, Andrew Mackay was describing the planner’s dream: a huge leather swivel-chair, a wall full of screens, infinity in the palm of your hand. It is a vision so seductive that it refuses to die.

Earlier versions of the planner’s dream, of course, pre-date supercomputers. Originally the idea was that with a careful enough plan and a room full of bean-counters, a decentralised system could be centralised and rationalised. For example, Leonid Kantorovich, the only Soviet economist ever to win the Nobel Memorial Prize in economics, was asked to apply his mathematical skills to the problem of production scheduling in the Soviet steel industry in the 1960s. His efforts did lead to a more efficient production process, but gathering the data necessary for the calculations took six years – by which time, of course, the needs of the Soviet economy were different.
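
To make the planner’s dream concrete, here is a minimal sketch, in Python, of the kind of calculation Kantorovich pioneered: a linear programme for scheduling production, shrunk to a toy problem with two mills. Every mill name, capacity and coefficient below is hypothetical; the point is not the answer but the prerequisite, since before the optimiser can say anything the planner must first gather every number, which in the Soviet case took those six years.

```python
# A toy version of centralised production scheduling as a linear programme.
# All mills, capacities and yields are invented for illustration.
from scipy.optimize import linprog

# Decision variables: tonnes of steel produced at Mill A and Mill B.
# linprog minimises, so we minimise the negative of total output.
c = [-1.0, -1.0]

# Hypothetical resource constraints:
#   furnace-hours: 2*x_A + 1*x_B <= 1000
#   ore supply:    3*x_A + 4*x_B <= 2400
A_ub = [[2.0, 1.0],
        [3.0, 4.0]]
b_ub = [1000.0, 2400.0]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print('Optimal schedule (tonnes per mill):', result.x)   # roughly [320, 360]
print('Total output (tonnes):', -result.fun)             # roughly 680
```

A real economy has millions of such variables and constraints, and they change faster than the data can be collected, which is Hayek’s objection in miniature.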

At much the same time, Robert McNamara had the same faith in the ability of centralised quantitative analysis to solve a complex problem. His problem was not steel production but the bombing of Vietnam. US bombers dropped three times as much explosive on Vietnam as was used in the entire Second World War. The high explosives weighed more in total than the citizens of Vietnam did. Some districts suffered more than 1200 bomb strikes per square mile. And every bombing raid was meticulously recorded and analysed at Robert McNamara’s request. McNamara’s centralised analytical approach did not bring victory.

It is tempting to conclude that both Kantorovich and McNamara could have prospered if only they had had better computers. That seems to have been the belief of their respective successors, Salvador Allende and Donald Rumsfeld.

Allende was elected President of Chile in 1970 on a Marxist platform, and went on to sponsor one of the most surreal examples of the planner’s dream, Project CyberSyn. CyberSyn used a ‘supercomputer’ called the Burroughs 3500, and a network of telex machines, in an attempt to coordinate decision-making in an increasingly nationalised economy.

Allende recruited the cybernetics theorist Stafford Beer, a larger-than-life character with socialist sympathies and huge enthusiasm for the project, who nonetheless demanded $500 a day and a steady flow of wine, cigars and chocolate. Workers – or more usually, managers – would telex reports of production, shortages and other information at 5 o’clock each morning. Operators would feed the information into the Burroughs 3500, and by 5 p.m. a report could be presented to Allende for his executive input. As with the effects-based operations it predated, CyberSyn would allow for feedback and second-order effects. Some CyberSyn defenders argue that the system was designed to devolve decision-making to the appropriately local level, but that does not seem to be what Allende had in mind when he declared: ‘We are and always shall be in favour of a centralised economy, and companies will have to conform to the Government’s planning.’

The project was not a success. Chile’s economy collapsed, thanks to a combination of the chaos brought on by an ambitious programme of nationalisation, industrial unrest, and overt and covert economic hostility from the United States. Allende died during a coup led by General Pinochet, who then tortured and murdered many of his political opponents. Stafford Beer had the good fortune to be in London on the day of the coup. Shortly afterwards, tormented by survivor’s guilt, he left his family and moved to a cottage in rural Wales.

The Burroughs 3500 was an impressive machine by the standards of the day, but that is not saying much. My father worked for Burroughs in those days – he tells me tales of hard drives the size of washing machines, with eight platters on a spindle storing a total of a few megabytes, less than a simple cell phone has today. Testing a computer was a physical workout: hauling massive drive spindles and tape spools from one location to another. One of the attractions of the Burroughs 3500 was that memory could be expanded in discrete, reasonably priced chunks – 10,000 bytes at a time, just enough to store a few pages of this chapter. The Burroughs 3500 was never really regarded as a supercomputer, but it was an effective piece of corporate kit that, with the help of piecemeal upgrades, lasted for decades in the back rooms of banks. The 3500s ended their days as controllers for cheque-sorting machines.

CyberSyn is interesting not because it proves that computerised centralisation is a disaster – it does not, since Chile’s economy was under so much internal and external stress that it would surely have collapsed anyway – but because it shows the way in which our critical faculties switch off when faced with the latest technology. Western newspapers were giddily reporting that Chile’s economy was run by a computer that, by today’s standards, was a toy. But CyberSyn seemed sophisticated at the time, which was enough. Its iconic operations room looked tailor-made for Captain Kirk and Mr Spock, with chairs whose armrests contained screens and control panels. This control room came to represent CyberSyn to the project’s supporters and to its opponents. Yet the control room itself never became operational.

Donald Rumsfeld had better computers at his disposal than Salvador Allende, but the dream was much the same: information delivered in detail, real-time, to a command centre from which computer-aided decisions could be sent back to the front line. Rumsfeld pored over real-time data from the theatre of war and sent memos about minor operational questions to generals such as Abizaid and Casey. But even had Rumsfeld been less of a control freak, the technology was designed to empower a centralised decision maker, be it the secretary of defense or a four-star general. In the Iraq war, the control centre, an air-conditioned tent inside a metal shell in Qatar, provided minute-by-minute updates on the movement of troops and aircraft.

These systems are not useless. Allende’s CyberSyn worked well enough to allow him to coordinate a response when Chile was racked by strikes and industrial sabotage. The opening phases of both the Gulf War and the Iraq war were astonishing examples of the power of a coordinated, computer-aided attack plan. But such systems always deliver less than they promise, because they remain incapable of capturing the tacit knowledge that really matters.

CyberSyn was designed to bring problems to the attention of the President and his economic planners, but it succeeded only in reporting the issues that local factory managers wanted to report. Problems that they wanted to conceal, they had no difficulty in concealing. And when times were good it was hard to persuade them to telex any useful information at all, a state of affairs anticipated by Friedrich Hayek in an article published in 1945. What Hayek realised, and Allende and Beer did not seem to, was that a complex world is full of knowledge that is localised and fleeting. Crucially, the local information is often something that local agents would prefer to use for their own purposes. Hayek’s essay pre-dated modern computers, but his argument will retain its force until the day that computers can read our minds.

Rumsfeld’s computerised revolution in military affairs, like CyberSyn, often provided the illusion of information without really penetrating the fog of war. In February 2002 in Afghanistan, coalition commanders spent two weeks planning Operation Anaconda, focusing satellites and unmanned surveillance aircraft on a section of the Shah-i-Kot valley before attacking with helicopter-borne infantry. The helicopters dropped the soldiers almost directly on top of Taliban forces who had remained completely undetected. Apache helicopters were shot down by unknown assailants, precision bombers were unable to locate Taliban targets, and the entire operation was nearly a catastrophe for the coalition. Similar problems plagued coalition forces in the early stages of the war in Iraq. They often bumped into enemy forces of which they had received no warning from the ‘information-dominant’ command centre.

An early example of the limitations of ‘dominant battlespace knowledge’ came not in the narrow streets of Tal Afar or the wooded hills of Kosovo, but in the best possible theatre for computer-aided warfare, the open deserts of Iraq during the first Gulf War. A group of nine US tanks, Eagle Troop, was speeding across the desert in a sandstorm when it stumbled upon a much larger force of Iraqi armour.

‘We had been moving through what was a relatively flat and featureless desert, and what I didn’t realise is that my tank was moving up a very slight rise in the terrain,’ recalls Eagle Troop’s captain. ‘After we crested that rise and came down on the other side, the whole enemy position really came into view.’ Because of the storm, the Americans had no air support, and they suddenly discovered that they were heavily outnumbered by the tanks and armoured cars of Saddam Hussein’s elite Republican Guard, dug into defensive emplacements.

Both sides were caught by surprise. Eagle Troop’s captain had to make a snap decision: there was no time to discuss the situation with his superiors, or plug it into the ‘information-dominant’ computers. He realised at once that it would be more dangerous to try to retreat than to attack quickly and attempt to catch the Iraqis off balance. He yelled the order for his gunner to start firing anti-tank rounds – ‘Fire, fire sabot!’ – and an Iraqi tank was instantly destroyed. Reloading and firing every three seconds, his tank destroyed two more enemy tanks in the few seconds before the rest of Eagle Troop crested the ridge and opened fire. Nine American tanks destroyed almost ninety Iraqi vehicles without suffering any casualties themselves, thanks to their captain’s quick thinking, their training, and their superior weapons. No thanks at all were due to ‘information dominance’ or ‘effects-based operations’.

The spectacular engagement, swift and skilful, is now studied in war colleges as the Battle of 73 Easting. It earned Eagle Troop’s captain a write-up in a Tom Clancy book, and he is the subject of the opening pages – indeed the very first sentence – of the Army’s official history of the Gulf War. That history, Certain Victory, gushes that Eagle Troop ‘dramatically illustrates the transformation of the American Army from disillusionment and anguish in Vietnam to confidence and certain victory in Desert Storm’.

Maybe so. It also dramatically illustrates the limits, even with the very best technology, of what a general’s command centre can know about the shape of the battlefield. American planes dominated the theatre of war with their precision bombs, but in the middle of that sandstorm, Eagle Troop was all on its own.

The name of Eagle Troop’s captain was H.R. McMaster.

10 ‘Knowledge of the particular circumstances of time and place’

It is still tempting to think that the US Army would have had no problems if only men like H.R. McMaster, Sean MacFarland and David Petraeus had been in charge from the beginning. That conclusion misses the real lesson that McMaster was trying to teach the US Army. Long before Tal Afar, he had been arguing that the celebrated technology behind effects-based operations was simply not as effective as military doctrine of the day assumed. Not only was the picture always incomplete, as the Battle of 73 Easting and Operation Anaconda demonstrated, but sometimes it was completely irrelevant. If you are talking to a man at a checkpoint in Tal Afar, no amount of data from a satellite or a surveillance drone will tell you whether he is friendly or hostile. As General Andrew Mackay puts it: ‘Insurgents do not show up on radar screens.’

If you are fighting a counterinsurgency campaign, the important decisions will be made by men on the ground, and the challenge is to make sure that the decisions look more like those made at Tal Afar and less like those at Haditha. Even if David Petraeus had been the chairman of the Joint Chiefs of Staff, and H.R. McMaster the head of US operations in the Middle East, someone would have had to develop the strategy in Tal Afar, paying close attention to the local situation. Eagle Troop’s captain would still have had to make an instant decision, no matter who he was. Petraeus and McMaster could have created a more accommodating space for local adaptation, but they could not have made local adaptation unnecessary.

Any large organisation faces a basic dilemma between centralisation and decentralisation. Hayek, back in 1945, argued that the dilemma should be resolved by thinking about information. Decisions taken at the centre can be more coordinated, limit wasteful duplication, and may be able to lower average costs because they can spread fixed resources (anything from a marketing department to an aircraft carrier) across a bigger base. But decisions taken at the fringes of an organisation are quick and the local information will probably be much better, even if the big picture is not clear. Hayek believed that most people overestimated the value of centralised knowledge, and tended to overlook ‘knowledge of the particular circumstances of time and place’. For H.R. McMaster, knowledge of the particular circumstances of time and place was precisely what was necessary to win many wars, and above all to conduct a successful counterinsurgency campaign.

Hayek’s argument was for decades largely ignored in mainstream economics, even after he won the Nobel Memorial Prize in 1974. But more recently, economists have been gathering the detailed data necessary to evaluate how successful organisations actually organise themselves. Julie Wulf and the former chief economist of the International Monetary Fund, Raghuram Rajan, examined large US firms from the mid-1980s through the 1990s. They found that these companies were flattening their bureaucracies, with junior executives facing fewer levels of hierarchy than fifteen years previously, and many more managers reporting directly to the top. Rajan and Wulf also gathered evidence on salaries and performance pay that suggests the changes reflect a real delegation of decision-making power.

One reason for these changes is that businesses are operating in a different environment. Thanks to globalisation, firms have ventured into new and varied markets, where they face intense competition. The traditional purpose of centralisation is to make sure every business unit is coordinated and nobody is duplicating anyone else’s effort. That might work for businesses such as Tesco or Wal-Mart, which have such control over their supply chains and shop floors that experiments with new products or marketing ideas can be delegated to a computer. But a centralised organisation doesn’t work so well when confronted with a diverse, fast-moving range of markets. The advantage of decentralisation, rapid adaptation to local circumstances, has grown.

Meanwhile, information technology has improved at a famously staggering pace. Kantorovich, Allende, McNamara and Rumsfeld all seemed to operate on the assumption that better computers and better communication links would help the process of centralisation, gathering everything into one place where a planner could make the key decisions. The exact opposite is true: the evidence suggests that more technologically advanced firms are also more decentralised. Typically, new equipment (anything from software to a large machine tool) is superior not because it does the same things faster, but because it is more flexible. To get the most out of that flexibility requires well-trained, adaptable workers with authority to make their own decisions, which is precisely the kind of workforce successful firms seek out or train when they upgrade their machinery or their software. In the organisation of the future, the decisions that matter won’t be taken in some high-tech war-room, but on the front line.

This is a lesson the Army is beginning to learn. When John Nagl served in Baghdad in 2003, he found that while his young, inexperienced soldiers had the authority to kill, he – a major with a doctorate and a decade of experience – didn’t have the authority to print his own propaganda pamphlets to counteract the clever PR campaign that the local insurgents were running. The commander of US forces in Baghdad in 2004 found that he couldn’t tap into the massive USAID budget to provide electricity, clean water, jobs and other assistance to the locals. The budget had been assigned in Washington DC to the Bechtel Corporation, which had been commissioned to carry out a few very large, long-term projects instead. The commander could see immediate needs but had no authority to act.

Over time, the Army learned to decentralise these essential decisions to the same extent that they had decentralised the authority to shoot people. In al Anbar, Sean MacFarland’s men broadcast news from loudspeakers six evenings a week, mixing information from locally trusted sources such as the Al Jazeera network, sports news, helpful advice – for instance, about food aid arriving at the UN warehouse – and just a sprinkling of propaganda attacking Al Qaeda in Iraq.

The Bechtel problem was partly eased when decentralised aid in the form of the Commander’s Emergency Response Program was introduced. CERP provided cash to local officers, who had the authority to spend it on whatever local reconstruction seemed needed. A careful statistical analysis later found that CERP spending was effective at reducing violence. Spending $200,000 in a district containing 100,000 citizens could be expected to prevent about three violent acts – and since a ‘violent act’ was defined as an incident that an exhausted and battle-hardened field commander felt was worth taking twenty minutes to log in the official records, the bar for inclusion was high.
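
Taken at face value, those figures imply a simple piece of arithmetic; the sketch below, in Python, merely restates the numbers quoted above, with the per-citizen and per-incident costs derived from them and nothing else.

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
spending = 200_000        # dollars of CERP money spent in a district
population = 100_000      # citizens in that district
acts_prevented = 3        # violent acts prevented, per the statistical analysis

print(f'Spending per citizen: ${spending / population:.0f}')                  # $2
print(f'Cost per violent act prevented: ${spending / acts_prevented:,.0f}')   # $66,667
```

On those terms, roughly two dollars per resident was associated with a measurable fall in serious violence.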

But perhaps the most significant sign that the Army was learning to give authority to more junior officers came from the career of H.R. McMaster himself. It was a career that, by 2007, looked to be over. Returning from Tal Afar, he was passed over for promotion in 2006. In 2007 he was passed over for promotion again. After his successes in the field and outspoken comments to journalists, H.R. McMaster was the most famous colonel in the US Army. When he was snubbed, people noticed.

‘Every officer I spoke with knew about it and had pondered its implications,’ wrote the journalist Fred Kaplan in the New York Times. One officer told Kaplan that promotion ‘communicates what qualities are valued and not valued’; another that ‘When you turn down a guy like McMaster, that sends a potent message to everybody down the chain.’ In this case the message was clear: if you want to get promoted, respecting your superiors is more important than setting the example that saves the US Army from defeat.

In 2008, it was rumoured that McMaster was about to be passed over yet again, quite possibly pushing him into premature retirement. David Petraeus took the unprecedented step of flying back to the Pentagon, at the height of the surge, to chair the Army’s promotion board. Among those he promoted to the rank of one-star general were Sean MacFarland and H.R. McMaster. Petraeus overruled the complaints of the men who had commanded McMaster in Iraq. Once again, the one-time micromanager Petraeus had demonstrated that what really counted was identifying the more junior officers who were capable of thinking for themselves.

11 Mission command and ‘the enduring uncertainty of war’

H.R. McMaster’s study of the Vietnam war revealed disastrous flaws in the way decisions were taken at the highest levels of the military and political establishment. Lyndon Johnson and Robert McNamara enforced a strictly defined hierarchy, insisted on unanimity, and put too much faith in the idea that information was best centralised and analysed using the latest quantitative techniques.

In Iraq, the US military achieved more success than most observers thought was possible, given how bad the situation had become by 2006. They had good leaders in Robert Gates and General David Petraeus, and a good strategy, but the real story of success was the way more junior officers, including H.R. McMaster himself, improvised new ways to win the war on the front line. The key to learning from mistakes was not to stick blindly to the official chain of command but to subvert it where necessary, not to seek unanimity but to listen to dissenters, and above all, not to rely on a top-down strategy but to decentralise and trust that junior officers would adapt, learning from each other and figuring out the best response to fast-changing local conditions.

In 2001, Army doctrine declared that ‘unmanned systems with artificial intelligence will augment human action and decision making … improved command and control systems will enable leaders to know more than ever before about the nature of activities in their battlespace’. That didn’t impress H.R. McMaster, a man whose formative combat experience involved stumbling upon a large enemy force in a sandstorm, and whose lasting achievement was to supervise a highly political, house-by-house and family-by-family campaign of counterinsurgency in Tal Afar.

‘We tended to believe, you know, that situational understanding could be delivered on a computer screen,’ says McMaster, who, in an echo of Petraeus’s career, spent his first assignment as a general redeveloping Army doctrine as the Army’s head of ‘experimentation’. His new approach emphasises cultural understanding, local knowledge, urban environments, and the ‘enduring uncertainty of war’. McMaster is an evangelist for the old Army concept of mission command: senior officers set the aims, but junior officers decide how those aims are to be achieved, adapting flexibly to local information. Under mission command, air support and heavy artillery aren’t allocated by a three-star general sitting in a push-button swivel-chair, but are called in by a colonel or a major who actually understands the local situation and can be trusted to make the right decisions. It is an idea whose time has come again – and not just for the Army.

The painful process by which the US military learned from its mistakes in Iraq offers lessons for any organisation with a failing strategy in a fast-moving world. Experimentation mattered. But there is a limit to how much experimentation – how much variation, to use the Darwinian term – is possible for a single organisation, or desirable on the battlefield.

Sometimes, far more experimentation and far more variation are required – more than any one organisation, no matter how flexible, can provide. In such cases a far more radical approach to promoting new ideas is called for. It is to this problem of creating wild variation that we now turn.