CHAPTER 26
The post–Gulf War military
The Army, like the country as a whole, emerged from the 1991 Gulf War understandably relieved, and also very pleased with itself—probably too much so. The American military of that time may or may not have been the best the nation ever fielded, but it certainly was among the most self-satisfied. In a Fort Leavenworth course on the combat operations of corps and divisions, the Army gave 99.5 percent of the officers an “above average” score. “Basking in the glow of victory in Gulf War I, we became complacent . . . ‘the best trained, best equipped, and best Army in the world!’” recalled Maj. Chad Foster. “We spoke of ourselves only in the superlative.”
Nearly twenty years later, after the Army’s missteps and failures in occupying Iraq, retired Army Gen. Jack Keane, a former vice chief of staff, concluded that the problems had begun in the Gulf War: “The thing that killed us was the 1991 Gulf War. Intellectually, it bankrupted us for the rest of the decade.” Partly because it overestimated that victory, the Army failed to build on its achievements of the 1980s, when it had been rebuilt and reequipped. Retired Brig. Gen. Huba Wass de Czege, who had helped build this Army, worried that in the aftermath of the victory in Kuwait, “shallow ‘bumper sticker’ concepts captured the imagination of DoD officials and the public—‘Shock and Awe,’ ‘Global Reach—Global Power,’ ‘Operational Maneuver from the Sea,’ ‘Rapid Decisive Operations.’” Just as the Army had abandoned “organizational effectiveness” programs in the 1980s as soon as it was on the road to recovery, so, too, in the 1990s were intellectually oriented programs like the School of Advanced Military Studies given less priority, as the Army placed a new emphasis on “digitization” and other Information Age technologies. Applications to the school began to decline, and the quality and influence of its graduates likely did as well.
To be fair to Army leaders of the time, it was an odd, unsettled period. The Cold War was over. It wasn’t clear what would come next, but there was a feeling in political Washington in the early 1990s that the military was facing a period of dormancy. Bill Clinton ran for president in 1992 on a platform that emphasized, among other things, “defense conversion,” or redirecting Cold War defense assets toward peaceful, domestic purposes. Trimming the defense budget produced what was called at the time “the peace dividend.” Reflecting this assumption, the size of the Army was cut by nearly 40 percent, from 749,000 soldiers in 1989 to 462,000 a decade later. Few observers realized that superpower competition had kept a lid on many conflicts and that without the presence of the Soviet threat, it would be far easier (and less strategically risky) for the United States to use force abroad—as it would do in Somalia, Haiti, Bosnia, Kosovo, and, most of all, the Middle East.
There has never been a truly successful drawdown of American military forces after a war. Marshall, Eisenhower, and Bradley were inspired by the troubles of the post–World War I reduction to try to do better after World War II, but they failed and did worse, creating an undertrained, underequipped, out-of-shape force that would be sent to Korea in the summer of 1950. The American military was flat on its back after Vietnam. If there ever was a reduction that came close to success, it likely was the post–Cold War drawdown of the 1990s. That most recent reduction is hardly remembered today, and this is one measure of its achievement. Even as the Army shrank, it managed to conduct some innovative experiments, such as creating a new, faster, lighter type of unit built around wheeled (rather than tracked) armored vehicles called Strykers. “Not all went right, but a lot did,” commented Lt. Gen. James Dubik, now retired but at the time involved in designing and training the first Stryker battalions.
Yet even that reduction in the force had its flaws, most notably that it reinforced existing trends toward intellectual conformity and complacency in the Army. “It only took one boss to say something not nice [in a performance review] and that was it,” recalled Col. John Ferrari, whose career survived that era. He continued: “The nail that sticks up gets whacked. The Army whacked everyone who wasn’t on track for battalion command. To be that, you had to be the S-3, the XO, and then battalion command. And one day we woke up and looked for a Spanish-speaking officer to be a defense attaché, and they were all gone.”
The Army was so pressed to keep its combat units filled that it turned to the private sector for some of its intellectual functions, Ferrari explained: “We outsourced our thinking. We had MPRI [a consulting company led by retired Army generals] to write our doctrine, we had retired colonels as instructors, and we didn’t have battlefield feedback shaping doctrine. . . . It cost us in the decade of war.”
The Army’s nagging leadership problems persisted. When the Army Command and General Staff College surveyed officers in 1995, it found the same concerns that had been reported in the Army War College’s 1970 Study on Military Professionalism. “The overcontrolling leader and the micromanager remain alive and well in the Army today,” retired Army Col. Lloyd Matthews wrote in 1996, in a statement that was greeted as uncontroversial. A year later, retired Maj. Gen. John Faith wrote an article bewailing military micromanagement that was essentially no different from the articles in Military Review four decades earlier. He also noted that there seemed to have been no attempt by anyone to refute Matthews’s charge.
A study done at West Point as the century ended reached several startling conclusions about the state of the Army. A paper written at about the same time by Col. Michael Cody at the Army War College was similarly blunt, accusing the Army of institutionalized hypocrisy—preaching a doctrine of innovation while actually rewarding risk-averse behavior:
Departing from the tried and proven solution to problems or recurring situations is in fact discouraged in a number of different ways by senior leaders, for lots of different reasons, despite the brave rhetoric to the contrary suggested on the appraisal forms. The message received by the junior officer is: don’t take risks, don’t depart from the norm, and don’t dare be less than successful in using a new approach.
The Army had developed a set of code words to ostracize those who departed, Cody noted: “irresponsible, maverick, immature, reckless.”
The plague of micromanagement appeared to be worse than ever in the Army, despite periodic attempts to tamp it down. In 2000, Lt. Col. Lee Staab surveyed fifty people who had left the Army as junior officers and found that every one of them “felt that there was a high degree of micro-management within their final assignments on active duty.” Maj. Anneliese Steele concluded in a monograph done at the School of Advanced Military Studies that “relationships between junior and senior leaders tend to be dysfunctional.” Even as the Army and Marines were invading Iraq in the spring of 2003, Military Review carried yet another article worrying that Army leadership was perceived, in the words of Col. Peter Varljen, as “self-serving, short-sighted, out-of-touch, unethical, and averse to risk.” The Army was led by managers and “performers” rather than by leaders. These men focused more on “short-term mission accomplishment” than on “developing effective organizations,” concluded a study by Col. Steven Jones at the Army War College the same year. Senior officers were skilled at following rules but not at inspiring subordinates or rewriting the rules when necessary.
Despite these persistent problems with leadership, one of the obvious remedies—relief of poor commanders—remained exceedingly rare. Two prominent generals, Walter Ulmer and Montgomery Meigs, published candid, wide-ranging essays about generalship in Parameters, the journal of the Army War College—the former in 1998, the latter in 2001. Ulmer argued, among other things, that the Army could do better at selecting its senior leaders. Yet neither suggested that relief could be a viable solution to the problem. Dismissal, a basic tool of Army officer management in the 1940s, was beyond the realm of conception sixty years later. The vocabulary of relief had been lost.
The post–Cold War era, with its unpredictability in battles and even in foes, would demand a new flexibility in military leadership. Yet it appeared as if adaptability and risk taking largely had been bred out of American generals.
Meanwhile, as a result of the DePuy-era reforms, the Army continued to improve tactically. The all-volunteer force had come into its own. Everyone who was in the military had asked to be there, and most had learned a lot. It was a well-trained, professional, competent force. But the soldiers often were better at their tasks than the generals leading them were at theirs. In Iraq, the U.S. Army would illustrate the danger of viewing war too narrowly. “When war is reduced to fighting,” strategic expert Colin Gray once warned,
the logistic, economic, political and diplomatic, and socio-cultural contexts are likely to be neglected. Any of those dimensions, singly or in malign combination, can carry the virus of eventual defeat, virtually no matter how an army performs on the battlefield. . . . When a belligerent approaches war almost exclusively as warfare, it is all but asking to be out-generalled by an enemy who fights smarter.
That troubling thought set the stage for the American invasion of Iraq in 2003 and its aftermath, when the American military would be “out-generalled,” to use Gray’s term, by an insurgency that appeared to have few if any generals—but had a better conception of how to wage war in Iraq.
Coda: Powell stays on too long
Colin Powell’s last memorable act in government was to clear the way politically for that invasion of Iraq, with a speech at the United Nations early in 2003. Just eight years after writing his Horatio Alger–like memoir, the decent, go-along-to-get-along general would go along one more time, this time with catastrophic results for his reputation.
For several years, Powell had tried to turn a blind eye to the growing split between his beliefs and those of the Republican Party. After retiring as chairman of the Joint Chiefs of Staff in September 1993, he had veered onto the thin ice of an old general involved in politics. Like Douglas MacArthur, he delivered the keynote address at a Republican National Convention. Just as the Chicago convention of 1952 had soured on MacArthur in response to his talk, the San Diego convention of 1996 was unhappy with Powell’s speech, in which he emphatically supported affirmative action and abortion rights. (In another odd parallel, the 1952 Republican convention had been the first to nominate a World War II vet, Eisenhower, while the 1996 convention would be the last, selecting another Kansan, Sen. Bob Dole.) Had Powell left for private life at that point, his reputation would have remained unblemished. But late in his career, his exquisite sense of timing deserted him. He stayed in power too long, becoming secretary of state in 2001 for George W. Bush and Dick Cheney, men whose views were strongly at odds with his. They were willing to let him be secretary of state but not to make policy, and Powell was slow to grasp the nature and extent of his isolation.
Again there was a failure of discourse at the top. Like MacArthur in Korea, Powell was an old general at odds with his president. He thought he could bring his chief around, and he was unwilling to face the fact that he could not—and that he needed simply to get out of the way. Unlike MacArthur’s, Powell’s disagreements with his commander in chief were grounded in loyalty. He remained a good, obedient soldier—and that would be his undoing, in part because he now was supposed to be acting as a civilian official. In his tragic final act, he dutifully carried water for the Bush Administration, traveling to the United Nations in February 2003 to deliver the speech that laid the diplomatic groundwork for the American invasion of Iraq. It was a bravura performance, in which Powell drew on all the credibility he had built over three decades of public service. “My colleagues, every statement I make today is backed up by sources, solid sources,” he said early on, with CIA director George Tenet sitting behind him, literally backing him up. “These are not assertions. What we are giving you are facts and conclusions based on solid intelligence.” He then appeared to divulge intercepted communications—an extraordinary act, given that such intercepts are usually the most highly classified sort of intelligence. He discussed with great confidence Iraq’s biological weapons factories, its stockpile of chemical weapons, its intentions to acquire nuclear weapons.
We now know that almost every single assertion he made in that speech was questionable and that, in fact, much of it was doubted at the time by experts in the intelligence community. Most embarrassingly, his claims about Iraq’s biological weapons turned out to have been based mainly on the statements of one Iraqi defector, code-named Curveball, whose allegations had been discredited even before Powell presented them to the world. In May 2004, long after the damage had been done by his speech and the United States was mired in Iraq, the CIA would officially recant everything Curveball had asserted, spurred perhaps by having learned that Curveball had not even been in Iraq during some of the time when he claimed to have witnessed important developments. Powell’s UN speech also relied on a second informant on biological weapons, but that source had been formally declared a fabricator months earlier by the Defense Intelligence Agency, and no one had informed Powell about that fabrication notice. The UN speech, unfortunately, would become the epitaph to Powell’s long career. “Who went to the United Nations and, regrettably, with a lot of false information?” he said in 2011. “It was me.” In another interview, he lamented, “I will forever be known as the one who made the case.”
In 2003, Powell did the job the Bush Administration needed him to do, putting war skeptics on the defensive. Powell “presented not opinions, not conjecture, but facts,” Defense Secretary Donald Rumsfeld stated to a gathering of European allies not long afterward. He continued, “It is difficult to believe that there still could be question in the minds of reasonable people open to the facts before them.”
Yet behind closed doors, Powell still had doubts. Shortly before the invasion of Iraq, he called Gen. Tommy R. Franks to ask him whether he was sure he had enough troops in his war plan. Powell thought there were sufficient troops to get to Baghdad but was concerned about what would happen after that. “But Tommy was confident,” Powell said ruefully. “They had enough to get to Baghdad, and it fell beautifully, but then the fun started.”