CHAPTER 3
The “Good War” and the GIs Who Fought It
IN 1984 WHEN STUDS TERKEL published his rich oral history of World War II, The Good War, he prefaced it with a note: “The Title of this book was suggested by Herbert Mitgang, who experienced World War Two as an army correspondent. It is a phrase that has been frequently voiced by men of his and my generation, to distinguish that war from other wars, declared and undeclared. Quotation marks have been added, not as a matter of caprice or editorial comment, but simply because the adjective ‘good’ mated to the noun ‘war’ is so incongruous.”1 Terkel displayed an uncommon caution and sensitivity. Few Americans have had any difficulty declaring World War II “good”—embracing the adjective unencumbered and unqualified by quotation marks.
This label obviously reflects more of a comparative than an absolute description. The Second World War has proved to be a better war than others in the American memory, in terms of common recognition of its cause, general public understanding and acceptance of its objectives, absence of ambivalence about enemies, and an unambiguous perception of courage, sacrifice, and accomplishments of the US armed forces—and of the committed sacrifices of those on the home front. But, of course, history is generally more complicated than the memory we carry.
World War II affirmed in many basic ways the revolutionary ideal of the “citizen soldier.” The nineteenth-century narrative that celebrated citizens taking up arms to defend the Republic seemed validated in this war. Civilians in huge numbers, perhaps as high as 12 percent of the population, put on uniforms and took up arms for the duration. Many fought bravely and well. Yet the experience nonetheless was far removed from the casual militia service of history’s memory.
In the Second World War, well-trained and well-equipped citizens served, sometimes for years, in uniform far from home. If they represented a more significant proportion of the population than in any previous war, they were still a minority. If this war was about defending the country and its values, it was also about completing a task that was defined by its participants more in practical than patriotic terms. And the celebrated commitment to American values in this “good” war was undercut by remarkably persistent racism, reflected ironically in the military force engaged in a global defense of “freedom.”
The Second World War also culminated a century and a half of discussion about the obligations of American democracy to those who served to defend that democracy. There were no longer any political advocates of the view that since serving to defend the Republic was an obligation of citizens, there should be no expectation of special gratuity or treatment. It was hard to sustain the rhetoric of everyman’s obligation when, even in this massive mobilization, most Americans did not serve. The postwar support for the World War II veterans and their families was unprecedented in American history and established the template for all of the wars and benefits that would follow.
The war itself became a template for how and why Americans would fight. As such it would influence actions and certainly perceptions of the wars that would follow. This war would prove a heavy political, cultural, and military burden for the next generations. As a result, it deserves some careful consideration. Wars are complicated things. They demand tremendous sacrifice. The memory of war should always include a recognition of the human cost. These costs are borne not only by those who perish in battle, but also by those who survive it.
 
 
The Japanese attacked US military installations in Hawaii on December 7, 1941. This action proved the final incontrovertible reason for American entry into the war. The New York Times captured the common feeling: “The United States has been attacked. The United States is in danger. Let every patriot take his stand on the bastions of democracy. We go into battle in defense of our own land, of our present and our future, of all that we are and all that we still hope to be, of a way of life which we have made for ourselves on free and independent soil, the only way of life which we believe to be worth living.”
President Franklin Roosevelt described December 7, 1941, and the actions that took place on that day, as “a date which will live in infamy.” He requested, and Congress provided, a declaration of war against Japan, and when Germany and Italy joined Japan’s war against the United States, Congress declared war against those other Axis powers. By the second week of December 1941, the war that had engaged much of the rest of the world since the 1930s had a new and major participant.
US government agencies advertised the war as a fight against evil in defense of the country—and a commitment to make the world a better place. Aggressors started it and we would finish it.2 There is little doubt that this view of the American role informed the public understanding of the war, but it did not define the war’s language and symbols. There was little of the “save the Union” drumbeat of the North in the Civil War or the idealistic “make the world safe for democracy” zeal of World War I. Despite the heroic rhetoric and self-image—and contrary to some of the memories of noncombatants that would follow the war—this would prove a complex and difficult war, one that would inflict some very heavy costs. More than 16 million Americans would serve in the military during the war, including more than 300,000 women. More than 400,000 of these servicemen and women would die. The atomic bombing of Hiroshima and Nagasaki in August 1945 ended a war and announced a new era. World War II would forever alter the place of America in the world.
America’s mood during the war was more practical than heroic. But there was a sense of drama and an enduring memory of heroism that was based on a remarkable set of experiences marked by true courage and sacrifice—at places like El Guettar, Salerno, Monte Cassino, Normandy, Bastogne, Guadalcanal, Iwo Jima, Okinawa, the Coral Sea, Leyte Gulf, and the Philippine Sea and in B-24s over Germany, in submarine warfare, on aircraft carriers, and in tanks. All of these mark historic battles, fought amid a series of significant technological innovations. Each battle, each innovation, incurred heavy casualties.
The American public had been slowly, reluctantly, coming to understand the likelihood of war prior to Pearl Harbor. In the 1930s Congress and the administration tried to build barriers to prevent another war. A series of neutrality acts had imposed heavy restrictions on the sale of US goods to belligerents. But following the fall of France to the Germans in June 1940, Americans increasingly understood that involvement was perhaps inevitable. Public opinion surveys indicated that the dominant American view was supportive of the Allies—but support did not readily translate into approval of American military engagement.
Yet despite a growing recognition of the inevitability, if not the necessity, of American involvement in the war, politicians trod softly. Europe had been at war for two years and East Asia for even longer when the Japanese attacked Pearl Harbor, but even so, the United States was not ready. In 1940 and 1941 troops trained while wearing World War I uniforms, carrying wooden guns, and riding in trucks that had signs on them proclaiming them to be “tanks.” The distinguished military historian Russell Weigley wrote of the US Army, “The historic preoccupation of the Army’s thought in peacetime has been the manpower question: how, in an unmilitary nation, to muster adequate numbers of capable soldiers quickly should war occur.”3
Unlike in any previous American war, the government began to muster troops for World War II prior to American entry into the conflict. In September 1940 Congress approved the Roosevelt administration’s request for a draft law—the first peacetime draft in American history. When he signed the legislation, Roosevelt invoked the historical view of the citizen soldier, claiming that the action “has broadened and enriched our basic concepts of citizenship. Besides the clear and equal opportunities, we have set forth the underlying other duties, obligations and responsibilities of equal service.”4 It was an unprecedented move, audacious even, in an election year in which Roosevelt was standing for a third term. But the American public had warmed to the idea, given the challenges of the war already being fought. In 1940, 89 percent of the public thought the draft was a good idea—up from 35 percent a year earlier.
The 1940 draft law provided for up to nine hundred thousand men to be drafted. Every male in the country, including foreign-born residents, between the ages of twenty-one and thirty-six had to register. All were subject to being called up for one year of service and for ten years of reserve duty. Congress restricted their service by prohibiting these draftees from serving outside the Western Hemisphere except in US possessions. There were few categorical deferments, and the legislation provided for local draft boards to determine individual exemptions.
The first draft lottery was held on October 29, and Secretary of War Henry Stimson drew the first capsule. In a moment heavy with historical symbolism, he was blindfolded with a cloth taken from a chair used at the signing of the Declaration of Independence. The capsules had been stirred with a piece of wood from Independence Hall.5
Those responsible for overseeing the peacetime draft struggled to develop administrative rules and procedures. In the final accounting, they were not always able to meet the War Department’s manpower goals. Local draft boards were not consistent from jurisdiction to jurisdiction in allowing deferments. Married men with dependents were generally eligible for an exemption—and for many boards this meant that if a man’s wife worked, she was not dependent and he was not deferred. In the early months there was a significant increase in marriages among men who were age-eligible for the draft.
College officials pushed hard for educational deferments, but Congress and the Roosevelt administration resisted this. As Lieutenant Colonel Lewis Hershey, the deputy director of the Selective Service, asked, “Is the college student, per se, of more importance than the automobile mechanic or farm laborer who is now working and producing?”6
Following the declaration of war, Congress, with little debate, lowered the induction age to twenty, ended the restrictions on draftees serving overseas, and provided that all who were inducted were liable to serve for the duration of the war plus six months. In the fall of 1942 Congress authorized the drafting of eighteen-year-olds. This was partly a response to the desire to reduce the number of married men with children being called up—and partly a response to the army’s experience that the youngest men were more physically fit and more willing to take combat assignments.
The country needed to mobilize more than soldiers for this war. Prior to the declaration of war, the tremendous industrial capacity of the United States was underutilized as a result of plant closings and layoffs from the Great Depression. Wartime production picked up more rapidly than many expected it could. In 1940 the government had spent $1.8 billion on the military. In 1942 the United States spent $22.9 billion. By war’s end the United States was not only equipping and arming its own substantial military force but also providing significant support for its allies.7
From the fall of 1940 through the summer of 1942, the army built forty-two new bases or camps. By the summer of 1942 it was inducting 14,000 men per day and straining to provide facilities and training programs. From October 1940 to March 1947, when the World War II draft expired, the Selective Service registered 49 million men, selected 19 million for conscription, and saw 10 million inducted. George Gallup, whose polls indicated that public approval for the draft never dropped below 75 percent, concluded, “Few programs in the nation’s history have ever received such widespread favorable reaction from the people as the handling of the Selective Service draft.”8
With as many as 184,000 local draft board members, each board with authority over individual cases, there was tremendous variation in the implementation of the draft. Farmers, for example, had an advantage in securing occupational deferments—in 1944 some 17 percent of age-eligible farmworkers received deferments, whereas only 9 percent of other eligible workers did. Some farm-state senators had pushed for categorical deferments for farmers, but the administration defeated this initiative, largely by allowing local boards even more discretion in granting deferments. Deferments for workers in industry proved more complicated. And there is clear evidence that some local draft boards used the withholding of deferments as a tool to control labor activity or even to punish absenteeism.
Despite pressure from colleges for educational deferments, the administration resisted. It did, however, work with the American Council on Education to develop campus-based training programs. The result was the Army Specialized Training Program (ASTP), which ultimately enrolled more than 150,000 trainees on campuses. Roosevelt supported the ASTP because college officials had persuaded him that depleting enrollments through the draft would force schools to close. But Lieutenant General Lesley McNair, who needed 300,000 more men in the army and was facing a declining quality of inductees, complained in frustration, “We are asked to send men to college!” Others agreed—it was, George Flynn concluded, a weak program that “did provide a subsidy to American education during the war.”9
The V-12 Navy College Training Program similarly worked to meet both military needs and those of the higher-education community. Because the navy program aimed at producing officers instead of technical trainees, it proved less controversial. By the war’s end, V-12 had enrolled some 125,000 navy and Marine Corps officer candidates at 131 colleges and universities.
Late in 1942 a presidential order banned the military from recruiting volunteers from among men who were already eligible for the draft. This forced the navy and the Marine Corps to turn to the draft as well. The army had long argued that the sea services’ recruiting methods were taking away some of the best potential soldiers. In the fall of 1942, sailors were on average three years younger than the men serving in the army. Following the presidential order, the services could recruit only seventeen-year-olds, who were eligible to serve but not yet draft eligible. The marines and the navy quickly began recruiting from this population, and as a result they maintained a younger average age than the army. In 1944 the average soldier was twenty-six years of age; the average sailor was twenty-three and the average marine twenty-two.10
The army air forces also tended to attract young volunteers. And up until 1943 this special branch of the army had “skimming” privileges that permitted it to take the army draftees with the highest scores on qualification tests. The army air forces ended up playing a major role in the war, particularly in Europe, and by 1944 some 2.4 million men and women served in the AAF.
The B-17 Flying Fortress was the mainstay of the bombing campaigns in Europe, though it proved far more vulnerable than air planners expected. In 1943 the Eighth Air Force lost about 5 percent of its planes on every mission. Two-thirds of all airmen did not complete their tour of twenty-five missions. The B-24 provided somewhat greater flying range, but this resulted in even more dangerous missions.
As perilous as this service was, it was effective. The army air forces and the British Royal Air Force controlled the skies over Europe by the late spring of 1944, which was a major factor in the success of the Normandy landing and the campaign that followed. More than 88,000 airmen died in the war, but by the end of combat operations this branch of the army had proved its strategic value, and a separate service, the air force, was established in 1947.
The navy evolved quickly during the war from a battleship strategy to one that increasingly emphasized the role of airpower. The development of carriers provided a significant tactical strength that the navy utilized particularly in the Pacific, despite the opposition of “old navy” officers, including Pacific Fleet commander Admiral Chester Nimitz. Carrier aircraft inflicted a significant defeat on the Japanese in the Battle of Midway in June 1942. The United States lost 180 carrier aircraft in this battle, and the Japanese lost 332. By the end of the war, the carrier dominated naval operations and engagements.11 In 1945, more than 3.4 million men and women were serving in the navy. The navy suffered 36,950 combat deaths, 32,925 of them among enlisted men.
The Marine Corps had focused in the 1920s and 1930s on amphibious warfare, building upon its historical and institutional linkage with the navy. By 1940 it had developed the Fleet Marine Force as its mobile light infantry assault force. Marine leadership also had developed a tactical air division. When World War II began, the army and the senior military leadership in Washington did not place a high value on the Marine Corps assault troops and assigned them to the Pacific rather than to Europe. The tension between the marines and the army that had begun during World War I had not abated in the interwar years.
There were some 50,000 marines in December 1941, following a buildup in strength over the previous year. Marines recruited aggressively and effectively—weekly enlistments jumped from a prewar high of 552 to 6,000. The marines expanded by 45,000 in the three months after Pearl Harbor, and the quality of recruits improved as well. By war’s end there were 475,000 marines.
World War II confirmed and validated the role of the Marine Corps. Intense battles and remarkable courage at Guadalcanal, Peleliu, Tarawa, Iwo Jima, Okinawa, and several other Pacific islands became part of the American war narrative. The marines were adaptable, disciplined, and tough light infantry fighters—and they benefited from a slower buildup to European action and a skilled public relations group. As one marine historian noted, “If the central Pacific campaign was the supreme test of amphibious doctrine, it was also a media event of unparalleled drama for American war reporting. And it was the Marine version of the war that largely dominated the press.”12
 
 
American military planning for the war focused heavily on airpower, both carrier and land based, along with tanks and armor and punishing artillery. Senior officers were reluctant to rely on massed infantry tactics if machines could reduce the casualties. Americans assumed that technology could reduce the terrible losses that marked the Civil War and World War I—as well as the Soviet-German conflict on the eastern front of World War II. Adrian Lewis, a military historian and retired army officer, described it this way: “While the army’s most basic tenet was that man was the ultimate weapon on the battlefield, ground combat was the least desirable American way of war. American beliefs about manhood, battle, and war were at odds with the value placed on young American lives, a value that compels Americans to expend every resource, almost unconditionally to remove man from the battlefield.”13
There is little doubt that technology, especially tactical airpower and armor, did take on some of the combat burden. There is also little doubt, however, that troops in the field were ultimately essential to victory. Major General Walter Smith wrote in an army training memorandum of 1943:
War is a dirty business, and anyone who engages in it must face the facts. It is simply a question of killing or being killed. It cannot be impersonal. To wage successful combat there must be a burning desire to come to grips with the enemy, and to kill him in mortal combat. . . . Battles, large and small, cannot be won entirely by maneuver, or by artillery or air action. Well trained troops cannot be shot or bombed out of a position. . . . [I]t remains for the Infantry . . . to close with the enemy and by use, or threatened use, of the bayonet to drive him from his position. . . . The object of war is to kill the enemy. . . . And the more ruthlessness with which that object is pursued the shorter will be the period of conflict.14
In the first weeks of the war, some communities held farewell parties to acknowledge those who were going off to military service. But “as the war went on the departures came with less fanfare. For one thing, there were so many of them; then, too, the increasing number of family men preferred to say their good-byes privately, in the warmth and intimacy of their homes. But most of all, departing draftees did not care for patriotic send-off ceremonies.”15
Robert Leckie wrote about leaving his home in New Jersey to join the marines in January 1942. His dad accompanied him to the shipping center in New York. “Breakfast at home had been subdued. My mother was up and about; she did not cry. It was not a heart-rending leave-taking, nor was it brave, resolute—any of those words that fail to describe the thing. It was like so much else in this war that was to produce unbounded heroism, yet not a single stirring song: it was resigned. She followed me to the door with sad eyes and said, ‘God keep you.’” After arriving at the New York induction center, “My father embraced me quickly, and just as quickly averted his face and left.” Six months later Robert Leckie sailed with his unit from San Francisco, and in August he was on Guadalcanal.16
The mobilization for this war was comprehensive, and the armed services ultimately met their manpower objectives. Essentially, all age-eligible males were in the pool, even the foreign born who were in the United States illegally. During the war some 16.3 million men and women served in the US armed forces: 11.2 million in the army, including the army air forces; 4.1 million in the navy; and 669,000 in the marines. There were 333,000 women in the military. By war’s end, the army was thirty times its 1940 size, and the navy was twenty times its size in 1940.
The average serviceman had completed more years of education than his civilian counterpart. One careful student of the draft concluded that “the best and brightest did not evade service.”17 Another study confirmed the absence of educational or economic bias in combat. In fact, higher-income and better-educated communities had higher casualties. This was largely the result of the draft and enlistment standards, but it also related to the high casualty rates in the navy and army air forces.18
The military in World War II represented a cross-section of the population. The armed forces expected at least fourth-grade literacy—in two months in the summer of 1941, draft officials rejected 90,000 men who did not meet this guideline. There were also higher expectations for fitness and medical condition. Over the war, some 5 million men failed to pass the physical exams. All of the evidence suggests that these men who failed Selective Service examinations were more likely to be from poorer communities.19
Charles Milton “Stubbie” Pearson graduated from Dartmouth College in 1942. A native of Madison, Minnesota, Pearson excelled in college as a student and as an athlete, serving as captain of the football and basketball teams and graduating Phi Beta Kappa. He was the valedictorian for his class, which graduated early due to the war. Ninety-one percent of his Dartmouth Class of 1942 would serve in the armed forces. Speaking at the commencement ceremony, Pearson urged that no one feel sorry for his generation: “We are not sorry for ourselves. Today we are happy. We have a duty to perform and we are proud to perform it.” He reminded his class that their task was to end the war and then to make the world a better place. “A tomorrow with a ray of sunshine more bright than we have ever seen before.” He hoped to teach or work in education to accomplish this. As a navy pilot in the Pacific, Stubbie Pearson wrote home that war was not glorious, but “a dirty, predatory, slimy job that must be done.” Recipient of the Distinguished Flying Cross for his performance in naval battles at Truk and Palau, he died at the latter place when his SBD Dauntless dive-bomber was hit by antiaircraft fire while attacking a Japanese destroyer. Thirty-three of his Dartmouth classmates, 5 percent of the number who matriculated as freshmen in 1938, would also die in the war.20
Celebrities and stars served in this war. Douglas Fairbanks Jr., Robert Montgomery, Clark Gable, and Jimmy Stewart all enlisted. Stewart and Gable served in the army air forces on combat missions over Europe; Stewart was a pilot, and Gable served as a gunner on B-17s (Nazi leader Hermann Göring offered a five-thousand-dollar reward to any pilot who shot Gable down). William McChesney Martin, the president of the New York Stock Exchange, was drafted. And sports stars such as Bob Feller, Ted Williams, Phil Rizzuto, and Joe Louis were 1-A and eligible for early conscription. The American public was not supportive of any special treatment for celebrities.
In September 1942 Glenn Miller was thirty-eight years old, too old for the draft. Though Miller was one of the most successful band leaders in the United States, he disbanded his orchestra and joined the army. Certainly, he forfeited millions in earnings by taking this step. He said, “I, like every patriotic American, have an obligation to fulfill. That obligation is to lend as much support as I can to winning the war.” Miller said that he had been privileged to live in the United States as a “free man,” and now he would help to protect “the freedom and the democratic way of life we have.”21
Glenn Miller led the army air forces orchestra—a forty-two-man marching band—as well as a nineteen-person dance unit, a radio outfit, a string ensemble, and a small jazz combo. He and his groups engaged in bond drives and entertained troops in the United States and overseas. His band did not play military music but played popular swing that bore Glenn Miller’s musical trademark. It evoked sentimental images of home and of women at home. Miller said that the American GI wanted “songs he used to know played as he used to hear them played.” Glenn Miller was tragically lost when a small plane in which he was flying went down over the English Channel in December 1944.
The Glenn Miller story has many of the elements of those themes that define the American memory of World War II. He was selfless and sacrificing and patriotic, ending up a casualty of that “good war.” But his story also opens a page to some of the less heroic annals of this war. Miller’s music stressed a “clean-cut” swing, and his approach, as one scholar noted, “worked with the government policy of military segregation and its desire not to disturb deeply held racial values.”22 His music was clearly influenced by black music of the era—but his orchestra and groups were all white. Black entertainers such as Duke Ellington and Count Basie along with people like Benny Goodman resisted this segregation. Glenn Miller and the army air forces band did not.
 
 
The US military was segregated in World War II; black soldiers served in all-black units in the army. In the first part of the war the marines accepted no black members, and the navy accepted them only as mess stewards. So in a convoluted system, embarrassingly inconsistent with the values for which the United States was fighting, the army had “race-specific” quotas. It issued these to the Selective Service, which passed them along to establish state and local board goals.
This policy resulted in significant problems in draft calls. It was based on a principle that discriminated against blacks—and, ironically, it had the additional consequence of discriminating against whites, who were disproportionately called up in most communities. Because blacks served only in segregated units, and because at the outset there were few of these units, the demand for black inductees was lower. For example, in one month eleven hundred whites were drafted in Washington, DC, while no blacks were called up.23
Selective Service director Lewis Hershey, who became a negative symbol to draft protesters during the Vietnam War, pressed both Secretary of War Henry Stimson and even President Roosevelt on this matter. He wrote to the president, “It is obvious we must sooner or later come to the procedure of requisitioning and delivering men in the sequence of their order numbers without regard to color.” Neither Stimson nor Roosevelt responded.24
Once inducted, black draftees as well as black volunteers had to deal with significant racism. It was a deeply rooted racism that was sharpened by American society and politics of the 1930s and 1940s. It was a racism that, as in World War I, failed to recognize the distinguished service of black combat units in the nineteenth century. The racism was compounded by the fact that most military bases were in the South.
When black entertainer Lena Horne performed at a base in Arkansas, German prisoners of war attended, but African American servicemen were not permitted at the performance. On the other hand, racism in the armed forces was institutional as much as regional. One black Mississippian who was drafted in 1942 and served in the United States and overseas stated that he “really didn’t know what segregation was like” prior to serving in the army.25
Military racism found further rationalization in the results of the discriminatory education system that most black Americans had experienced. Black inductees generally did not score as well as whites on the army General Classification Test. This affirmed racist stereotypes, even though it surely reflected the inadequacies of the segregated schools at home.
Race was the dominant factor in the assignment of black soldiers. Secretary Stimson argued that black units needed white officers for leadership. General George Marshall thought their low intelligence restricted their value to the army. Similarly, General Patton concluded that they “could not think fast enough for armored warfare.”26 News releases on the war and productions of the federal Office of War Information typically excluded any pictures of black troops.
Despite these stereotypes, black troops did ultimately participate in combat operations. By the fall of 1944 General Patton changed his views and requested that the 761st Tank Battalion, the first black armored unit, join his command. He spoke to the 761st in early November: “Men, you’re the first Negro tankers to ever fight in the American Army. I would never have asked for you if you weren’t good. I have nothing but the best in my Army. I don’t care what color you are, so long as you go up there and kill those Kraut sonsabitches.”27 They did engage the enemy, serving in action for 183 days in four major campaigns, including the Battle of the Bulge, where they halted a crucial German resupply effort. The 761st suffered 50 percent casualties and won a number of individual and unit honors; one of its men, Reuben Rivers, received a Silver Star that was later upgraded to the Medal of Honor.
 
 
It was ironic that black units had to press a reluctant command for combat duty while the army struggled to fill its infantry units. Early in the war, those assigned to the infantry had lower scores on the intelligence tests. They also tended to be shorter and weigh less than those who were sent to the army air forces or to the Service Forces. The Army Research Branch described the infantry as “the dumping ground for men who could pass physical standards but who need not satisfy any other test.” Infantrymen acknowledged this, and 74 percent of them agreed with this survey statement: “The Infantry gets more than its share of men who aren’t good for anything else.”28 Despite this image, those who served in the infantry took pride in their role and in their sacrifices.
The need for strong combat units ended these practices. Beginning in 1943 General Marshall ordered that prime recruits be directed into infantry units. Of course, most of the men and all of the women who served were not in combat units but rather in service ranks. By some government estimates, of the 3 million soldiers who came into western Europe in 1944 and 1945, only one-quarter were in units engaged in fighting. Their logistical and support needs were complicated and demanding. And the army sought to provide American soldiers with “something corresponding to the American standard of living,” in the words of one of the army’s own studies.29
Organizationally, the American army had two service personnel for every combat soldier—and in practice the ratio was often much greater than that. The German Wehrmacht, in contrast, had two combat soldiers for every soldier in service ranks. Every American soldier who landed in Europe required forty-five pounds of supplies per day, a quarter of it petroleum products. The British soldier got by on less than half of this, and the German often on about one-tenth the US amount. “THE BEST-DRESSED, BEST-FED, BEST-EQUIPPED army in the world” was a common boast. Americans were proud that their soldiers served with personal and material support at a level their opponents could not match.30
The term GI, for “government issue,” was used in the early 1940s to describe government-issued equipment or supplies. When first used to describe soldiers, it was pejorative, but it soon came to be adopted by the men themselves as a self-description. In many ways it reflected the absence of any sense of self-importance among these men. One scholar summarized it this way: “The naïve idealism, the noisy confidence of 1917 did not reappear, nor did the impetuosity that led the doughboys to dash forward into their own artillery barrage or to assault machine-gun nests frontally.” These GIs had little interest in discussing a “cause” for which they were fighting, and most observers noted that Hollywood films with posturing heroes generally elicited laughter from GI audiences.31 William Manchester was recuperating in a hospital when John Wayne visited the troops there. Manchester recalled that the troops booed the movie star. “This man was a symbol of the fake machismo we had come to hate, and we weren’t going to listen to him.”32
Early in the war the army was concerned when surveys indicated that men did not want to go overseas and into combat. Ernie Pyle visited one infantry company and reported, “A lot of people have morale confused with the desire to fight. I don’t know of one soldier out of ten thousand who wants to fight. They certainly didn’t in that company. The old-timers were sick to death of battle and the new replacements were scared to death of it. And yet the company went into battle, and it was a proud company.”33 Watching a group of infantrymen walk up a hill, he observed, “In their eyes . . . was no hatred, no excitement, no despair, no tonic of their victory—there was just the simple expression of being there as if they had been there doing that forever, and nothing else.”34
One survey of combat veterans in 1944 asked them what kept them going in combat. The most common answer, given by 39 percent, was “getting the task done.” Some 10 percent talked of getting home. Regard for their “buddies” motivated 14 percent. A sense of duty and their own self-respect were cited by 9 percent, and only 5 percent talked of “idealistic” reasons. Finishing the job and getting home dominated.35
This practical approach to the war was widely recognized. In May 1943 Life editorialized that “when you look over the U.S. as it is today it’s hard to find the real purpose” for the war. And in January 1944 the magazine reported that “the bewilderment of the boys in the armed forces concerning the meaning of the war is noted by almost everyone who goes out to the front.” Fortune also noted that “the American does not know why we are at war and has not sought to know.” An army unit asked its soldiers to write an essay on why they were fighting. One submitted a six-word statement: “Why I’m Fighting. I was drafted.”36
The most popular song of the war was not a stirring military tune but Irving Berlin’s sentimental “White Christmas.” “Don’t Sit Under the Apple Tree with Anyone Else but Me” was also beloved. One critic observed that Glenn Miller’s orchestra was “the greatest gift from home.” His music struck an emotional chord with many soldiers. A GI reported of a Miller memorial concert that the music played was “tied up with individual memories, girls, hopes, schools. It’s a tangible tie to what we are fighting to get back to.”37
 
 
Wars are not fought and won by humming sentimental songs—or martial music, for that matter. The American troops, young and inexperienced in the early campaigns, led by young and inexperienced field officers and noncommissioned officers, learned well the horrible lessons of combat. As Army Field Manual 100-5 put it, “Man is the fundamental instrument in war; other instruments may change but he remains relatively constant. . . . In spite of the advances in technology, the worth of the individual man is still decisive. . . . The ultimate objective of all military operations is the destruction of the enemy’s armed forces in battle.”38
As one student noted, the combat soldier would sometimes “see himself as a warrior and like what he saw.”39 The idealism was “latent” rather than expressive. A marine wrote from Iwo Jima that there was among those who fought there a sense of patriotism, but it was not “the kind that is amassed in the throats of people when our national ensign is unfurled, or like as many sheep, cheer at a passing parade”; instead, it was a feeling that “lies deep and still in the hearts of” the marines.40
Sometimes purpose derived from experience. One army sergeant who liberated a Nazi death camp said, “I never was so sure before of exactly what I was fighting for.”41 In April 1945, Ohrdruf, part of the Buchenwald complex, was the first camp the US forces liberated. Generals Eisenhower, Patton, and Bradley were there, and they encountered thirty-two hundred emaciated bodies thrown in a ditch. Bradley said that Eisenhower turned pale, and Patton went to a corner and vomited. Eisenhower said of these confrontations with true evil, “We are told that the American soldier does not know what he is fighting for, now at least he will know what he is fighting against.”42
Perhaps this was clear by the spring of 1945. There is little doubt that the memory of World War II is underlined, perhaps dominated, by a narrative of the battle against the cruelty and racism of Nazism and the genocidal megalomania of Adolf Hitler. But there is also little doubt that these factors were not really consequential motivators during the war. In fact, they were little understood, and there was no real effort to encourage such an understanding. America did pursue a Europe-first military approach, but this was a strategic military and geopolitical decision made in concert with the Allies rather than an emotional priority.
Because of the attack on Pearl Harbor, the mood of anger and of vengeance was directed toward the Japanese and seldom toward the Germans. These moods and attitudes had been conditioned by a long-standing American racist view of the Japanese—and of Asians. After Pearl Harbor the word most commonly used to characterize the Japanese was treacherous.
Time, in covering the Pearl Harbor attack, asked the question, “What would the people, the 132,000,000, say in the face of the mightiest event of their time? What they said—tens of thousands of them—was: ‘Why, the yellow bastards!’” Even the New Yorker referred to “yellow monkeys.”43 And Admiral William Halsey, soon to be commander of the South Pacific Force, promised after the Pearl Harbor attack that by the time the United States was finished, Japanese would be spoken only in hell, and he rallied troops with the slogan “Kill Japs, kill Japs, kill more Japs.” US Marines picked up the slogan: “Remember Pearl Harbor—keep ’em dying.”44
Americans had heard reports of Japanese atrocities in China and the Philippines as well as in the early accounts of American engagement with Japanese troops at places such as Bataan and Guadalcanal. These stories of Japanese cruelty were largely true. An estimated 35 percent of American servicemen who were imprisoned by the Japanese died in captivity compared to 1 percent of those held by Germans. The average prisoner held in a German prisoner of war camp lost thirty-eight pounds during captivity; his counterpart in a Japanese camp lost sixty-one pounds. (It is relevant to note that the average term for American prisoners in Japanese camps was thirty-eight months, while in German camps it was ten months.)45
Japanese war conduct was indeed marked by horrible stories of treatment of civilians and of prisoners. In most cases, they were true; in all cases, Americans readily accepted them as true, enhancing the already-negative stereotypes. But equally true were accounts of Germans executing prisoners, destroying villages and their inhabitants, and raising anti-Semitism from discriminatory conduct to systematic genocide. Few people in the West thought of these incidents as defining of the German character. John Dower points out that German crimes and atrocities were considered “Nazi” crimes, while Japanese crimes and atrocities were “Jap” behavior.46
Life’s “picture of the week” on May 22, 1944, showed a woman with a Japanese skull that her boyfriend had sent her. It was autographed by him and thirteen others, inscribed, “This is a good Jap—a dead one picked up on the New Guinea beach.” If the Japanese carried out their side of the war with cruelty and brutality, this seldom provides a moral rationale for reciprocating in kind. In this case, however, some argued that it did. Collier’s editorialized in 1945, “The barbarism of your enemy is never an excuse for descending to barbarism yourself—though of course our men in the Pacific have to fight the Japanese devils with fire.”47 John Dower frames the proposition that has no ready rebuttal: “It is virtually inconceivable, however, that teeth, ears, and skulls could have been collected from German or Italian war dead and publicized in the Anglo-American countries without provoking an uproar; and in this we have yet another inkling of the racial dimensions of the war.”48
At the end of the war, the distinguished historian Allan Nevins wrote about the hatred that Americans had expressed toward the Japanese. “Probably in all our history, no foe has been so detested as were the Japanese.” He believed this was the result of the nature of the attack on Pearl Harbor, the reports of Japanese atrocities as well as the way they fought in the Pacific, and, he said, “emotions forgotten since our most savage Indian wars were reawakened by the ferocities of Japanese commanders.”49 With no sense of irony, he linked this war with a historical analogy of truly racist contempt and cruelty.
The United States, particularly California, at the beginning of the war already had a record of more than a half century of discrimination against Asian immigrants. In 1924 Congress prohibited Japanese immigration, following a long period of heavy restrictions. With this heritage, the attack on Pearl Harbor dredged up existing racist and paranoid reactions. Residents on the West Coast expressed their fear of additional Japanese attacks there—attacks that would be enabled and abetted by the Japanese Americans living in the area.
There was not a single act of sabotage or espionage by Japanese Americans during the war. Nevertheless, in February 1942 President Roosevelt signed an executive order that ultimately provided for the 110,000 Japanese Americans living on the West Coast to be removed to internment camps. It was a remarkable decision, indefensible strategically and surely morally.
Of course, people tried to justify the internment of American citizens. As the mayor of Los Angeles, Fletcher Bowron, argued, “If we can send our own young men to war, it is nothing less than sickly sentimentality to say that we will do injustice to American-born Japanese to merely put them in a place of safety” so they cannot harm anyone. After all, the citizens of his city would be “the human sacrifices if the perfidy that characterized the attack on Pearl Harbor is ever duplicated on the American continent.”50
 
 
Early in the war the United States disagreed with the British strategic bombing policy. The British Royal Air Force had engaged in some “area” bombing attacks against German cities, accepting the inevitable civilian casualties in seeking to defeat and demoralize German support for the war. This obviously was part of a retaliatory cycle responding to and inciting German Luftwaffe bombing of London and other English cities. At the time, Americans insisted upon a policy of “precision” bombing of targets, but it would not be long before the United States did engage in some “blind bombing” of its own.51
The United States joined in the horrible firebombing of Dresden in February 1945 even though the city was of limited strategic value. And from the outset, there were fewer constraints on the bombing of Japanese cities. By war’s end, there were none. In fact, in the last several months of the war, some 75 percent of the bombs dropped on Japan were incendiary bombs—aimed at destroying cities and demoralizing the civilian population. As many as 100,000 civilians were killed in Tokyo in the spring of 1945, “scorched and boiled and baked to death,” in the words of General Curtis LeMay, the commander of strategic air operations over the Japanese home islands.52 The atomic bombings of Hiroshima and Nagasaki aimed at destroying morale rather than military installations.
 
 
Those on the American home front mirrored the soldiers in their practical view of the conflict. The period following the First World War had been marked by a great deal of cynicism, a belief that propaganda had led the United States into a presumed fight for democracy that proved illusory. There was little disposition to jump on this bandwagon again. The war theme was that the United States had been attacked: let us defeat the enemy and get this over with so that the troops can come home. Early in the war the Office of War Information did stress the need to defend democracy, but most political and military leaders did not pick up this expressed objective. One study during the war concluded, “There is much cynicism to overcome. Most men of military age grew up in the midst of disillusionment about the Great Crusade of a generation ago.”53
Nonetheless, at home at least, the positive themes were part of the narrative. In 1942 some 63 percent of Americans agreed that the country was fighting for an “ideal.” Franklin Roosevelt told Congress in January 1942 that “only total victory can reward the champions of tolerance, and decency, and faith.” Roosevelt spoke often of the “Four Freedoms,” goals he had set out as universal in January 1941. These were freedom of speech, freedom of worship, freedom from want, and freedom from fear. In the late winter of 1943 Norman Rockwell painted four illustrations for the Saturday Evening Post that evoked these four freedoms. These images became postage stamps and illustrated the posters for bond drives.
These themes never quite made it to the front. Army surveys found that only 13 percent of soldiers could name three of the four freedoms. James Jones said no one wanted to die for a “cause,” because “after you are dead there is no such a thing as Liberty, or Democracy or Freedom.”54 Life reporter George Biddle wrote in January 1944 that folks back home might do better not to think of the troops as combat heroes: “They might better visualize them as miners trapped underground. They are always frightened and they are always homesick. Their one dream and ambition is not to march on Berlin, as propaganda stories say, but to go home.”55
Most citizens joined the servicemen and -women in avoiding celebration. This low-key approach ironically left a vacuum in such a major national effort. The Infantry Journal wondered about the absence of parades and crowds sending the boys off: “There has not been much of this in the present war.” And columnist Raymond Moley criticized the absence of patriotic sendoffs: the United States had become a place where its soldiers going to war “pass silently through drowsy stations in the night; tank, plane, gun production is veiled in the smokescreen of censorship; flags are seemingly rationed; and there are no more parades.”56
On the home front the official reports of the war were sanitized and censored. It was only in the fall of 1943 that the government censors allowed photos of dead Americans. Even these pictures were often bloodless—to protect the public from the horror and reality of war.57 Censors restricted any mention or photos of men maimed in combat or any indication of racial or other tensions on the American bases. There were to be no photos that suggested a “shell-shocked” GI. The military censors “kept emotionally wounded Americans out of sight throughout the war and after.” Their “efforts went into presenting the war in simple terms of good versus evil.” General Eisenhower ruled that the only photos of casualties permitted would be of those who “are walking wounded or are obviously cheerful.” He made clear that “photographs of a horrific nature are always stopped.”58
It turns out that wars without drumbeats and parades, lacking pep talks and speeches stressing national destiny and moral purpose, and cleansed of all signs of combat, are hard to sustain. As late as March 1944 a Gallup survey determined that only 60 percent of Americans acknowledged understanding what the war was really about. There had already been a movement to allow more public glimpses of the war, at least some parts of it. In 1942 the Office of War Information told the Advertising Council that its copy should avoid images of Americans suffering. But within a year, in order to avoid public complacency, the federal agency told the council it could introduce a “grim note” into its promotional advertising—and in war loan ads that year, the first dead American was shown. Following this the photos and illustrations showed more graphic images of war, and by the last year of the war there was a government poster “showing the crumpled, torn, dirt-splattered body of a dead American soldier.”59
Despite these real images, “for the most part Americans at home saw photos and films of the GIs as jaunty heroes or gaunt but unbowed warriors. They read in the dispatches of war correspondents like Ernie Pyle, John Steinbeck, or John Hersey about young men who were wholesome, all-American boys, soft-hearted suckers for needy kids, summer soldiers who wanted nothing more than to come home, as one of them famously told Hersey, ‘for a piece of blueberry pie.’” Steinbeck later reflected that it was not that the correspondents lied, but that “it is in the things not mentioned that the untruth lies.” He acknowledged, “Since our Army and Navy, like all armies and navies, were composed of the good, the bad, the beautiful, the ugly, the cruel, the gentle, the brutal, the kindly, the strong, and the weak, this convention of general nobility might seem to have been a little hard to maintain, but it was not. We were all a part of the War Effort.”60
It is this that makes the war so complex and difficult to describe. It was a war whose public face was scrubbed of blood, yet even if it was practical it was not passionless: it was always marked by a profound sense of anger and resentment toward at least one of the enemies. And those on the home front felt driven to sacrifice, or to claim sacrifice, in support of “our boys.”
 
 
If the concept of sacrifice was a source of public pride, the act of sacrifice could be more complicated. Some resisted taxes, regulations, and rationing. President Roosevelt expressed his frustration with “the whining demands of selfish pressure groups who seek to feather their nests while young Americans are dying.”61 Surely, the president’s rhetoric involved some politically advantageous posturing—political figures in American history have seldom been punished for siding with the troops in the field—but it also represented some genuine frustration with wealthy Americans who were becoming wealthier as a result of the wartime economy.
World War II costs were eight times those of the First World War. Early in the war, Americans stressed themes of shared sacrifice, and these themes continued to mark public affirmations. Following Pearl Harbor, Americans recognized the need for more revenue to pay for the massive mobilization that would be required. Some proposed a national sales tax, but President Roosevelt and his administration and congressional allies made clear that this was a nonstarter because it would place a greater burden on poorer families.
President Roosevelt proposed a 100 percent tax on all salaries greater than $25,000 (or $50,000 for families). Only about one American in fifty thousand had income at this level. Congress defeated the plan with a veto-proof vote, Democrats joining Republicans. Even in defeat, it was a good move on Roosevelt’s part, making it easier to demand sacrifices from labor as well and signaling that his administration would not tolerate “war profiteering.” As a defensive move, advertisers formed the War Advertising Council to administer “public service” advertisements on the part of business and industry. The council organized more than one hundred campaigns underlining sacrifice and patriotism, seeking “to push war bonds, blood drives, food conservation, labor recruitment, and other mobilization demands deemed worthy of advertising support,” which they estimated at a value of $1 billion.62
Beginning in 1942, with subsequent adjustments throughout the war, the government moved to the income tax as the major source of federal revenue. It provided 13.6 percent of revenue in 1940, and this figure grew to 40.7 percent by the end of the war. During the war, Congress raised the tax rates regularly and reduced exemptions. Millions of Americans who had not paid any taxes in 1940 became taxpayers.63
More and more Americans were subject to the income tax under the wartime legislation. There was a major effort to remind people that this too was a cost of war. One radio announcement put it this way: “Well nobody says filling out these forms is fun. But it’s more fun than sitting down in a foxhole, and it’s more fun than being shot down in a plane. And it’s more fun than waiting for a torpedo to hit.” And the Treasury Department even commissioned Irving Berlin to write a song for the radio, “I Paid My Income Tax Today,” as a way to rally support.64
Increasingly high rates on upper-income groups did create resistance from some. Early on, President Roosevelt ratcheted up the pressure on the wealthiest Americans, frustrated with their efforts to cushion their tax burden. In a September 1942 fireside chat he said, “Battles are not won by soldiers or sailors who think first of their own safety, and wars are not won by people who are concerned primarily with their own comfort, their own convenience, and their own pocketbooks.”65
In 1943 Congress approved a withholding system, a necessary step in the minds of many if there was going to be an effective income tax involving most wage earners. Securing approval of this required a compromise that forgave some taxes that were due, a bonus particularly for the wealthiest taxpayers. Roosevelt vetoed a later bill that would have shifted more of the tax burden onto lower-income Americans. The president insisted that it was a relief bill for the “greedy.” Congress overrode his veto. In 1939 there were fewer than 4 million Americans who paid income taxes. In 1945, 42 million people paid. Federal taxes took 4 percent of gross domestic product in 1941 and 30 percent by 1943.66
Citizens supported and sacrificed in ways that went beyond taxes. All Americans had to join in the rationing of most food, gasoline, and rubber products. They joined in scrap drives to recycle needed materials, and they took part in seven major Treasury Department bond drives. Roosevelt and the administration wanted all citizens to share in making loans to the government. It was a way to encourage support for the war effort, and it would be a way to avoid having people taxed after the war to repay debt held only by wealthy investors. The United States sold twenty-five-dollar bonds and even ten-cent stamps to schoolchildren. By the end of the war some 85 million Americans had bought war bonds. This came to be an important sign of shared commitment.
 
 
On May 8, 1945, the war in Europe ended. Americans, along with other Western troops, joined the Russian army in Berlin, drinking vodka together in the rubble of the Thousand-Year Reich. On August 15, following the atomic bombings of Hiroshima and Nagasaki, Japan surrendered; the final documents were signed on the USS Missouri in Tokyo Bay on September 2. The planned assault on the home islands, with its extremely high casualty projections, never had to take place. When US troops on Okinawa learned of the Japanese surrender, they fired all of the weapons they had into the air. Seven men were killed by the falling bullets and scores were wounded, the last casualties of a long war.
Nearly 292,000 Americans died on the battlefields or in engagements with the enemy at sea or in the air during the war. Another 114,000 died in service but not in theater. And some 671,000 were wounded in combat. It was a costly war: some 2.52 percent of those Americans who served died in it. In World War I the figure had been 2.46 percent, and in the Civil War, Union and Confederate combined, 15.02 percent.67
My father came home from Germany in December 1945, having first been held to see whether his unit would be sent to Japan for that war and having then waited out the logistical nightmare of demobilization. I remember a song he taught me more than sixty-five years ago, one the troops sang in Germany in the late summer and fall of 1945:
Oh Mr. Truman, why can’t we go home?
We have conquered Berlin, Tokyo, and Rome.
We have kaput the Master Race,
And now they say there is no shipping space!
Mr. Truman, why can’t we go home?
Getting the troops home was complicated, but it was also done relatively smoothly and quickly. People in government had been worried for some time about what all of these servicemen would do when they came home. Some feared that resentment was building between those who were serving and those who were not. A Red Cross worker in Italy had written home that “many a G.I. thinks every man back home is a 4-F making easy and overlarge war profits. This is a frightening indication of the growing gap in understanding and mutual tolerance between the civilian and the man in uniform.”68 It is not clear why this gap was described as “mutual”—there is not much evidence that the home front had developed an intolerance for those in uniform—a fear perhaps, or in some cases, a sense of guilt.
One GI wrote after a Glenn Miller Orchestra performance in England that men were crying and thinking of home. This was about more than sentiment; it was about hopes. “You owe these guys when they get back, not so much money or gadgets, but a shot at the way of life that many of them have been dreaming about.”69
In July 1943, in the middle of the war, President Roosevelt surprised many, as he was always capable of doing, by arguing in one of his fireside chats that it was necessary to begin planning for the veterans’ return to civilian life. He and others were concerned about the capacity of the economy to provide appropriate employment opportunities for these men and women. They must not be “demobilized into an environment of inflation and unemployment, to a place on the breadline or on a corner selling apples.”70 In contrast to any previous American war, the president was contemplating a payment to all veterans upon mustering out of the service, as well as perhaps a year of college or other training and unemployment benefits.
Roosevelt continued to wrestle with the question that had troubled him early in his presidency with the Bonus Expeditionary Force: how much did the country owe to able-bodied veterans? He also recognized the need to initiate some program for them. In the fall of 1943 he said, “We have taught our youth how to wage war; we must also teach them how to live useful and happy lives in freedom, justice, and decency.”71
Over the next several months Congress and the administration worked on developing a comprehensive veterans program. Partisan politics played a role, not surprisingly, in the debate, and the American Legion became a major participant in the shaping and passage of legislation. Observers at the time judged the Legion’s wartime lobbying more influential than any they could recall from any group. The Legion had early on raised concerns about support for veterans returning with serious injuries, and that effort positioned it well to push next for benefits for the able-bodied.
Legion officials even raised implicit threats: “God knows what will happen” when these trained killers came home and discovered that there was no provision to enable them to recover the lives they had lost. The Legion insisted that the veterans were not after “gratuities” but were entitled to “what is justly due them.”72
On the other hand, the Disabled American Veterans were concerned about any legislation that would provide support for the able-bodied at the cost of programs for the disabled and seriously injured. The DAV explicitly questioned whether those who had served briefly and without injury, “the lazy and chisely” types, should become beneficiaries of a new program.73
There were ongoing challenges raised by political leaders, often for partisan or sectional reasons, about increasing federal authority and its bureaucracy, about states’ rights, and, predictably, about race. Congressman John Rankin of Mississippi was a major force in the drafting of the bill. His concerns focused on race, especially on the program providing unemployment benefits for veterans, since he assumed that the black veterans from his state would not work, and he surely was troubled by any educational benefits that would provide for integrated colleges and universities. Senator Bennett Champ Clark of Missouri insisted that the entire legislation was held up “based entirely upon the hatred of certain Congressmen for the colored portion of our armed forces.”74
A conference committee finally agreed upon a bill—despite Rankin’s obstruction—and Roosevelt signed the legislation on June 22, 1944. It was officially called the Servicemen’s Readjustment Act of 1944. But the American Legion had already titled it the “GI Bill of Rights,” a label that was far more politically popular.
This legislation expanded traditional medical and disability programs and went far beyond them, providing for a significant investment in the transition of all veterans back into American society. The GI Bill provided for up to fifty-two weeks of unemployment benefits; established a government-guaranteed, low-interest loan program for the purchase of homes, farms, or businesses; and offered a comprehensive and generous plan to support education or training for veterans.
Veterans legislation in previous wars had provided for medical support for those in need and, often later, pensions for elderly veterans. The 1944 GI Bill did provide medical support, but it also set a new standard with an investment in young, healthy veterans; the only clear precedent might have been the land grants made available during the nation’s first seventy-five years. As David R. B. Ross notes, government-veteran relations changed: “The 1940–1946 period represents the crucial turning point. For the first time the government anticipated the needs of all its veterans. The notion that the disabled alone needed aid was discarded.” Ross points to this as a consequence of the New Deal philosophy of government responsibility.75
Interestingly, some of the opposition to the GI Bill came from the higher-education community. Robert Maynard Hutchins, president of the University of Chicago, worried that the bill would “demoralize education and defraud the veteran.” He was concerned about veterans going to school if they could not find jobs, resulting in colleges becoming vocational schools and, even worse, “educational hobo jungles.” James Bryant Conant of Harvard was more cautious in raising some of the same concerns. He believed college attendance should be based on demonstrated academic accomplishment rather than military service and hoped that the GI Bill would cover only “a carefully selected group.”76
The politics of passing the GI Bill was marked by compromise, a lubricant of democracy but one that often introduces selective provisions and qualifications. The legislation and the understandings that accompanied it made clear that Jim Crow laws would not be challenged. The VA ruled that benefits were not available to gay servicemen who had received “blue,” or undesirable, discharges, the fate of those determined to be homosexual. And there were numerous allegations and incidents of fraud and abuse in GI Bill programs.
Time ran a story on the first GI Bill graduates in 1947 that took a decidedly negative view of their experiences, using phrases such as “crammed in crowded Quonset huts” and “grim experience.” It reported that veterans insisted they were simply trying to get to where they would have been if not for the war. (Obviously, any veterans graduating in 1947 would have matriculated and taken classes before their military service.)77 Despite these early concerns, scholars have found the experience to have been positive for most of these students and one that did open opportunities for their lives.78
 
 
While the postwar years were marked by some substantial adjustment problems and labor unrest, little of this was directly the result of veteran activism. No new veterans groups were organized by the World War II veterans; the Veterans of Foreign Wars and the American Legion represented them. At midcentury the GI Bill was the largest entitlement program to that point in American history (Social Security would shortly surpass it, but it had not yet done so). In 1950 some 25 percent of federal expenditures went to the support of veterans programs, and in the several years following the war the Veterans Administration had more employees than any other government agency. In 1950, 71 percent of federal payments to individuals went to veterans through the various veterans programs.
A Veterans Administration study in 1955 noted that 15,750,000 veterans had been eligible for benefits under the GI Bill. Of these, more than half, 8.3 million, received unemployment help; slightly less than half, 7.8 million, received education or training benefits; and some 4 million received support with home, farm, or business loans.79
Most commentators remarked on how easily veterans were reabsorbed. One scholar, Lee Kennett, believed the process was smooth
because in his heart and in his mind the G.I. had never left home; the military way of life mostly repelled him, and the foreign cultures he encountered did not appeal to him more than his own, so he returned pretty much as he had departed. Also, he came home to a genuine welcome, the sort any hometown accords to members of a winning team after a well-played game. In material terms his conversion to civilian life was made easier in the economic expansion and prosperity that continued a quarter-century after the war. Then, too, he enjoyed an unprecedented bounty of government programs to help him fit back into civilian society—everything from educational benefits and low-interest loans to medical care and the “52-20 Club” ($20 a week in unemployment benefits for up to 52 weeks).
Kennett quotes from Robert Havighurst’s study of veterans that concluded there was “remarkably little difference in the adjustment of veterans and nonveterans four years after the close of the war.”80
These positive experiences were not shared by all who came back from the war. The postwar period was marked by significant labor unrest, as unions were now free to press for higher wages; some 5 million workers went out on strike during 1946, and veterans who belonged to the striking unions joined them. Racial tensions also flared. Many black veterans came back to the South, where they insisted upon more respect and recognition. They seldom received it. Beatings and lynchings increased, and veterans were commonly the victims. The Tuskegee Institute counted six lynchings in 1946, and some have argued there were four times that many. The NAACP reported that two-thirds of the lynching victims were veterans.81
It is noteworthy that nonwhite veterans used GI Bill benefits at a higher rate than white veterans, even in the South. “Even as racial segregation persisted in the United States, therefore, the G.I. Bill gave African Americans greater opportunities to acquire education and training than they had ever known.”82 Nonetheless, black veterans encountered systematic indifference or hostility in the various government offices, as well as outright racism and exclusion, and not only in the South. Their success in taking advantage of the GI Bill is thus also a tribute to their determination.83
 
 
As had been true in all previous wars, Americans did seek ways to set up monuments that would salute and memorialize this war. The postwar memorials of World War II focused on the theme of sacrifice. John Bodnar concludes, “The veneration of national sacrifice stood above reminders of personal loss.” He states that the major monuments and cemeteries “performed well the cultural work of turning the tragic aspects of war into honor and heroism and diminishing the reality of suffering. Virtue and strength stood above violence and death.”84
The American Battle Monuments Commission, established after World War I, was in charge of official monuments. Its “goal was to transform tragedy into honor and mass death into national pride.” The inscription at the US cemetery near Florence, Italy, asks that visitors there “not mourn with the parents of the dead who are with us. . . . Rather, comfort them. Let their burden be lightened by the glory of the dead, the love of honor.”85
The Marine Corps War Memorial became a symbol for much of the debate about the art and memorials of the war. The Marine Corps wished to have a statue based on the Joseph Rosenthal photo of the flag being raised on Mount Suribachi on Iwo Jima. This photo, which captured a second flag raising rather than the first and was often mistakenly described as staged, became one of the most important and iconic pictures of the war. It had been a costly, horrible battle for this small island. Some 30,000 marines went ashore there; 5,931 were killed and 16,168 wounded in action. The Japanese fought until essentially all of them, more than 20,000, were dead.
The marines, other veterans groups, and political leaders managed to overcome the various national and Washington arts groups that resisted such a memorial, and Felix de Weldon’s statue went up in Arlington. Despite the nearly 6,000 marines who died on Iwo Jima, some 29 percent of all the marines killed in the war, Bodnar points out, there are no names on this statue and no “hint there was tremendous loss of life.” Instead, the memorial names the major battles in Marine Corps history and the “uncommon valor” the marines displayed at Iwo Jima.
One of the speakers at the 1954 dedication of the Marine Corps Memorial pointed across the Potomac to the monuments there. The statue, he said, preserved the memory of six “small-town boys” whose heroic act was a tribute to the strength of ordinary Americans when the heritage of Washington, Jefferson, and Lincoln is threatened. “Beneath this towering monument, on which a fleeting moment snatched from real life has been preserved in bronze, the ordinary man stands small and humble. The heroes, in their hour of greatness, quite fittingly loom like giants.”86 Many communities put up granite markers with the names of those who had served in the war and a special recognition of those who had died. There were no GI Joe statues to compete for park space.
The National World War II Memorial was dedicated in 2004, too late for many of the war’s veterans to see it. It occupies an honored place on the National Mall and is traditional, even magisterial, in its scale and design. A theme engraved in marble intones: “Americans came to liberate, not to conquer, to restore freedom and to end tyranny.” The memorial celebrates battles and victories and features a wall with 4,000 gold stars symbolizing the 400,000 Americans who died in the war. The inscription reads, “Here we mark the price of Freedom.”
As was the case after World War I, the United States did proceed with plans to have permanent military cemeteries overseas. There were 288 temporary cemeteries at the end of the war. The American Battle Monuments Commission decided that there would be no permanent cemeteries in Germany or Japan because they had been enemies during the war. On the other hand, an Italian cemetery was appropriate because Italy had joined the Allies before the end of the war.
There were finally fourteen permanent overseas cemeteries, ten of them in Europe. There was one in North Africa, in Tunisia, then a French protectorate. The three Pacific cemeteries were in the Philippines, Hawaii, and Alaska. There was no permanent cemetery on Iwo Jima, Okinawa, or the other major Pacific battle sites; the commission believed they were too remote for families to visit. The American Legion agreed: these places were “desolate,” they were subject to “extremely hazardous forces of nature,” and finally they were just too far from “civilization.”87
Perhaps the most prominent of these overseas cemeteries is the one on the Normandy coast at Colleville-sur-Mer, above Omaha Beach. In the extremely difficult fighting of June 1944 along the shore and on the bluffs, thousands of Americans were killed within sound of the sea. In this immaculate cemetery, marked by white crosses and Stars of David, rest 9,387 American military dead; another 1,557 names are inscribed on the Wall of the Missing. It is a place of peace and tranquillity, with reflecting pools and grand open structures of limestone, granite, and marble, bounded by green grounds and the slope to the beach below. It is marked by a bronze statue, not of a warrior but of a soaring figure evoking the memory of those who died too young, “The Spirit of American Youth Rising from the Waves.” Colleville-sur-Mer is a solemn place of memory, one that whispers of sacrifice more than it shouts of heroism. The sacrifices are indelibly marked, if abstracted into anonymity by the scale and the quiet grandeur of the grounds.
Following the war, many people, including veterans and families of deceased servicemen, believed that permanent graves near the comrades with whom the dead had served would be most appropriate. Many others, of course, wanted the remains brought back for burial at “home,” either in a private cemetery or in a US military cemetery. In October 1947 some 400,000 people gathered in New York for a parade and a ceremony in Central Park, greeting the first ship from Europe carrying war dead. There were 6,200 coffins on this ship. One was randomly selected to be placed on a caisson and paraded to Central Park. A reporter who watched the unloading of this first coffin wrote that “women who saw this wept openly and men turned away.”88 Presumably, they turned to weep as well. In 1958 an unknown soldier from World War II, along with one from the Korean War, was placed at Arlington National Cemetery alongside the World War I unknown.
The American Battle Monuments Commission was typically insensitive to Jewish concerns. The chapels at the military cemeteries were marked largely with Christian symbolism. The unknown graves—some 10,000 of them—were originally marked by a Star of David or a Christian cross based on the known proportion of Jewish and Christian dead in that campaign. Under pressure, though, in 1949 the commission ruled that all unknown graves would be marked by a cross so that no unknown Christian remains would need to rest under a Star of David.
 
 
As far as I know, it was Tom Brokaw who first used the phrase the greatest generation to describe the World War II generation. He covered this remarkable group at the fortieth and fiftieth anniversaries of the Normandy landings, and he wrote a book filled with rich interviews and memories of the war years.89 This was the generation that had endured the Depression, fought and won the war, and gone on to have a profound impact on every area of American life. Like Brokaw, I remember them as my father’s generation, and I too can attest to their accomplishments. But in describing them as the greatest, I would prefer Studs Terkel’s qualifying quotation marks. Or perhaps we should simply remember them for what they surely were: a great generation.
I have a historian’s caution about superlatives. Simply looking at American history, I am struck by the enduring contributions of the Revolutionary War generation and the remarkable sacrifices of the Civil War generation. Each went on to alter American society fundamentally following their wars. Those who fought at Normandy and Iwo Jima and a score of other places, in the air above them and the seas around them, were clearly remarkable in their dedication and their sacrifice. It is impossible to compare them with those who fought at Bunker Hill, Gettysburg, the Chosin Reservoir, or Khe Sanh—or in the hills of Kandahar Province in Afghanistan, for that matter—in terms of assessing the “greatest” courage. Let us simply recognize the sacrifice and courage that marked each—and still does.
The debate over relative greatness is more than a historical parlor game or word exercise; it is a conversation with consequences. My interest in this book is in the way we interpret and remember war. Memory is about more than personal recollections; it is about the way societies and cultures think about their past and their legacy, which can indeed shape subsequent understandings, choices, and behavior.
The deification of the World War II generation—or, more critically, the sanctification of their war—can influence the way Americans over the years following the war think of power and responsibility, think about the nature of war, and think about those who fight wars. This is not about simply setting the bar too high—that is usually a fine way to encourage accomplishment. But it is about setting the bar in a place where we fail properly to understand the lessons of World War II and the impact of that war on the world we have inherited. Many scholars of World War II have cautioned us about the way we remember.90
It is really the veterans who have carried the heaviest burden. They did move on with their lives, and many lived lives of tremendous accomplishment and continuing contribution. But many veterans kept their memories hidden, to protect their families from knowing, to protect themselves from remembering. They succeeded, if at all, only in protecting others from knowing their experience. James Johns wrote about watching men leave the horrible battleground of Guadalcanal: “They were dirty, sick, ragged; their eyes looked as if they had been to hell and back. I remember wondering how they could ever be the same again. . . . Many of them never were. I don’t suppose any of us ever were.”91 Of course, the more the hiding of memories worked, the harder it became for Americans to understand and to know, truly know, this war. It was not a glamorous war. It was savage and dirty, and sometimes those fighting it demonstrated uncommon courage and sometimes uncommon cruelty. It was a war.
William Manchester admitted more than forty years after the war that he declined to join a reunion in Okinawa because it would be jointly held with Japanese veterans. “There are too many graves between us, too much gore, too many memories of too many atrocities.” He had earlier agreed to meet with a Japanese veteran, now a businessman, when he was at a gathering on Guadalcanal:
I had expected no difficulty; neither, I think, did he. But when we confronted each other, we froze.
I trembled, suppressing the sudden, startling surge of primitive rage within. And I could see, from his expression, that this was difficult for him, too. Nations may make peace. It is harder for fighting men. On simultaneous impulse we both turned and walked away.
I set this down in neither pride nor shame. The fact is that some wounds never heal.92
One of Studs Terkel’s interviewees talked about veterans with seriously disfigured bodies who were being treated in a Pasadena hospital. When these men would go out on the street, people would look away. Some even wrote letters to the local newspaper asking, “Why can’t they be kept on their own ground and off the streets?” The interviewee, an army nurse, told Terkel, “It’s only the glamour of war that appeals to people. They don’t know real war.”93