Chapter Two
PATHS TO THE PRESENT
For your Fourth of July reading, open a mind-opening book about an immensely important American war concerning which you may know next to nothing. King Philip’s War, the central event in a bestseller that is one of this summer’s publishing surprises, left a lasting imprint on America.
Americans in this era of sterile politics have an insatiable appetite for biographies of the Founders. But why are so many readers turning to a book—Mayflower: A Story of Courage, Community, and War, by Nathaniel Philbrick—that casts a cool but sympathetic eye on an era usually wrapped in gauzy sentimentality?
One reason might be that it is fun to read about one’s family: Philbrick estimates that there are approximately 35 million descendants of the passengers on the Mayflower. (Do the math: 102 passengers; 3.5 generations in a century. But remember, 52 passengers died of disease and starvation before the first spring.) Perhaps a second answer is that the story is particularly pertinent as America is engaged abroad in a clash of civilizations, and is engaged at home in a debate about immigration and the common culture.
“In the American popular imagination,” Philbrick writes, “the nation’s history began with the Pilgrims and then leapfrogged more than 150 years to Lexington and Concord and the Revolution.” That version misses, among much else, the history-turning fourteen months of war in 1675 and 1676 that set in train events that led to Lexington and Concord. The war was between the English settlers and the Pokanoket Indians led by Metacom, whose English name was Philip.
In a six-decade downward spiral of mutual incomprehension and unintended consequences, the uneasy but growing coexistence of English settlers and Native Americans dissolved in mutual suspicions, conflicts, and retaliations. During the war, the colony lost 8 percent of its men (compared to the 4 percent to 5 percent of adult men killed in the Civil War). But Native Americans fared far worse. Of the 20,000 in the region at the war’s beginning, 2,000 died of wounds, 3,000 of sickness or starvation, 2,000 fled west or north—and 1,000 were shipped to the West Indies as slaves. Taxation and other costs of the war so injured economic life that a century passed before New England’s per capita income returned to the prewar level.
Philbrick writes that after ethnic cleansing, or at least ethnic sorting out, there were no friendly Indians as buffers between the settlers and the unfriendly ones. So the settlers were forced to look to London for help. Soon a royal governor was appointed to govern New England. Then came irritating taxation—of stamps, of tea—and arguments about representation. Exactly one hundred years after King Philip’s War ended, the United States began.
But an American frame of mind began in 1623. Mayflower illustrates a timeless fact of politics everywhere—the toll reality takes on ideology—and a large theme of American life: the fecundity of individualism and enlightened self-interest.
The first important book-length manuscript written in America was Of Plymouth Plantation, the journal of William Bradford, the colony’s governor for nearly thirty-six years. Not published in full until 1856, it was then avidly read by a nation bent on westward expansion and fearing civil war.
In a section on private versus communal farming, Bradford wrote that in 1623, because of a corn shortage, the colonists “began to think how they might raise” more. After much debate, they abandoned their doctrine, which they brought with them on the Mayflower, that all agriculture should be a collective, community undertaking. It was decided, Bradford wrote, that “they should set corn every man for his own particular, and in that regard trust to themselves.” That is, they “assigned to every family a parcel of land,” ending communal cultivation of that crop.
“This,” Bradford reported, “had very good success, for it made all hands very industrious, so as much more corn was planted than otherwise would have been by any means.” Indeed, “the women now went willingly into the field, and took their little ones with them to set corn; which before would allege weakness and inability; whom to have compelled would have been thought great tyranny and oppression.” So began the American recoil from collectivism. Just three years after the settlers came ashore (not at Plymouth Rock, and far from their intended destination, the mouth of the Hudson), they began their ascent to individualism.
So began the harnessing, for the general good, of the fact that human beings are moved, usually and powerfully, by self-interest. So began the unleashing of American energies through freedom—voluntarism rather than coercion. So began America.
[JULY 4, 2006]
The twelve-second flight one hundred years ago this morning reached a height of just 10 feet, less than the 63-foot height of a Boeing 747, and covered just 120 feet of ground, less than a 747’s 195-foot wingspan. But the Wright brothers’ fourth and final flight that day in North Carolina lasted fifty-nine seconds and went 852 feet. So by sunset the twentieth century’s themes—farther, faster, higher, now—were, so to speak, in the air.
Almost everything—commerce, war, art—would change as aviation began altering, as nothing had ever done, humanity’s experience of the most basic things: time and space. Politics, too. The first important politician to campaign by air was a militant modernist, Adolf Hitler. The newsreels screamed: “Der Führer fliegt über Deutschland.”
Aviation’s infancy was not for the fainthearted. In the early 1920s, an airmail pilot named Dean Smith, on the Chicago-to-Omaha route, cabled his superintendent:
“On trip 4 westbound. Flying low. Engine quit. Only place to land on cow. Killed cow. Wrecked plane. Scared me. Smith.”
Airmail was one way government subsidized aviation, which drew government into deep involvement with technology. So, of course, did the great driver of social change, war. In their new book, Reconsidering a Century of Flight, Roger D. Launius and Janet R. Daley Bednarek note how rapid was the development of the airplane “from a machine in some ways most lethal to those who used it to a machine of great lethality to those against whom it is directed.”
In 1905, the Wright brothers testified to Congress that airplanes’ military uses would be “scouting and carrying messages.” Forty years later, cities would be laid waste from the air. But city bombing was not as lethal as was feared. In April 1939, the British government, anticipating city bombing, issued to local authorities 1 million burial forms. The actual British casualties from aerial bombardment, 1939–1945, were sixty thousand.
One early theory, refuted by experience, was that strategic bombing might make wars less bloody by bypassing bloody clashes between armies, such as the First World War’s trench warfare, and instead quickly inducing an enemy’s surrender by disrupting his “vital center.” The fallacious assumption was that modern economies and societies are fragile.
It was nearly a century after Kitty Hawk, and due less to developments of aircraft than of munitions, that military aircraft really became lethal for targets smaller than whole cities. Until recently, the question about bombing was how many sorties it would take to destroy a target. Suddenly, because of precision munitions, the question is how many targets one sortie can strike. In World War II, about one bomb in four hundred landed close enough to affect—not necessarily destroy—its target. Now nine out of ten do.
The most astonishing consequence of aviation is not its military applications or their civilian echoes. (After World War II, Harley J. Earl, General Motors’ chief stylist, turned his fascination with the twin tails of the P-38 fighter into automobile tail fins that defined the chrome-plated 1950s.) Rather, the amazing consequence was the banality of flight—the routinization of mobility—especially after 1958, when Boeing’s 707 speeded the democratization of air travel. Unfortunately, this had some negative public health consequences because viruses—HIV, for one—also became mobile.
From the first, flight expressed the essence of the modernist movement—freedom understood as the absence of limits, and a future of infinite possibilities. While developing cubism, Pablo Picasso sometimes painted wearing aviator’s gear. The April 26, 1937, bombing by Germans of Guernica during Spain’s civil war—a rehearsal, or overture, for what was soon to come from Europe’s skies—moved Picasso to produce what may be the twentieth century’s iconic painting.
Cubism itself was influenced by a perspective no previous generation knew, that of the earth—the geometry of its urban grids and rural plots—seen from above. The Eiffel Tower had provided Europeans their first downward vision of their environment. Robert Hughes, the art critic, says that what was spectacular was not the view of the tower from the ground but the view of the ground from the tower. Until then, almost everybody lived their entire lives no more than forty feet—the height of an ordinary apartment building—above the ground.
Modernism shaped another expressive activity that flourished in tandem with aviation, the competition to build the tallest skyscraper. In Manhattan, epicenter of the competition, the race was eventually won by the twin towers of the World Trade Center, where ninety-eight years after Kitty Hawk the histories of aviation and architecture intersected.
[DECEMBER 17, 2003]
When giving thanks this year, think of Lena Woebbecke. She and many others paid a terrible price for misreading the prairie sky on the afternoon of January 12, 1888.
That day was unseasonably balmy, by prairie standards—some temperatures were in the twenties—and many children scampered to school without coats or gloves. Then, at about the time schools were adjourning, death, in the shape of a soot-gray cloud, appeared on the horizon of Dakota Territory and Nebraska.
In three minutes the temperature plunged eighteen degrees. The next morning hundreds of people, more than one hundred of them children, were dead beneath the snowdrifts. David Laskin, a Seattle writer, reconstructs this tragedy in a terrifying but beautifully written new book, The Children’s Blizzard.
It picks up the many threads of the story in Norway, Ukraine, Germany, Vermont, and other tributaries to the river of immigration set in motion partly by the 1862 Homestead Act. In return for an $18 filing fee and five years farming, the act conferred ownership of 160 acres. By the tens of thousands the homesteaders came, to live in sod houses, heated by burning buffalo bones and twisted hay.
Of immigrants, the saying was that the cowards stayed home and the weak died on the way. One in ten crossing the Atlantic in steerage did die. But Laskin says “the mystique of the Dakotas” was such that the territory’s population nearly quadrupled in the 1880s. Those who made it, with a trunk or two and the clothes on their backs, reached towns that were perishable scratches on the prairie. They got land, freedom, and hope.
And prairie fires. And grasshoppers, 100 billion at a time in roaring clouds a mile high and a hundred miles across. And iron weather in which children, disoriented by horizontal streams of snow as hard as rock and fine as dust, froze to death groping their way home from a school 150 yards away.
Lena was five in 1882 when her father, a German immigrant, died of smallpox. Her mother remarried twice, having eleven children, eight of whom survived. In August 1887, Lena, her marriage prospects diminished by her smallpox scars, was sent to live with the Woebbeckes and their three children in a two-room house. It was half a mile from the school where she was, five months later, when a cataclysmic cold front came dropping southeast out of Canada at forty-five miles per hour.
“To those standing outside,” Laskin writes, “it looked like the northwest corner of the sky was suddenly filling and bulging and ripping open.” In four and a half hours, the temperature at Helena, Montana, fell fifty degrees. The prairie air tingled with the electricity of a horizontal thunderstorm. All over the region, schoolteachers, many of them not much older or more educated than their pupils, had to make life-and-death decisions about how to get the children home.
“The fear came first,” Laskin writes, “but the cold followed so hard on its heels that it was impossible to tell the difference.” In minutes, nostrils were clogged by ice. Eyelids were torn by repeated attempts to prevent them from freezing shut. Unable to see their hands in front of their faces, people died wandering a few yards from their houses, unable to hear, over the keening wind, pots being pounded a few yards away to tell them the way to safety.
“For years afterward,” writes Laskin, “at gatherings of any size in Dakota or Nebraska, there would always be people walking on wooden legs or holding fingerless hands behind their backs or hiding missing ears under hats—victims of the blizzard.” Lena learned to walk on a wooden foot. In 1901, at twenty-four, she married. At twenty-five, she died, perhaps in childbirth, or perhaps of a complication from the amputation necessitated by frostbite.
“Lena was laid to rest in her wedding dress in the graveyard of the Immanuel Lutheran Church near the country crossroads called Ruby. If there ever was a town called Ruby, it has disappeared, as has the Immanuel Lutheran Church. The church cemetery remains—a fenced patch of rough grass studded with headstones between two farmhouses not far from the interstate. A tiny island of the dead in the sea of Nebraska agriculture.”
This Thanksgiving, when you have rendered yourself torpid by ingesting an excess of America’s agricultural bounties, summon thoughts of thanks for the likes of Lena, those whose hard lives paved the stony road to America’s current comforts.
[NOVEMBER 25, 2004]
The soil is the one indestructible, immutable asset that the nation possesses. It is the one resource that cannot be exhausted.
—FEDERAL BUREAU OF SOILS, 1909
Seventy-five years ago, America’s southern plains were learning otherwise. Today, amid warnings of environmental apocalypse, it is well to recall the real thing. It is a story about the unintended consequences of technological progress and of government policies. Above all, it is an epic of human endurance.
Who knew that when the Turks closed the Dardanelles during World War I, it would contribute to stripping the topsoil off vast portions of Texas, Oklahoma, Colorado, and Kansas? The closing cut Europe off from Russian grain. That increased demand for U.S. wheat. When America entered the conflict, Washington exhorted farmers to produce even more wheat, and guaranteed a price of $2 a bushel, more than double the 1910 price. A wheat bubble was born. It would burst with calamitous consequences recounted in Timothy Egan’s astonishing and moving book, The Worst Hard Time: The Untold Story of Those Who Survived the Great American Dust Bowl.
After the war, the price plunged and farmers, increasingly equipped with tractors, responded by breaking up more prairie, plowing under ever more grassland in desperate attempts to compensate for falling wheat prices with increased volume. That, however, put additional downward pressure on the price, which was forty cents a bushel by 1930.
The late 1920s had been wet years, and people assumed that the climate had changed permanently for the better. In that decade, another 5.2 million acres—equivalent, Egan says, to the size of two Yellowstone Parks—were added to the 20 million acres previously in cultivation. Before the rains stopped, fifty thousand acres a day were being stripped of grasses that held the soil when the winds came sweeping down the plain.
In 1931, the national harvest was 250 million bushels, perhaps the greatest agricultural accomplishment in history. But Egan notes that it was accomplished by removing prairie grass, “a web of perennial species evolved over 20,000 years or more.” Americans were about to see how an inch of topsoil produced over millennia could be blown away in an hour.
On January 21, 1932, a cloud extending ten thousand feet from ground to top—a black blizzard with, Egan writes, “an edge like steel wool”—looked like “a range of mountains on the move” as it grazed Amarillo, Texas, heading toward Oklahoma. At the end of 1931, a survey found that of the 16 million acres cultivated in Oklahoma, 13 million were seriously eroded.
On May 10, 1934, a collection of dust storms moved over the Midwest carrying, Egan says, “three tons of dust for every American alive.” It dumped six thousand tons on Chicago that night. By morning, the storm was eighteen hundred miles wide—“a great rectangle of dust” weighing 350 million tons—and was depositing the surface of the Great Plains on New York City, where commerce stopped in the semidarkness.
On the southern plains, dust particles, one-fifth the size of the period at the end of this sentence and high in silica content, penetrated lungs, jeopardizing newborns and causing “dust pneumonia” in others. Houses were so porous that the only white part of a pillow in the morning was the profile of the sleeper. Storms in March and April 1935 dumped 4.7 tons of dust per acre on western Kansas, denting the tops of cars. During one storm, the wind blew at least forty miles per hour for one hundred hours. Egan reports that it would have required a line of trucks ninety-six miles long, hauling ten loads a day for a year—46 million truckloads—to transport the dirt that had blown from western to eastern Kansas.
In Washington, in a Senate hearing room, a man was testifying to bored legislators about the need for federal aid for the southern plains. A senator suddenly exclaimed, “It’s getting dark outside.” The sun vanished and the air turned copper-color, thanks to red dust that the weather bureau said came from the western end of Oklahoma’s panhandle. The aid was approved the next day.
The southern plains got what Egan calls frenzied skies of grasshoppers—sometimes 14 million per square mile—because the insects’ natural predators were gone. Eventually, however, rain fell on the convulsed land and on the tenacious people who never left it, and the government devised soil conservation measures. The Earth turned out to be more durable, and the people who wrested their livings from the earth more resilient, than had been thought.
[APRIL 29, 2007]
While she was confined to her bed in Atlanta by a broken ankle and arthritis, her husband gave her a stack of blank paper and said, “Write a book.” Did she ever.
The novel’s first title became its last words, “Tomorrow Is Another Day,” and at first she named the protagonist Pansy. But Pansy became Scarlett, and the title of the book published seventy years ago this week became Gone With the Wind.
You might think that John Steinbeck, not Margaret Mitchell, was the emblematic novelist of the 1930s, and that the publishing event in American fiction in that difficult decade was his The Grapes of Wrath. Published in 1939, it captured the Depression experience that many Americans had, and that many more lived in fear of. Steinbeck’s novel became a great movie and by now 14 million copies of the book have been sold.
But although the $3 price of Gone With the Wind ($43.50 in today’s dollars) was steep by Depression standards, it sold 178,000 copies in three weeks and 2 million by April 1938, when it ended a twenty-one-month run on the bestseller list. By now nearly 30 million have been sold. About 250,000 are still purchased in America every year, and 100,000 elsewhere.
In 1935, there had been an early indicator of the American yearning that Mitchell’s novel satisfied. That year saw the publication of the final two volumes of another durable work of Southern sympathy, Douglas Southall Freeman’s Pulitzer Prize–winning four-volume biography of Robert E. Lee. What was afoot?
By the middle of the 1930s, with the Depression entering its second half-decade and showing no sign of succumbing to the New Deal’s attempts to end it, Americans were rightly skeptical about the idea that happy days would soon be here again. Their world having been turned upside down, they saw a parallel between their plight and the story of the disappearance of the antebellum South. Hence their embrace of Mitchell’s epic about a society pulverized to human dust that is blown about by history’s leveling wind.
Parts of the novel reek of magnolia and cloying sentimentalism. But Mitchell writes sarcastic passages about the Lost Cause:
“How could anything but overwhelming victory come to a Cause as just and right as theirs?…Of course, there were empty chairs and babies who would never see their fathers’ faces and unmarked graves by lonely Virginia creeks and in the still mountains of Tennessee, but was that too great a price to pay for such a Cause?”
Scarlett certainly was no sentimentalist. When Rhett Butler, the embodiment of unapologetic realism, asks her if she ever thinks “of anything but money,” she replies with words that struck a chord with a nation that had heard quite enough of the song “Brother, Can You Spare a Dime?”: “No…. I’ve found out that money is the most important thing in the world and, as God is my witness, I don’t ever intend to be without it again.”
In 1936, the Washington Post reviewer called the novel “unsurpassed in the whole of American writing,” which was a bit strong, considering what Hawthorne, Melville, Twain, and Wharton had produced. What could, however, accurately have been said of Gone With the Wind was that it was the most cinematic novel yet written in America. A month after it was published, $50,000 was paid for the rights to turn it into the movie that has grossed (adjusted for ticket-price inflation) a record $3.8 billion worldwide.
Like another Southern woman who wrote a novel about her region, a novel that is still in print nearly half a century later and that became a classic movie (Harper Lee, To Kill a Mockingbird, published in 1960), Mitchell never wrote another. In 1949, at age forty-eight, she was killed by a taxi driven by a drunk in Atlanta, which was already on its way to becoming the symbol of the New South.
Mitchell had been born in 1900, just thirty-five years after Appomattox and twenty-three years after Reconstruction ended. Her sensibilities were not what ours are. The novel has passages that cannot be read without cringing. (“Not trust a darky! Scarlett trusted them far more than most white people…. They still stuck with their white folks and worked much harder than they ever worked in slave times.”) But to read such passages is to be stunned, once again, by the amazing speed with which America has changed for the better. In 1936, in Mitchell’s Atlanta, the pastor of the Ebenezer Baptist Church, Martin Luther King, had a son who was seven.
[JUNE 25, 2006]
DEARBORN, MICHIGAN—A suitable venue for contemplating organized labor’s current disarray is here, at the footbridge over Miller Road. In 1937, it led to the main entrance of the foremost example of America’s manufacturing might—the Ford Motor Company’s River Rouge plant, then the world’s most fully integrated car-manufacturing facility, from blast furnaces to assembly line. Five years later, the plant would exemplify America as the “arsenal of democracy.” It made jeeps, tanks, trucks, and engines for B-24 bombers. But on May 26, 1937, the footbridge to the plant made history.
“The Battle of the Overpass,” a heroic event in American labor history, began when Walter Reuther, president of UAW Local 174, and three colleagues started across the footbridge to distribute leaflets as part of their campaign to unionize the plant. They were savagely beaten by perhaps forty Ford thugs and thrown down the overpass stairs. The thugs confiscated most photographers’ film, but James (Scotty) Kilpatrick of the Detroit News surrendered only blank film. His pictures made Reuther a national figure and aroused American opinion against tactics then used to thwart unionization.
Ford came to terms with the UAW in May 1941, seven months before American industry was conscripted into war. Even though women flooded into factories (Rose Will Monroe, who as “Rosie the Riveter” symbolized this social transformation, worked in Ford’s Willow Run plant), labor was scarce and wage controls limited companies’ means of competing for workers. So companies offered medical and pension benefits as untaxed compensation not covered by wage controls. Today whole industries are buckling beneath the weight of these “legacy costs.”
By 1955, 33 percent of the nation’s nonfarm workforce was unionized. But few government employees were. As Steven Malanga of the Manhattan Institute writes in his book The New New Left: How American Politics Works Today, the assumption was that because there is no competition in the delivery of government services, strikes could cripple cities. That was then.
Now, Bentonville, Arkansas (population: 28,000, up 40 percent in four years), has supplanted Detroit as the home of the nation’s largest company. Intense and protracted efforts by organized labor have failed to unionize a single one of Wal-Mart’s 3,190 North American stores. General Motors, the largest corporation in 1955, is unionized. Two credit-rating agencies have reduced its debt to junk-bond status, and one has done the same to Ford’s, as both companies struggle to finance medical and pension benefits for current and—even more numerous—retired workers. Such costs also are major reasons for the parlous condition of all the older airlines.
In 1955, when Japan was still struggling to recover from the damage done by B-24s with their Ford-built engines, the American automobile industry was riding high and the UAW was riding along. With negligible foreign competition—their market share was 95 percent—American car companies could pass along to consumers the costs of labor contracts. Today, while domestic carmakers are planning to shed jobs by the thousands, employment is surging in the nonunionized plants, most in the South, where foreign automakers build one-quarter of all the cars and trucks made in America.
In the 1930s, American workers were literally fighting to get into unions. Today, unions are fighting with themselves about the appropriate tactics to adopt in response to the fact that just 7.9 percent of private-sector workers are union members. But at the apogee of the American labor movement in the 1950s, the transformation of the movement began.
In 1958, the American Federation of State, County and Municipal Employees won collective-bargaining rights from New York mayor Robert Wagner—son of Senator Robert Wagner, author of the Wagner Act, aka the National Labor Relations Act of 1935, the most important federal action enabling private-sector unionization. AFSCME’s members rose from one hundred thousand in 1955 to 1 million in 1985. It has 1.4 million members today and is just one of many government employees’ unions.
The River Rouge complex, which is still humming, is a National Historic Landmark, and “the Battle of the Overpass” and the UAW’s successes are commemorated at a plaza—built by Ford—on Miller Road, by the footbridge. In the auto industry, where labor and management used to fight over the allocation of abundance, the coming showdown will be over the reduction of benefits won in palmier days.
Soon, perhaps, a majority of organized labor will be government employees. The labor movement will be primarily government organized as an interest group to lobby and pressure itself. Already New York City, which has about the same size population it had forty years ago, has 30 percent more city employees. Antonio Villaraigosa, the new mayor of Los Angeles, is a former organizer for that city’s teachers union. The footbridge over Miller Road led to this. The heroic era of organized labor is long gone.
[JULY 18, 2005]
Some mornings during the autumn of 1933, when the unemployment rate was 22 percent, the president, before getting into his wheelchair, sat in bed, surrounded by economic advisers, setting the price of gold. One morning he said he might raise it twenty-one cents: “It’s a lucky number because it’s three times seven.” His treasury secretary wrote that if anybody knew how gold was priced “they would be frightened.”
The Depression’s persistence, partly a result of such policy flippancy, was frightening. In 1937, during the depression within the Depression, there occurred the steepest drop in industrial production ever recorded. By January 1938, the unemployment rate was back up to 17.4 percent. The war, not the New Deal, defeated the Depression. Franklin Roosevelt’s success was in altering the practice of American politics.
This transformation was actually assisted by the misguided policies—including government-created uncertainties that paralyzed investors—that prolonged the Depression. This seemed to validate the notion that the crisis was permanent, so government must be forever hyperactive.
In his second inaugural address, Roosevelt sought “unimagined power” to enforce the “proper subordination” of private power to public power. He got it, and the fact that the federal government he created now seems utterly unexceptional suggests a need for what Amity Shlaes does in a new book. She takes thorough exception to the government he created.
Republicans had long practiced limited interest-group politics on behalf of business with tariffs, gifts of land to railroads, and other corporate welfare. Roosevelt, however, made interest-group politics systematic and routine. New Deal policies were calculated to create many constituencies—labor, retirees, farmers, union members—to be dependent on government.
Before the 1930s, the adjective “liberal” denoted policies of individualism and individual rights; since Roosevelt, it has primarily pertained to the politics of group interests. So writes Shlaes, a columnist for Bloomberg News, in The Forgotten Man: A New History of the Great Depression. She says Roosevelt’s wager was that by furiously using legislation and regulations to multiply federally favored groups, and by rhetorically pitting those favored by government against the unfavored, he could create a permanent majority coalition.
In the process, says Shlaes, Roosevelt refined his definition of the “forgotten man.” This man had been thought of as a general personality, compatible with the assumption that Americans were all in it together. “Now, by defining his forgotten man as the specific groups he would help, the president was in effect forgetting the rest—creating a new forgotten man. The country was splitting into those who were Roosevelt’s favorites and everyone else.”
Acting with what Shlaes calls “the restlessness of the invalid,” Roosevelt implemented the theory that (in her words) “spending promoted growth, if government was big enough to spend enough.” In only twelve months, just one Roosevelt improvisation, the National Recovery Administration, “generated more paper than the entire legislative output of the federal government since 1789.”
Before Roosevelt, the federal government was unimpressive relative to the private sector. Under Calvin Coolidge, the last pre-Depression president, its revenues averaged 4 percent of GDP, compared to 18.6 percent today. In 1910, Congress legislated height limits for Washington buildings, limits that prevented skyscrapers, symbols of mighty business, from overshadowing the Capitol, the symbol of government.
In 1936, for the first time in peacetime history, federal spending exceeded that of the states and localities combined. Roosevelt said modern “civilization” has tended “to make life insecure.” Hence Social Security, which had the added purpose of encouraging workers to retire, thereby opening jobs to younger people. Notice the assumptions of permanent scarcity, and that the government has a duty to distribute scarce things, such as work.
In 1937, when the New Deal’s failure to spark recovery made Roosevelt increasingly frantic, he attempted to enlarge the Supreme Court so he could pack it with compliant justices. He said Americans had the right to “insist that every agency of popular government” respond to “their will.” He included the court among “popular,” meaning political and representative, institutions.
Roosevelt’s overreaching called forth an opponent whom Shlaes rescues from obscurity. Wendell Willkie, who would be Roosevelt’s opponent in a 1940 election overshadowed by war, called upon Roosevelt to “give up this vested interest you have in depression” as the justification for a “philosophy of distributed scarcity.”
War, as has been said—and as George W. Bush’s assertion of vast presidential powers attests—is the health of the state. But as Roosevelt demonstrated and Shlaes reminds us, compassion, understood as making the “insecure” securely dependent, also makes the state flourish.
[JULY 8, 2007]
Thirty days hath September,
April, June and November.
All the rest have thirty-one,
Until we hear from Washington.
The country heard from Washington—the man, not the place—when he issued a National Thanksgiving Proclamation for November 26, 1789. The new nation had much for which to be thankful, including the fact that it would be 150 years before Thanksgiving was officially made into a handmaiden of commerce and turned into the starting gun for the sprint of Christmas shopping.
By now the sprint is a marathon that seems to begin around Labor Day. Soon there will be after-Christmas sales before Halloween, such is the relentless expansion of what is called, with telling vagueness, “the holiday season.”
The country heard from Washington—the place; the mentality—in 1939 when President Franklin Roosevelt threw Thanksgiving into the battle to get happy days here again. FDR’s governmental hyper-kinesia had failed to banish the Depression. Unemployment was still 17.2 percent and the ultimate cure for the Depression—Admiral Yamamoto’s fleet approaching Hawaii—was still twenty-four months over the horizon.
Even the calendar was conspiring against prosperity because in 1939, as in FDR’s first year in office, 1933, November had five Thursdays, and Thanksgiving was to fall on the thirtieth. So FDR moved Thanksgiving from the last Thursday in November to the fourth.
Although President Washington was a Virginian, the idea of a national Thanksgiving Day had seemed somehow New Englandish, tainted by Yankee sanctimony and, worse still, Federalist notions of national supremacy over states’ prerogatives. Even President John Quincy Adams of Massachusetts thought a national Thanksgiving observance might be “introducing New England manners” where they were unwelcome.
President Lincoln, a great affirmer of the national facets of the nation’s life, was the first to set the last Thursday in November as the national day of Thanksgiving. Bill Kauffman, who has explained all this (in “New Deal Turkey,” the American Enterprise magazine, December 2000), says Lincoln’s successor, President Andrew Johnson, who had quite enough fights on his hands without picking another one—he was the first president impeached—shoved Thanksgiving into December. President U. S. Grant, who rarely retreated but knew how to, put it back to the last Thursday in November.
But Appomattox notwithstanding, states remained free to do as they liked about Thanksgiving, and Southern states liked to observe it when they chose. Or not at all, as in Texas during the governorship of Oran Milo Roberts, who said, “It’s a damned Yankee institution anyway.”
But in 1939, many of the nation’s larger merchants—the National Retail Dry Goods Association, the presidents of Gimbel Brothers and Lord & Taylor—asked FDR for relief from the fact that Thanksgiving would arrive so late—November 30—that it would injure the economy by delaying the start of Christmas shopping.
However, the class struggle erupted, pitting smaller merchants against the larger merchants. The proprietor of Arnold’s Men’s Shop in Brooklyn wrote to urge FDR to allow the later Thanksgiving: “If the large department stores are overcrowded during the shorter shopping period before Christmas, the overflow will come, naturally, to the neighborhood store…. We have waited many years for a late Thanksgiving to give us an advantage over the large stores.”
FDR felt the pain of the large merchants. But some people felt pained by FDR’s tampering with Thanksgiving, including Oregon’s attorney general, author of the doggerel printed above. A West Virginian wrote FDR to say, while you are at it, please declare it “strictly against the Will of God to work on Tuesday” and “have Sunday changed to Wednesday.” A South Dakota real estate man admonished FDR to “remember we are not running a Russia or communistic government,” and he added: “Between your ideas of running for a third term, and your changing dates of century-old holidays, we believe you have practically lost your popularity and the good will of the people of the Northwest.” FDR lost South Dakota in 1940.
But in 1939, twenty-three states followed FDR’s lead and celebrated Thanksgiving on November 23. Twenty-three stayed with November 30. Colorado and Texas celebrated on both days, Texas doing so to avoid having to reschedule—speaking of things to give thanks for—the Texas–Texas A&M football game.
FDR, who enjoyed fiddling with things, promised in 1941 to return Thanksgiving to the last Thursday in November. But history has its hold on us and Congress shoved it back to the fourth Thursday, partly because many constituents believed the Pilgrims had put it there in the first place.
[NOVEMBER 27, 2003]
Imagine how tiresome it would be to have, at Christmas, a houseguest of whom your spouse disapproves and whom you have met only twice before, the first time twenty-three years ago (annoyingly, your guest does not remember the meeting), the second time four months ago, for a few hours, out of town, on business. Imagine that the houseguest invites himself to your home, stays almost three weeks, and one morning early in his stay summons your butler (you don’t have one? pity) and issues the following ukase:
“Now, Fields, we had a lovely dinner last night but I have a few orders for you. We want to leave here as friends, right? So I need you to listen. One, I don’t like talking outside my quarters; two, I hate whistling in the corridors; and three, I must have a tumbler of sherry in my room before breakfast, a couple of glasses of scotch and soda before lunch and French champagne and 90-year-old brandy before I go to sleep at night.”
Furthermore, this Guest from Hell declares that for breakfast he requires hot “eggs, bacon or ham and toast” and “two kinds of cold meats with English mustard and two kinds of fruit plus a tumbler of sherry.” You would be forgiven for asking your guest if he had been born in a palace.
He who so firmly addressed President Franklin Roosevelt’s butler Alonzo Fields sixty-four Christmases ago was, in fact, born in Blenheim Palace, England’s gift to the first Duke of Marlborough. And if no whistling and lots of sherry and whiskey would help the duke’s great-great-great-great-great-great-grandson, Winston Churchill, function, stop whistling and pour liberally. There is a war to win.
The story of this December 1941 visit is told by two Canadians, David Bercuson and Holger Herwig, in an entertaining book with an idiotic subtitle, One Christmas in Washington: The Secret Meeting Between Roosevelt and Churchill That Changed the World. Secret meeting? It was about as secret as a circus, featuring a press conference with FDR and a speech to a joint session of Congress in which Churchill said: “I cannot help reflecting that if my father had been American and my mother British, instead of the other way round, I might have got here on my own.” But the meeting did change the world by constructing the machinery of cooperation that led to the defeat of the Axis.
How ancient it now seems, 1941. The city of Washington had fifteen thousand outdoor privies. German U-boats sank 432 ships in the Atlantic. In August, FDR could deceive everyone, including the Secret Service, for a really secret meeting with Churchill—their only previous meeting had been at a London dinner in 1918—at Placentia Bay, Newfoundland. In the days after Pearl Harbor, some of the antiaircraft guns on the White House were wooden fakes—real ones were scarce. On his voyage, sometimes through forty-foot waves, to his Christmas visit with FDR, Churchill watched American movies, including Santa Fe Trail, starring Errol Flynn, Olivia de Havilland, and Ronald Reagan.
FDR greeted Churchill in Washington in a black limousine the Treasury Department had confiscated from a tax evader named Al Capone. Churchill met here with Admiral Ernest King, commander in chief of the U.S. fleet, who had served in the Spanish-American War, and with General Henry “Hap” Arnold, the head of the Army Air Forces, who in 1911 received flight training in Dayton, Ohio, from the Wright brothers.
What could have been the most important event of Churchill’s almost three weeks in America was not known until his doctor published his memoirs in 1966: Churchill suffered a heart attack while straining to open a stuck window in his White House bedroom. Had it been fatal, that could have changed the world.
Eleanor Roosevelt disapproved of Churchill the imperialist, but on Christmas Day 1941 she, he, and the president attended Washington’s Foundry Methodist Church, the second iteration of a church founded by Henry Foxall, who during the War of 1812 vowed that he would build a church as a thanksgiving offering if the British did not destroy his cannon foundry when they took Washington and burned the White House.
Christmas Day was the birthday of General Sir John Dill, chief of the Imperial General Staff, so a cake was found and adorned with a set of American and British flags which, Dill discovered when he removed them, were made in Japan. This occasioned laughter, at a time when that, like much else, was scarce.
[DECEMBER 25, 2005]
PEARL HARBOR, HAWAII—Sixty-one years later, it is still not over.
One by one, oil drops still seep from the submerged hull of the USS Arizona. And one by one, some men who survived the bomb that hit the battleship’s forward magazine still return, the urns containing their ashes lowered through the water into the hole of a gun turret, because as one of them said, “Ever since December 7, 1941, I’ve been living on borrowed time. My place is with my shipmates.”
Here, about as far as you can get in America from the scenes of last year’s attacks, you see this difference: In 1941, a mighty empire—an enemy with a serious if reckless geopolitical strategy—struck at real sinews of U.S. power. In 2001, a delusional, premodern enemy lashed out at American symbols—iconic buildings—and instantly magnified American power by dispelling an American mood. Call it end-of-history complacency.
The attack that came here sixty-one years ago from across the broadest ocean erased forever the belief that geography—wide oceans, placid neighbors—confers permanent security on America. The attacks last year erased the comparably soothing belief that the logic of military technology (deterrence) and the march of modernity (the retreat of primitivism) had written an end to history, meaning the immunity of great powers to attacks.
Sixty-one Decembers ago, as last year, America suffered from intelligence failures. But in both instances, for the officials charged with protecting the nation’s security, the attacks came not as bolts from the blue but as bolts from what they knew to be ominously darkening skies.
Both times American officials knew enough to know that the international atmosphere was charged with hidden menace. Which is why the first shots fired on December 7, 1941, were fired by America’s wary military: At 6:45 a.m., the destroyer USS Ward attacked one of the two-man midget submarines—its wreckage was found two weeks ago—lurking at the mouth of the harbor, poised to participate in the attack that was still seventy minutes over the horizon.
The attacks of December 7, 1941, like the attacks of September 11, 2001, were a curious mixture of virtuosity and primitivism. Al Qaeda skillfully used nineteen suicidal fanatics wielding box cutters and commercial airliners to attack a continent. It was deadly, and absurd.
Japan’s military achievement—moving thirty-two warships four thousand miles undetected; designing shallow-running torpedoes; brilliantly coordinating the bomb and torpedo attacks—was military sophistication of the highest order. Yet the pilots skimming thirty feet over the water used carpenters’ levels in their cockpits to make sure their planes were properly positioned to insert their torpedoes into the water.
The Americans who died here—on the Arizona, twenty-three sets of brothers, and a father and his son—were mostly military men. Those whom the terrorists targeted last September were mostly civilians.
For all Americans, being a focus of furies—which a muscular nation, extending almost five thousand miles from the cavity in southern Manhattan to the Arizona’s hull, will be—is a dangerous destiny. It is a destiny that, in a sense, was just dawning 104 years ago when the USS Baltimore sailed from here.
On March 25, 1898, that cruiser left to join Commodore Dewey’s fleet in Hong Kong. It entered Manila Bay with the fleet on May 1 and participated in the destruction of the Spanish fleet. The Spanish-American War established the United States as a global power, its power projected then entirely by its Navy. In 1941, an important portion of the Navy was based here because—westward the course of empire takes its way—the United States had annexed these islands in that eventful year of 1898.
On a December Sunday forty-three years later, the Baltimore, which had been decommissioned in 1922, was a ghost ship moored at the end of battleship row, where it escaped damage. But in a sense its career was still not over. In 1942, it was turned into scrap. No doubt some of it was sent back to war in bits and pieces.
Half a million gallons of fuel oil remain in the Arizona’s hull. With leakage of a quart a day, no one now living will be alive when the surface sheen from the last drop drifts away. And no one now living will live to see a day when Americans forget the lesson now associated with September 11 as well as December 7: A powerful nation embodying a powerful idea and spanning six time zones is permanently exposed to dangers from all the other eighteen zones.
[SEPTEMBER 8, 2002]
Every night my honey lamb and I
sit alone and talk
and watch a hawk
makin’ lazy circles in the sky
—“Oklahoma!”
NEW YORK—“Honey lamb”? That is as corny as Kansas in August. Perhaps such lyrics please people in Manhattan, Kansas, where the waving wheat can sure smell sweet when the wind comes right behind the rain. But surely the sophisticates in this Manhattan prefer to tap their patent leather dancing pumps to the urbane lyrics of Ira Gershwin.
Surely not. The show-business collaboration of Richard Rodgers and Oscar Hammerstein long ago ended, but their melodies and lyrics linger on. In fact, rather more than just linger. They reverberate. Advance ticket sales topped $12 million by the time the revival of Oklahoma! opened two weeks ago—in the Gershwin Theater, so there.
Whoever said that America’s imperishable gifts to the world are the Constitution, baseball, and jazz should have included a fourth: a distinctive kind of stage musical, the greatest of which is Oklahoma! It took that top ranking away from Show Boat (1927) fifty-nine years ago and has kept it ever since. Oklahoma! was the first “integrated” musical, meaning the singing and dancing arose organically from the action, advancing rather than interrupting the narrative. Audiences were ready for this maturation of the musical.
When Oklahoma! first came to Broadway, the opening—it was a sleety March night—was not sold out. Soon tickets were scarce. However, servicemen in uniform were admitted to standing room without charge. It was 1943, a terrible year.
But the first words of the first song were—are—“There’s a bright, golden haze on the meadow.” The song, “Oh, What a Beautiful Mornin’!” was an anthem of an American optimism that not even world war could dent. It is serendipitous that this revival, which played for four years in London, arrives with the nation again at war.
And, as usual, the nation is fretting, as Oklahoma Territory fretted, about keeping the peace between disparate and sometimes rivalrous factions, such as “The Farmer and the Cowman” of the rousing number that opens the second act. But cultural events are always filtered through the mental lenses of the moment, so Ben Brantley of the New York Times writes that this revival of Oklahoma! is saturated with…well, he says:
It suggests “the West was won on the strength of sexual hormones.” It finds “rushing erotic currents in the frontier spirit.” It is “dewy with an adolescent lustiness” and a “darker sexual element.” There is “a glistening sense of young people eagerly groping their way through an unfamiliar landscape,” and a world parallel to “the virgin land” of Oklahoma Territory is the “shadowy realm of sexual initiation.” And the choreography—as in “The Farmer and the Cowman,” which has “wild, procreative energy”—vibrates “with the sense of sensual restlessness in search of an outlet.”
Whoa. Perhaps Brantley should take a long walk or a cold shower, or the latter after the former. Yes, boys and girls in Oklahoma Territory had a keen interest in girls and boys. Otherwise there would not have been so many subsequent Oklahomans. But Oklahoma! is about which boy—the sunny cowboy Curly or the glowering hired hand Jud Fry—should take the girl, Laurey, to the box social, for Pete’s sake. How steamy can that be?
Some critics whose admiration for this revival of Oklahoma! is as high as an elephant’s eye nevertheless emphasize its “dark” side. But this is not new. Curly has been fighting Jud Fry since March 31, 1943, when Fry first stabbed himself to death with his own knife while fighting with Curly. And for fifty-nine years, before every final curtain, Curly has been quickly acquitted of manslaughter.
Is this “dark”? Ethan Mordden, a historian of Broadway musicals, notes that Oklahoma! is especially American in presenting “the unpleasant truth that evil will keep coming at you until you kill it. One piece of democracy is the harmonizing of discordant agendas. But another piece is the expunging of the wicked.” The revival of Oklahoma! is timely, Mordden says, “because it defines Americans as a people, generous but plainspoken and tough on spoilers.”
The first Broadway run of Oklahoma! lasted five years and nine months, a record not broken until My Fair Lady ran from 1956 to 1962. But in a sense Oklahoma! has never closed since 1943. In a normal year, there are about six hundred new North American productions of it. It is part of the permanent music of the American people, who know that the land they belong to is grand.
[APRIL 7, 2002]
OMAHA BEACH, NORMANDY—On a bluff above the sand, half a mile from the ocean’s edge at low tide, which was the condition when the first Allied soldiers left their landing craft, a circle of concrete five feet in diameter provides a collar for a hole in the ground. On the morning of June 6, 1944, the hole was Widerstandsnest (nest of resistance) 62, a German machine-gun emplacement.
Hein Severloh had been in it since shortly after midnight, by which time U.S. aircraft were droning overhead, having dropped young American paratroopers Severloh’s age behind the beaches in order to disrupt German attempts to rush in reinforcements. Severloh had been billeted near Bayeux, home of the eleventh-century tapestry depicting a cross-channel invasion that went the other way, taking William, Duke of Normandy, to become William the Conqueror, England’s sovereign.
Severloh believed he killed hundreds of GIs, so long and slow was their walk to the safety, such as it was, of the five-foot embankment where the beach meets the bluff. Severloh returned here in sorrow and was consoled by survivors of the forces that waded ashore.
Today, in an America understandably weary of a war of choice that has been defined by execrable choices, a frequently seen bumper sticker proclaims: “War is not the answer.” But here, especially, it is well to remember that whether war is the answer depends on the question.
War was the answer to what ailed Europe in 1944. “In 1942,” writes Timothy Garton Ash of Oxford and Stanford’s Hoover Institution, “there were only four perilously free countries in Europe: Britain, Switzerland, Sweden and Ireland.” Twenty years—a historical blink—later, almost all of Western Europe was free. Twenty years after that, Spain, Portugal, and Greece had joined the liberal democracies. Today, for the first time in twenty-five hundred years, most Europeans live under such governments.
Garton Ash argues that Europe cannot define itself negatively—as not America or not Islam. “Europe’s only defining ‘other’ is its own previous self”—its self-destructive, sometimes barbaric past. “This is,” Garton Ash says, “still a very recent past.”
In 1951, just seven years after Severloh and some other Germans surrendered on June 7 to Americans at the village of Saint-Laurent, Europe began building the institutions that, it hoped, would keep such young men out of machine gun emplacements. It created the European Coal and Steel Community, precursor of the Common Market (1958), which led to the single market in 1993 and the common currency in 2002.
The implicit hope was that commerce could tame Europe’s turbulent nations. The perennial problem of politics—mankind’s susceptibility to storms of passions—could perhaps be solved, or at least substantially ameliorated, by getting Europe’s peoples to sublimate their energies in economic activities. The quest for improved material well-being would drain away energies hitherto tapped and channeled by demagogues.
Reminders of Europe’s problematic past were recently found a few miles from Saint-Laurent. Workers preparing a foundation for a new house overlooking Omaha Beach came upon parts of the bodies of two German soldiers. There was scant media attention to this because such discoveries have not been rare.
Also near here, 21,160 German soldiers are buried at the La Cambe Cemetery. Thirty percent—more than 6,000—were never identified, so some German parents conducted “assumed burials.” They placed metal markers bearing the names of their missing sons near the graves of unknown soldiers who were known to have died near where the parents’ sons were last known to be fighting.
Such heartbreaking stories are written into Normandy’s lovely landscape. At the American Cemetery overlooking this beach, amid the many rows of white marble gravestones, are two, side by side, marking the burial places of Ollie Reed and Ollie Reed Jr., a father and his son. The son, an Army first lieutenant, died in Italy on July 6. His father, an Army colonel, was killed July 30 in Normandy. Two telegrams notified the father’s wife, the son’s mother. The telegrams arrived in Manhattan, Kansas, forty-five minutes apart.
The nineteenth-century French scholar Ernest Renan, from a Brittany town on the English Channel, defined a nation as a community of shared memory—and shared forgetting. Europe’s emotional equipoise, and the transformation of “Europe” from a geographical to a political expression, has required both remembering and forgetting. Americans who make pilgrimages to this haunting place are reminded of their role, and their stake, in that transformation.
[SEPTEMBER 2, 2007]
“Don’t cheer, boys. The poor devils are dying.”
—CAPTAIN JOHN PHILIP of the USS Texas to his crew as they watched the Spanish ship Vizcaya burn off Santiago Bay, Cuba, in 1898
On March 9, 1945, 346 B-29s left the Marianas, bound for Tokyo, where they dropped 1,858 tons of incendiaries that destroyed one-sixth of Japan’s capital, killing eighty-three thousand. General Curtis LeMay, then commander of the air assault on Japan, later wrote, “We scorched and boiled and baked to death more people in Tokyo…than went up in vapor at Hiroshima and Nagasaki combined.”
That was inaccurate—eighty thousand died at Hiroshima alone. And in his new biography of LeMay, Barrett Tillman writes that the general was more empathetic than his rhetoric suggested: “He could envision a three-year-old girl screaming for her mother in a burning house.” But LeMay was a warrior “whose government gave him a task that required killing large numbers of enemy civilians so the war could be won.”
It has been hotly debated how much indiscriminate killing of civilians in the Asian and European theaters really was “required” and therefore was morally permissible. Even during the war there was empathy for civilian victims, at least European victims. And less than fifteen years after the war, movies (e.g., The Young Lions, 1958) offered sympathetic portrayals of common German soldiers swept into combat by the cyclone of a war launched by a tyrant.
But attitudes about the Japanese soldier were especially harsh during the war and have been less softened by time than have attitudes about the German soldier. During the war, it was acceptable for a billboard—signed by Admiral William F. “Bull” Halsey—at a U.S. Navy base in the South Pacific to exhort kill japs, kill japs, kill more japs. Killing America’s enemies was Halsey’s trade. His rhetoric, however, was symptomatic of the special ferocity, rooted in race, of the war against Japan: “We are drowning and burning them all over the Pacific, and it is just as much pleasure to burn them as to drown them.” Halsey endorsed the Chinese proverb that the “Jap race” was the result of “a mating between female apes and the worst Chinese criminals.”
Wartime signs in West Coast restaurants announced: this restaurant poisons both rats and japs. In 1943, the Navy’s representative on the committee considering what should be done with a defeated Japan recommended genocide—“the almost total elimination of the Japanese as a race.”
Stephen Hunter, movie critic for the Washington Post, says that of the more than six hundred English-language movies made about World War II since 1940, only four—most notably The Bridge on the River Kwai (1957)—“have even acknowledged the humanity” of Japanese soldiers.
Perhaps empathy for the plight of the common enemy conscript is a postwar luxury; it certainly is a civilized achievement, an achievement of moral imagination that often needs the assistance of art. That is why it is notable that Clint Eastwood’s Letters from Iwo Jima was one of five films nominated for Best Picture.
It is stressful viewing. An unsparing attempt to come as close as cinema can to conveying the reality of combat, specifically the fighting that killed 6,821 Americans and all but 1,083 of the 22,000 Japanese soldiers on the small (eight square miles) black lava island. Remember the searing first fifteen minutes of Saving Private Ryan—the carnage at Omaha Beach? In Letters from Iwo Jima, it is exceeded, with harrowing permutations.
The Japanese commander on the island, Tadamichi Kuribayashi, was—like the admiral who attacked Pearl Harbor, Isoroku Yamamoto—a cosmopolitan warrior who had lived in, and never stopped admiring, America. In 2005, a team of Japanese archaeologists scouring the island’s man-made caves for artifacts of the battle found a sack of undelivered mail from Kuribayashi and other officers and soldiers. All the writers knew they faced overwhelming force—Japan had no assistance to send—and were doomed to die in accordance with the Japanese military code that forbade surrender and encouraged suicide.
Japanese forces frequently committed barbarities worse even than those of the German regular army, and it is difficult to gauge the culpability of conscripts commanded by barbarians. Be that as it may, the pathos of the letters humanizes the Japanese soldiers, whose fatalism was a reasonable response to the irrational. Viewers of this movie, while moved to pride and gratitude by the valor of the U.S. Marines, will not feel inclined to cheer. We are catching up to Captain Philip’s sensibility.
[FEBRUARY 25, 2007]
The Supreme Court’s decision fifty years ago, although an immense blessing to the nation, also carries a melancholy lesson. It is that great events—the school desegregation ruling was the largest judicial event since the Dred Scott case of 1857—have myriad reverberations, some beneficial, others not.
Brown v. Board of Education accelerated the process of bringing this creedal nation into closer conformity to its creed. But the decision also encouraged the abandonment of constitutional reasoning—of constitutional law. It invested the judiciary with a prestige that begot arrogance. And it seemed to legitimize a legislative mentality among judges wielding an anticonstitutional premise. The premise is that “unjust” and “unconstitutional” are synonyms.
The board of education being sued for its segregation policies was not in the South, but in Kansas—Topeka. Segregation was widely practiced, and even more widely approved. Yes, in Montgomery, Alabama, it was illegal for a white to play checkers in public with a black. But Congress was running a segregated school system in the nation’s capital. In 1948, President Harry Truman could not persuade Congress to make lynching a federal crime.
When the case was first argued in 1952, the Supreme Court was composed entirely of Democratic—of Roosevelt and Truman—appointees. And if the court’s composition had not been soon and unexpectedly changed by the addition of a Republican nominee, the legal basis of segregation—the doctrine that “separate but equal” public facilities are constitutional—probably would have been affirmed.
No Republican nominee had served on the court since Owen Roberts, a Hoover nominee, resigned in 1945. But in 1953, eight months into Dwight Eisenhower’s presidency, there occurred the most fateful heart attack in American history. It killed Chief Justice Fred Vinson, a Kentuckian who believed the “separate but equal” doctrine, enunciated in an 1896 decision, should remain.
Four other justices were, to varying degrees, inclined to agree. Cass Sunstein of the University of Chicago Law School, writing in The New Yorker, notes that the waspish Justice Felix Frankfurter said that Vinson’s heart attack was “the first indication that I have had that there is a God.” But Frankfurter and another liberal-leaning justice, Robert Jackson, were FDR appointees who had learned the virtues of judicial modesty by watching the judicial hubris of the court as it struck down many of FDR’s early New Deal measures.
Vinson’s death preceded a rehearing of the case. His replacement, Earl Warren, former governor of California, was a post–New Deal politician. He was comfortable with the premise that the federal government’s responsibilities extend to the general amelioration of citizens’ conditions. A man of immense charm in the court’s face-to-face politics, he also was impatient with the idea that justices must go only where led by judicial reasoning about the Constitution’s text as it has been illuminated by precedents based thereon.
Some Northern states had segregated schools when they ratified the Fourteenth Amendment. It includes the guarantee of “equal protection of the laws” that in 1954 the court decided was incompatible with segregated schools. To reach this conclusion, the court cited social science evidence that segregation induced feelings inimical to young children’s self-esteem, thereby injuring their capacity to learn.
That this rationale was window dressing became clear when the court invoked the Brown decision to outlaw segregated beaches, golf courses, etc. The court would have done better with this simple argument:
The “separate but equal” doctrine came from a correct understanding that equality for blacks was the intent of the Fourteenth Amendment. But the court in 1896 erred because, when separation is enforced on racial lines, “separate but equal” is inherently oxymoronic.
When the Brown ruling was rendered, Thurgood Marshall, the NAACP’s lead litigator, expected segregation to be gone in five years. But ten years later, only 1.17 percent of Southern black schoolchildren attended public schools with whites.
In 1954, the court’s majesty could not compel compliance. Today, the court’s reserves of prestige are immeasurably greater, partly because of what it did then. What also is much enlarged is the public’s belief that judicial fiats can and should remedy many social ills, broadly defined to include the refusal of legislatures to adopt policies deemed just.
“John Marshall has made his decision, now let him enforce it.” That supposedly was President Andrew Jackson’s response to a Supreme Court decision he disliked. Then, as now, the court’s power flowed largely from its prestige, which was not sufficient to bend Old Hickory. No president could act similarly today. This progress owes much to what happened on May 17, 1954.
[MAY 16, 2004]
On Tuesday, July 11, the United States will become more geographically stable than it has ever been. It will have been 17,126 days since the admission of Hawaii to statehood on August 21, 1959. The longest previous span between expansions of the nation was the 17,125 days between the admission of Arizona on February 14, 1912, and the admission of Alaska on January 3, 1959. Since then the nation has become, in a sense, smaller through the annihilation of distance and, to some extent, of difference.
An important part of the groundwork—literally, it covered a lot of ground—for today’s America was begun fifty years ago this summer. A conservative Republican president, who grew up in a Kansas town where hitching posts for horses lined unpaved streets, launched what was, and remains, the largest public works project in the nation’s history—the Interstate Highway System. Its ribbons of concrete represent a single thread of continuity through the nation’s history.
With that program, Dwight Eisenhower, the thirteenth Republican president, helped heal the wounds of the war won by another general, U. S. Grant, the second Republican president. That war was related to “internal improvements,” as infrastructure projects such as roads and canals used to be called.
In 1816, South Carolina representative John Calhoun—then a nationalist; later, a secessionist—introduced legislation for a federal program of internal improvements. The legislation passed, but President James Madison vetoed it because he thought Congress was not constitutionally empowered to do such things. So, prosperous Northern states built their own improvements while the South sank into inferiority and increasing dependence on slavery.
The military handicap of an inferior transportation system was one reason the South lost the Civil War. Another reason was the industrialization of the North. Its transportation system (the Erie Canal, railroads) cut the price of shipping a ton of wheat from Buffalo to New York City from $100 to $10, and the difference between the wholesale price of pork in Cincinnati and New York plunged from $9.53 to $1.18. Suddenly, workers flooding into the North’s cities had more disposable income to spend on the North’s manufactured goods.
The first Republican president began his public life as a twenty-three-year-old candidate for the Illinois General Assembly by telling voters of Sangamon County his “sentiments with regard to local affairs,” the first sentiment being “the public utility of internal improvements.” The vigor of the union also was a preoccupation of Teddy Roosevelt, the eighth Republican president, whose great internal improvement, the Panama Canal, was external, although he thought of Panama as America’s private property. And Eisenhower’s message to Congress advocating the IHS began, “Our unity as a nation is sustained by free communication of thought and by easy transportation of people and goods.”
No legislator more ardently supported the IHS than the Tennessee Democrat who was chairman of the Senate Public Works subcommittee on roads. His state had benefited handsomely from the greatest federal public works project of the prewar period, the Tennessee Valley Authority, which, by bringing electrification to a large swath of the South, accelerated the closing of the regional development gap that had stubbornly persisted since the Civil War. This senator who did so much to put postwar America on roads suitable to bigger, more powerful cars was Al Gore Sr. His son may consider this marriage of concrete and the internal combustion engine sinful, but Tennessee’s per capita income, which was just 70 percent of the national average in 1956, today is 90 percent.
The IHS—combined, as Fortune magazine’s Justin Fox writes, with another bright idea from 1956, the shipping container—made America’s distribution system more flexible. This benefited manufacturers, foreign and domestic, especially in America’s hitherto lagging region, the South. This is one reason there is a thriving Southern-based automobile industry (BMW in South Carolina; Mercedes in Alabama; Honda in both Carolinas, Georgia, and Alabama; Toyota in Tennessee, Alabama, and Kentucky). Furthermore, the South is home to some of today’s “big-box” retailers—Wal-Mart (Bentonville, Arkansas), Home Depot (Atlanta)—as well as FedEx (Memphis).
American scolds blame the IHS and the automobile for everything from obesity (fried food at every interchange) to desperate housewives (isolated in distant suburbs without sidewalks). Nikita Khrushchev, during his 1959 visit to America, told Eisenhower, “Your people do not seem to like the place where they live and always want to be on the move going someplace else.” Eisenhower knew that wherever people are going on their nation’s roads, they are going where they live.
[JULY 9, 2006]
Leaving no talent untapped in its quest for perfection, the Ford Motor Company asked Marianne Moore, one of America’s foremost poets in the 1950s, to suggest a name for the product it would debut in late summer, fifty years ago. She replied: “May I submit Utopian Turtletop? Do not trouble to answer unless you like it.”
Ford instead named the product for Henry Ford’s late son Edsel. The Edsel would live twenty-six months.
The short, unhappy life of that automobile is rich in lessons, and not only for America’s beleaguered automobile industry. The principal lesson is: Most Americans are not as silly as a few Americans suppose.
No industry boomed more in the 1950s than the manufacturing of social criticism excoriating Americans for their bovine “conformity,” crass “materialism,” and mindless manipulability at the hands of advertising’s “hidden persuaders.” Vance Packard’s The Hidden Persuaders was atop the New York Times bestseller list as Edsels arrived in showrooms. No consumer product in history had been the subject of so much “scientific” psychology-based market research.
Remember the basketball coach who said of his team, “We’re short but we’re slow”? The Edsel was ugly but riddled with malfunctions. So many malfunctions that some people suspected sabotage at plants that had previously assembled Fords and Mercurys. Those two Ford divisions perhaps hoped the Edsel would bomb.
“It was,” wrote John Brooks, a student of American business, in The New Yorker, “clumsy, powerful, dowdy, gauche, well-meaning—a de Kooning woman.” Chrome seemed to be piled upon chrome. Potential buyers recoiled from the vertical egg-shaped grille, which reminded them of a toilet seat. The transmission was worked by push buttons placed—convenience sacrificed on the altar of novelty—in the center of the steering column. The larger Edsels weighed more than two tons, were 219 inches long—longer than the grandest Oldsmobiles—and 80 inches wide. These were not the cars for a year in which the surprise success was American Motors’ little Rambler.
By Sunday, October 13, barely more than a month after the Edsel’s debut, anemic sales caused the company to preempt The Ed Sullivan Show with a Sunday evening Edsel extravaganza featuring Bing Crosby and Frank Sinatra. But there was no sales spurt. Nine days earlier, the Soviet Union had launched its first Sputnik satellite, provoking a crisis of confidence in America’s technological prowess and a reaction against chrome-laden barges as emblems of national self-indulgence. On November 27, Manhattan’s only Edsel dealer gave up his franchise and switched to selling Ramblers.
In the spring of 1958, S. I. Hayakawa, a professor of semantics (and later a Republican U.S. senator from California), ascribed the Edsel’s failure to the Ford executives’ excessive confidence in the power of motivational research to enable them to predict—and modify—Americans’ behavior. In their attempt to design a car that would cater to customers’ sexual fantasies, status anxieties, and the like, Ford’s deep thinkers had neglected to supply good transportation.
“Only the psychotic and the gravely neurotic act out their irrationalities and their compensatory fantasies,” Hayakawa wrote. “The trouble with selling symbolic gratification via such expensive items…is the competition offered by much cheaper forms of symbolic gratification, such as ‘Playboy’ (fifty cents a copy), ‘Astounding Science Fiction’ (thirty-five cents a copy), and television (free).”
In 1958, with the Edsel already turned to ashes, John Kenneth Galbraith, with bad timing comparable to the launch of the Edsel, published The Affluent Society. It asserted that manufacturers, wielding all-powerful advertising, were emancipated from the law of supply and demand because advertisers could manufacture demand for whatever manufacturers wished to supply.
This theory buttressed the liberal project of expanding government in the name of protecting incompetent Americans from victimization, and having government supplant the market as the allocator of wealth and opportunity. But all of Ford’s then-mighty marketing prowess could not keep the Edsel from being canceled in 1959. Brooks calculated that it would have been cheaper for Ford to skip the Edsel and give away 110,000 Mercurys.
Today, the United Auto Workers union and General Motors, Ford, and Chrysler are trying to reverse the slide of the American automobile industry. Fifty Septembers ago, the country was atingle with anticipation of a new product that turned out to be a leading indicator of the slide. As Detroit toils to undo some contractual provisions that have burdened the companies with crippling health care and pension costs, it should remember the real lesson of 1957: Americans are more discerning and less herdable than their cultured despisers suppose, so what matters most is simple. Good products.
[SEPTEMBER 6, 2007]
There was, too, a wonderful simplicity of desire. It was the last time that people would be thrilled to own a toaster or waffle iron.
—BILL BRYSON
What Thanksgiving is to gluttony, the three days after it are to consumerism—the main event. So, with Americans launching the Christmas season by storming the stores, let us recall when consumption had an exuberance remembered now only by those who experienced the 1950s.
Bill Bryson remembers. The author of thirteen books (e.g., A Walk in the Woods and A Short History of Nearly Everything), Bryson has most recently written The Life and Times of the Thunderbolt Kid, a memoir of growing up in Des Moines in the fifties, when downtown department stores—with white-gloved operators in the elevators and pneumatic tubes carrying money and receipts to and from cashiers—served the pent-up demands of a nation making up for consumption missed during the Depression and World War II.
In 1951, when the average American ate 50 percent more than the average European, Americans, Bryson says, controlled two-thirds of the world’s productive capacity, owned 80 percent of the world’s electrical goods, produced more than 40 percent of its electricity, 60 percent of its oil, and 66 percent of its steel. America’s 5 percent of the world’s population had more wealth than the other 95 percent, and Americans made almost all of what they consumed: 99.93 percent of new cars sold in 1954 were U.S. brands.
By the end of the fifties, GM was a bigger economic entity than Belgium, and Los Angeles had more cars than did Asia—cars for a gadget-smitten people, cars with Strato-Streak engines, Strato-Flight Hydra-Matic transmissions, and Torsion-Aire suspensions. The 1958 Lincoln Continental was nineteen feet long. And before television arrived (in 1950, 40 percent of Americans had never seen a television program; by May 1953, Boston had more televisions than bathtubs), America made almost a million comic books a month.
Consider what was new or not yet invented then: ballpoint pens, contact lenses, credit cards, power steering, long-playing records, dishwashers, garbage disposals. And remember words now no longer heard: icebox, dime store, bobby socks, panty raid, canasta (a card game). In 1951, a Tennessee youth was arrested on suspicion of narcotics possession. The brown powder was a new product—instant coffee.
Fifties food was, Bryson reminds us, not exotic: In Iowa, at least, folks did not eat foreign food “except French toast,” or bread that was not “white and at least 65 percent air,” or “spices other than salt, pepper and maple syrup,” or “any cheese that was not a vivid bright yellow and shiny enough to see your reflection in.”
But unlike today, when everything edible, from milk to spinach, has its moment as a menace to health, in the fifties everything was good for you. Cigarettes? Healthful. Advertisements, often featuring doctors, said smoking soothed jangled nerves and sharpened minds. “X-rays,” Bryson remembers, “were so benign that shoe stores installed special machines that used them to measure foot sizes.”
In Las Vegas, downwind from some atomic weapons tests, government technicians used Geiger counters to measure fallout: “People lined up to see how radioactive they were. It was all part of the fun. What a joy it was to be indestructible.” But, Bryson dryly notes, people knew without a warning label “that bleach was not a refreshing drink.”
White House security precautions were so lax that on April 3, 1956, a somewhat disoriented Michigan woman detached herself from a White House tour and wandered through the building for four hours, setting small fires. When found, she was taken to the kitchen and given a cup of tea. No charges were filed.
The fifties did have worries. When a contestant on a TV game show said his wife’s astrological sign was Cancer, the cigarette company sponsoring the show had the segment refilmed and her sign changed to Aries. You could get fourteen years in an Indiana prison for instigating anyone under age twenty-one to “commit masturbation.” And to get a New York fishing license, you had to swear a loyalty oath.
Nothing has changed more for the worse since the fifties than childhood. The lives of children were, Bryson remembers, “unsupervised, unregulated and robustly” physical. “Kids were always outdoors—I knew kids who were pushed out the door at eight in the morning and not allowed back in until five unless they were on fire or actively bleeding.”
But as the twig is bent, so grows the tree: These children, formed by the fifties, grew up to be Olympic-class shoppers. They are indoors this Sunday, at malls.
[NOVEMBER 26, 2006]
Onward and upward with Homo sapiens. A 7-million-year-old skull uncovered this year in Central Africa belonged to someone the size of a chimpanzee and is the earliest—by about a million years—yet discovered member of the human family. In 2002, his descendants were threatened by savage primitives who, in the name of the Creator, were possibly plotting to reverse, using smallpox spores, one of Homo sapiens’ recent triumphs over an infectious scourge. Much the most important event of 2002 was a nonevent—the second major terrorist attack on the American homeland that did not happen. Four homegrown terrorists from the 1970s, members of the Symbionese Liberation Army, pleaded guilty to a murder committed during a 1975 bank robbery in Carmichael, California.
Intricate geometric carvings on a rock found this year in a South African cave suggest that complex and abstract thinking began in Africa, not in Cambridge, Massachusetts, and began twice as long ago—seventy-seven thousand years—as had been believed. When did it stop? Trent Lott regretted that Harry Truman rather than Strom Thurmond won the 1948 presidential election.
The perpetrators of this year’s most lurid skullduggery and corruption? No, not Wall Street stock analysts—Olympic figure-skating judges. Not long ago, tycoons were “masters of the universe.” This year they were mastering the perp walk. Enron and Arthur Andersen almost vanished. Martha Stewart wished she could. United Airlines and the Boston diocese of the Roman Catholic Church had reason to remember the aphorism of Frank Borman, who was president of Eastern Air Lines before it went bankrupt: “Capitalism without bankruptcy is like Christianity without Hell.”
Two Bronx teenagers, one four feet ten and the other five feet six, are suing McDonald’s because they weigh 170 pounds and 270 pounds, respectively. The legal theory behind their suit derives from the Garth Brooks lyric: “Longneck bottle, let go of my hand.”
Lieutenant John Kennedy’s PT 109, sunk in 1943, was found off the Solomon Islands. Oklahoma!, launched in 1943, was back on Broadway. Off-Broadway, Bill Clinton starred in “It—everything—is all about me.” Campaigning for Massachusetts gubernatorial candidate Shannon O’Brien, Clinton, whose self-absorption remains one of the wonders of the world, said an O’Brien victory would be “a wonderful way to celebrate the 10th anniversary of my victory in 1992.”
A little difference makes a big difference: A mouse’s genome has been mapped. Humans and mice have about thirty thousand genes. Less than 1 percent are unique to either species.
An Italian artist, seeking to make an “ironic statement,” produced ninety cans of his feces. London’s Tate Gallery paid $35,000 for one. If your cell phone rang in the Rising Sun pub in Brighton, England, the proprietor nailed the phone to the bar.
A member of the U.S. table-tennis team was suspended for steroid use. When Mets pitcher Shawn Estes lost his no-hitter in the seventh inning, Mets manager Bobby Valentine rejected the idea that Estes was jinxed in the fifth inning when Shea Stadium’s JumboTron announced that Estes had not yet given up a hit. Said Valentine: “I don’t believe in superstitions. They’re bad luck.”
Baseball avoided a season-ending strike, enabling a San Francisco woman, who wanted to be artificially inseminated, to advertise a barter: World Series tickets for “healthy sperm.” In his first three major-league at-bats, Seattle Mariners designated hitter Ron Wright caused six outs by striking out and hitting into a double play and a triple play. After the game, he was sent to the minors. Pete Gray, a one-armed outfielder who during the Second World War played a season for the St. Louis Browns, died at eighty-seven. Joe Black, the first black pitcher to win a World Series game (for Brooklyn, in 1952), was seventy-eight.
Montgomery, Alabama, bus driver James Blake died at 89, forty-seven years after he had Rosa Parks arrested because she refused to move to the back of his bus. Traudl Junge, a private secretary to Hitler and the last surviving witness to his final hours in the bunker, said he “gave me a feeling of security, safety and being cared for.” She died at 82, wondering: “If he discovered he had Jewish blood in his family tree, would he have gassed himself?” Chaike Spiegel, one of the last surviving combatants of the 1943 Warsaw ghetto uprising, was 81. Flags all across Australia were flown at half mast for Alec Campbell, 103, the last survivor of more than seventy thousand Australians and New Zealanders who fought in 1915 at Gallipoli, the ill-fated operation that almost destroyed the career of its architect, Winston Churchill. Of Gallipoli, Campbell said: “It was a lovely place, you know, if conditions had been better…” Queen Elizabeth the Queen Mother, born when Churchill was 25, died at 101.
And after half a century of sultry singing, Peggy Lee, who was eighty-one, left the stage. There lingered in our minds a lyric suitable for any year: “If that’s all there is, my friends, then let’s keep dancing.”
[DECEMBER 23, 2002]
“Whatever happens,” said Lord Salisbury (1830–1903), a conservative in thought, word, and deed, “will be for the worse, and therefore it is in our interests that as little should happen as possible.” By that sensible standard, eventful 2003 was not in our interests.
Make love and war, or else the terrorists will have won: During Valentine’s week in February, with war impending and the government elevating the terrorism alert, two of Wal-Mart’s hot-selling items were lingerie and duct tape. Talk about ingratitude: Terrorists struck in Saudi Arabia. The war with Iraq went well, aside from the detail that the reason for it—weapons of mass destruction—has been elusive. The following is a complete list of all those fired because of the intelligence failure: _______.
While America was trying to acquaint 25 million Iraqis with democracy, 144.5 million Russians fell under President Vladimir Putin’s “managed democracy”—czarism leavened by state-manipulated plebiscites. If only the world were more like California, where the people, groaning under the wicked choices of the electorate, chose a new governor to wrestle with the people’s chosen legislature over how to undo the damage done by voters in dozens of plebiscites.
In America’s “(court-) managed democracy,” judges began discovering a right to same-sex marriage. Forty-one years after the U.S. Army helped end racial discrimination in admissions at Ole Miss, the Supreme Court found nothing amiss in the University of Michigan’s racial discrimination in admissions.
That court, having said that First Amendment protections extend to virtual child pornography and tobacco advertising, finally exclaimed “Enough!” In a 5–4 decision cowritten by Sandra Day O’Connor, it held that concern for political hygiene justified Congress’s passing the McCain-Feingold legislation to restrict the amount and regulate the content of speech about members of Congress. Thus did a Ronald Reagan 1980 campaign promise (to appoint the first woman justice) result, twenty-three years later, in ratification of George W. Bush’s decision to sign a bill that he, while campaigning—before taking the oath to defend the Constitution—called unconstitutional.
The Académie Française, which never stooped to admit Flaubert or Zola, admitted Valéry Giscard d’Estaing, author—sort of—of the European Union’s proposed 263-page Constitution, a screamingly funny political satire. Because of Jayson Blair, heads rolled at the New York Times, where all the “news” had not been fit to print.
American conservatives, controlling both elected branches of government for the first full year since 1954, used their power to…vastly expand the welfare state. The prescription-drug entitlement may cost $2.5 trillion over the next twenty years. Big deal. That sum amounts to only five years of deficits at the current level. Besides, Congress showed that it could act with dispatch against a menace that really annoys people: spam.
Howard Dean discerned what liberals want: attitude. In San Francisco, ground zero of Deanism, sensitivity police stipulated that pets’ owners shall also be called “guardians.” Kobe Bryant, impulse buyer, bought his wife a $4 million diamond ring. Friends headed for oblivion, American style: ubiquitous reruns.
Remember Henry Adams’s jest that the succession of presidents from Washington to Grant disproved the theory of evolution? After another year watching their royals, the British could say something similar about the progress, so to speak, from Elizabeth I to the son of Elizabeth II. Senator Pat Moynihan, dead at seventy-six, was America’s foremost public intellectual. Another giant of the Finance Committee, Russell Long, eighty-four, was sixteen when his father was assassinated, and not quite thirty when elected to the Senate in 1948. David Brinkley, eighty-two, the most famous son of Wilmington, North Carolina, until Michael Jordan came along, said of television, “When there is no news, we give it to you with the same emphasis as if there were.”
This year, the movie musical was revived with Chicago, and Sam Phillips died at 80. On July 5, 1954, at his Memphis recording studio, Phillips recorded a 19-year-old singing “That’s All Right Mama.” Elvis’s blending of white and black music helped end the world defended by Lester Maddox and Strom Thurmond, dead at 87 and 100.
In 1951, the Boston Braves’ Warren Spahn, en route to becoming baseball’s winningest left-hander, stood on the mound, sixty feet six inches from a New York Giants rookie who was 0-for-12. Willie Mays homered. Said Spahn, “For the first sixty feet, that was a helluva pitch.” Spahn was eighty-two. In the 1934 World Series, Tiger shortstop Billy Rogell’s relay to first hit Cardinal runner Dizzy Dean in the forehead. The next day a headline supposedly said: x-rays of dean’s head reveal nothing. Rogell died at ninety-eight.
At 114, Mitoyo Kawate was the world’s oldest person. She was working on her farm six miles from Hiroshima on August 6, 1945. Jack Davis, 108, was Britain’s oldest veteran of mankind’s final war—the war to end war, 1914–1918.
[DECEMBER 22, 2003]
In 2004, an IBM supercomputer set a world record with 36.01 trillion calculations per second. The U.S. electorate may have made its calculation the instant John Kerry, who is not a supercomputer, explained why Toy’s restaurant in Canonsburg, Pennsylvania, “is my kind of place”:
“You don’t have to—you know, when they give you the menu, I’m always struggling: ah, what do you want? He just gives you what he’s got, right?…whatever he’s cooked up that day. And I think that’s the way it ought to work, for confused people like me who can’t make up our minds.”
This year, some paleoanthropologists reported that our cousins the Neanderthals, who disappeared thirty thousand years ago, had better minds than has been thought: On a plain in Spain there is a mass grave containing evidence of funeral ritual, which means that Neanderthals had a capacity for symbolism. This year, Democrats stressed their superior brains. (Bumper stickers: some village in texas is missing their [sic] idiot; john kerry—bringing complete sentences back to the white house.) A campaign flier in Tennessee pictured George W. Bush’s face superimposed over that of a runner in the Special Olympics, and proclaimed this message: “Voting for Bush is like running in the Special Olympics. Even if you win, you’re still retarded.”
From an Indonesian island came evidence that as recently as eighteen thousand years ago—only yesterday, as paleoanthropologists reckon—there was a race of Hobbit-size (about three feet tall) semi-people. Their small brains probably were incapable of idealism of James Kilgore’s sort. In 2004, Kilgore, fifty-six, was sentenced to six years in prison for his part in the murder of a mother of four during a 1975 California bank robbery that was supposed to help finance the Symbionese Liberation Army. Martha Stewart was sentenced to prison for lying about a crime she was not charged with. Scott Peterson was convicted of double murder—killing his wife, and killing his unborn child, a problematic idea given the current understanding of abortion rights. Before death tardily overtook another dispenser of death, Yasir Arafat, he received a letter from People for the Ethical Treatment of Animals—well, of animals other than people—asking him to stop using donkeys in suicide bombings. It was said that the death of this winner of the Nobel Peace Prize might make peace possible.
Fifty years after Brown v. Board of Education, an African-American was nominated to replace an African-American as secretary of state. Nashville, Tennessee, schools stopped displaying the honor rolls of A students because some parents complained that the displays might hurt the feelings of dimmer students. In Washington State, the Puyallup school district ended the grade-school tradition of children parading in Halloween costumes, partly because some costumes might be offensive to real witches. Said a district spokeswoman, “Witches with pointy noses and things like that are not respective [sic] symbols of the Wiccan religion, and so we want to be respectful of that.” In Michigan, Jon Blake Cusack named his son Jon Blake Cusack 2.0.
Some American sub-Neanderthals photographed themselves abusing Iraqi prisoners. By October 24, the war in Iraq had lasted longer than the U.S. involvement in the First World War. Two movies symptomatic of the temper of the times were The Passion of the Christ and The Passions of the Faculty Clubs (aka Fahrenheit 9/11). The Massachusetts Supreme Judicial Court ignited a debate about whether homosexuals could do more damage to the institution of marriage than heterosexuals are doing. Britney Spears has been married twice this year, so far. What, other than Janet Jackson’s breast, do you remember about the Super Bowl?
At the Olympics, an elite collection of NBA stars lost to Puerto Rico and Argentina, but hey, they beat the Lithuanians after losing to them. Early in 2004, Alex Rodriguez, eager to win a World Series, was courted by the Yankees and Red Sox and signed with…oh, well. Paul Hopkins, the oldest former major-league player, died at ninety-nine. The first batter he faced as a pitcher for the Washington Senators was Babe Ruth, who homered. Hopkins’s career ended ten games later. In 2004, Washington was awarded another team.
The world’s oldest man, Joan Riudavets Moll, died at 114 in Spain. He was born the same year—1889—as Hitler and Charlie Chaplin. Alberta Martin, a Civil War widow, died in Alabama at 97. Born in 1906, in 1927 she married a Confederate widower born in 1845. She was poor and he had a $50-a-month Confederate pension—a not uncommon aphrodisiac causing May-December marriages early in the twentieth century. Did she at 21 love her 81-year-old groom? “That’s a hard question to answer…. You know the difference between a young man and an old man.” Two months after he died in 1931, she married his grandson.
Two great musical instruments fell forever silent—the voices of Ronald Reagan and Ray Charles. But their melodies linger on.
[DECEMBER 20, 2004]
Seeking the serenity that a sense of history confers in testing times, Mike Cameron, a Mets outfielder in 2005, said in defense of a teammate who lost a fly ball in the sun, “Stuff is going to happen sometimes. The sun has been there for five hundred, six hundred years.” Stuff happened in 2005, when an obituary in the Chicago Tribune advised, “In lieu of flowers, please send acerbic letters to Republicans.” At home, the president’s, and the nation’s, disagreeable year can be summarized by three female names: Terri Schiavo, Harriet Miers, and Katrina. The first involved grotesque overreaching by the federal government, undertaken by self-described conservatives whose action refuted their description. The second involved indifference to competence. The third displayed the consequences of incompetence. Abroad, Iraq illustrated one, two, and three.
In Russia, despotism continued to make a comeback, but Lenin, at least, may soon be buried: His cadaver in Red Square is said to sometimes sprout fungi. The 482-page European Union “constitution” was rejected, but the common currency marches on in white boots. Wearing those and a red miniskirt, Renate Dolle, sixty-three, told a Berlin newspaper she will soon end her forty-nine-year career as a prostitute (€30; $36) so she can spend more time with her husband and granddaughter.
Why was America’s consumer-driven economy not derailed by higher oil prices? In 2005, Americans’ housing stock increased in value $2.38 trillion more than their oil bill increased ($120 billion). Vox populi, vox dei? When Katrina’s disruption of supplies caused gasoline prices briefly to pass $3 a gallon, the public thought this proved that conniving oil companies control prices. When, a few weeks later, prices plunged toward, and some places below, $2, the public thought…what?
Rising oil prices and General Motors’ declining health reflected, among other things, the success of sixty years of U.S. policies promoting free trade and globalization. India and China are slurping up oil because, having joined the international economy, they are booming. This year, upwards of sixty thousand Americans were employed manufacturing more than 3 million “foreign” cars. Toyota, which in 2006 may sell more cars worldwide than GM, has opened a design center in Ann Arbor, just forty miles from Detroit.
Onward and upward with progressivism: In a Las Vegas suburb, the United Food and Commercial Workers union hired temp workers at $6 an hour to picket a nonunion Wal-Mart, where wages start at $6.75 an hour. A British teachers-union official proposed that instead of bad students’ receiving a “failing” grade, their grade should be called “deferred success.” A Milwaukee seventeen-year-old and his father sued to end summer homework because the stress of honors precalculus assignments spoiled the lad’s summer. When Jada Pinkett Smith, wife of actor Will Smith, told a Harvard audience that women “can have it all—a loving man, devoted husband, loving children, a fabulous career,” the campus Bisexual, Gay, Lesbian, Transgender, and Supporters Alliance said its members were made “uncomfortable” because Mrs. Smith’s words were “extremely heteronormative.” A majority of teachers, parents, and students at Jefferson Elementary School in Berkeley favored renaming the school Sequoia Elementary because Jefferson owned hundreds of slaves. Under Chief Sequoia, the Cherokee nation owned more than fifteen hundred black slaves. You cannot be too careful, so Timnath, Colorado, banned smoking in bars and restaurants, of which Timnath at the time had none. The often hilarious New York Times, which opposes capital punishment, reported disapprovingly that a life sentence “is death in all but name.”
An Oklahoma judge granted the request of a criminal who wanted his thirty-year prison sentence increased three years to match Larry Bird’s Celtics jersey number. Death, as it must to all, came to six-foot-ten George Mikan, eighty, who was the NBA’s first superstar. Madison Square Garden’s marquee once read: wed. basketball: geo. mikan vs. knicks.
Asked to switch from guitar to bass, which he could not afford to buy, Eric Griffiths quit the rock group the Quarry Men in 1958 and joined the British Merchant Navy. On a radio on a ship in the Persian Gulf in 1963 he heard “Please Please Me,” the first hit by the former Quarry Men, by then called the Beatles. Griffiths died in 2005 at 64. On November 22, 1963, in a darkened Dallas theater, the hammer of Lee Harvey Oswald’s revolver jammed on the flesh of the palm of the policeman arresting him, so Maurice McDonald lived another forty-two years. Vic Power, a Puerto Rican first baseman, was one of baseball’s first Hispanic stars. Sports Illustrated reports that when Power was playing in the minor leagues in the South, he was told by a waitress that the restaurant did not serve Negroes. He replied, “That’s OK, I don’t eat Negroes.” Mark Matthews, 111, was the oldest of the surviving Buffalo Soldiers, the African-Americans who fought Native Americans on behalf of Euro-Americans. Ah, multiculturalism.
[DECEMBER 19, 2005]
How gruesome was 2006? The year’s most consequential person was Iran’s president, who says the Holocaust did not happen and vows to complete it. Regarding his nuclear aspirations, Mahmoud Ahmadinejad, whose manias are leavened with realism, treated the United Nations as a figment of the imagination of a fiction—the “international community.” Democrats, given control of Congress because of Iraq, vowed to raise the minimum wage. Nimble and graceful Barack Obama became the Democrats’ Fred Astaire, adored because of, well, perhaps the way he wears his hat, the way he sips his tea. And the way he isn’t Hillary.
This year’s civil-rights outrage was “soaring” and “record” gasoline prices, a violation of Americans’ inalienable right to pay for a gallon no more than they paid twenty-five years ago. By December, the price of a gallon, adjusted for inflation, was eighty-three cents lower than in 1981. Kansas voters removed some skeptics of evolution from the state’s school board. A fossil 3.3 million years old revealed that a little girl from the human lineage had arms and shoulders suited to climbing and swinging through trees.
In order to show “tolerance of people’s beliefs,” government workers in England’s West Midlands were told, after a Muslim complained, to remove from sight all pig-related items, such as a tissue box featuring Winnie the Pooh and Piglet. But tolerance was episodic in Europe in 2006: In Sweden, police said the soccer fan who wore on his clothes a Swedish flag, which features a cross, “provoked some emotions.” Indeed. He was beaten nearly to death by Muslim immigrants. Inspector Clouseau, call your office: French police denied that anti-Semitism was involved in the kidnapping and murder of a Jewish man by Muslim immigrants who demanded a ransom from a synagogue. Angry about those Danish cartoons depicting the prophet, Iran’s bakers renamed Danish pastries “Roses of the Prophet Muhammad pastries.” Although no one had complained, the human-rights director for the provocatively named city of St. Paul, Minnesota, had a happy easter sign removed from city hall.
Two U.S. explorers went to the North Pole to study how global warming threatens polar bears. They had planned to go last year, but were forced to delay Project Thin Ice because of unusually heavy snow and ice. The “emerging hurricane problem,” which, after Katrina, the New York Times identified as a consequence of global warming, did not emerge. The unusually tranquil Atlantic hurricane season was explained as a consequence of…global warming affecting the Pacific. Two senators, Jay Rockefeller of West Virginia and Olympia Snowe of Maine, warned ExxonMobil that global warming is an undeniable fact, so the corporation should desist from its “dangerous” support of research by persons with doubts. The senators did not explain the danger involved in doubting the indubitable. There were dangers—disorder, sporadic violence—among those gathered outside stores in the predawn hours before the PlayStation 3 gaming console went on sale.
Great moments in government: The Florida woman who wounded with a shotgun the alligator that entered her house and attacked her golden retriever was given a warning citation for hunting without a permit. Compassionate social democracy: The Danish government continued to pay prostitutes to service the disabled.
Ancient Greece pioneered philosophy and democracy. Modern Greece this year gave the world a new wrinkle in creative accounting: It became 25 percent richer after its GDP was revised to account for such booming service industries as prostitution and money laundering. The intellectual fare served at the University of Wisconsin-Milwaukee included a course called the Social Construction of Obesity. (Fatness, like beauty, is in the eye of the beholder to whom society’s power structure, always eager to foment new forms of discrimination, has given false consciousness.) Elsewhere in higher education, at Bucknell’s “celebration of whore culture,” a woman stripped on a trapeze.
In Tacoma, Washington, a judge asked those in her courtroom to cheer “Go, Seahawks!” Then she sentenced a man convicted of manslaughter to thirteen years. The chief executive of Eternal Image Inc., which announced caskets and urns with logos of all thirty Major League Baseball teams, called this “a way to make team loyalty a final statement.” Red Auerbach, whose Celtics teams won seven championships without having a player among the NBA’s top ten scorers, died this year at eighty-nine. Romano Mussolini, who died this year at seventy-nine, son of Il Duce, had played jazz with Dizzy Gillespie, Duke Ellington, and Chet Baker.
Lillian Gertrud Asplund was five when her father smiled and said, “Go ahead, we will get into one of the other boats.” He did not. Lillian never married, and retired early to take care of her mother, who never recovered from losing her husband. Lillian, the last American survivor of the Titanic, was ninety-nine.
[DECEMBER 18, 2006]
In 2007 came the revolution. Determined to end the war in Iraq and begin the reign of justice in America, Democrats took over Congress and acted on the principle “ready, fire, aim.” They threatened to tell the Ottoman Empire (deceased 1922) that it should be ashamed of itself (about the Armenian genocide) and raised the minimum wage to $5.85, which is worth less than the $5.15 minimum was worth when it was set in 1997. Onward and upward with compassionate liberalism: The Democrat-controlled Senate flinched from making hedge-fund multimillionaires pay more than a 15 percent tax rate. At year’s end, there were more troops in Iraq than there were at the year’s beginning. Although it was not yet possible to say the war was won, it was no longer possible to say the surge was not succeeding. The McClatchy newspapers, with the media’s flair for discerning lead linings on silver clouds, offered this headline: as violence falls in iraq, cemetery workers feel the pinch.
The king of Spain told the president of Venezuela to “shut up,” and 51 percent of Venezuelans seconded the motion. Rudy Giuliani said, “I took a city that was known for pornography and licked it.” Hillary Clinton accused Barack Obama of having been ambitious in kindergarten. Disraeli once said of Lord Russell: “If a traveler were informed that such a man was leader of the House of Commons, he may well begin to comprehend how the Egyptians worshipped an insect.” Mike Huckabee became a leader among Republican presidential candidates.
In March, when a planned trek by two explorers to the North Pole, intended to dramatize global warming, was aborted because of temperatures 100 degrees below zero, an organizer of the consciousness-raising venture explained that the cancellation confirmed predictions of global warming because “one of the things we see with global warming is unpredictability.” Al Gore won the Nobel Peace Prize that should have gone to nine-time Grammy winner Sheryl Crow, who proposed saving the planet by limiting—to one—“how many squares of toilet paper can be used in any one sitting.” At the U.N. global-warming conference in Bali there was Carbon Footprint Envy—the airport did not have space to park all the private jets.
As Americans debated expanding government involvement in health care, Britain’s National Health Service told Olive Beal she would have to wait eighteen months to get her hearing aid. She is 108.
Thanks to federal supervision of K–12 education, when a Johnson City, New York, parent complained that cheerleaders lead cheers for the boys’ basketball team but not the girls’, the U.S. Department of Education, citing Title IX’s requirement of sexual equality in scholastic sports, demanded equal “promotional services.” Two Los Angeles teachers were fired after a controversy that began when one had her class, during Black History Month, make a presentation about Emmett Till, the Chicago fourteen-year-old who was tortured and murdered in Mississippi in 1955 after his wolf whistle at a white woman. Some students and teachers charged that school officials said Till’s whistle could be construed as sexual harassment. In an inexplicable (and probably temporary) spasm of good taste, public opinion sent Don Imus packing because he said on his radio program something no more tasteless than things he had been saying for years, to the delight of a large (and evidently fickle) public.
A Seattle day-care center banned LEGO building blocks because the beastly children “were building their assumptions about ownership and the social power it conveys, assumptions that mirrored those of a class-based, capitalist society.” The center reinstated LEGOs but allowed the children to build only “public structures” dedicated to “collectivity and consensus.” In other lingering reverberations of communism, scientists unearthed what they think are remains of two more of Czar Nicholas II’s children, murdered by Bolsheviks, who never played with LEGOs. A Cuban exile, former CIA operative, and Bay of Pigs veteran announced plans to auction what he says is a lock of Che Guevara’s hair taken from the corpse before burial in Bolivia.
When the Confederate monument in Montgomery, Alabama, was desecrated, was that a “hate crime”? Saying he wanted to bring Alabama “into the twentieth century”—the twenty-first would be a bridge too far?—a legislator, worried that “a shower head” might be illegal, moved to repeal the state’s ban on the sale of sex toys. A mayor looked on the bright side of his city’s high homicide rate: “It’s not good for us but it also keeps the New Orleans brand out there.” Lucky Belgium has been without a government since June.
In 2007, for the first time, two Hispanic surnames, Garcia and Rodriguez, were among America’s ten most common. Paul and Teri Fields of Michigan City, Indiana, named their baby boy Wrigley.
Death, as it must to all, came to Paul Tibbets, 92. Eighty years ago, 12-year-old Paul flew with a barnstorming pilot who dropped Baby Ruth candy bars over a Florida racetrack. In 1945, Tibbets was pilot of the Enola Gay, the B-29 that dropped the atomic bomb on Hiroshima. “What about the shortstop Rizzuto,” asked Casey Stengel long ago, “who got nothing but daughters but throws out the left-handed hitters in the double play.” Phil Rizzuto, the oldest living Hall of Famer, was 89. Emma Faust Tillman, 114, of Hartford, Connecticut, had been the world’s oldest person. She was born during the presidency of Benjamin Harrison. Robert Adler, 93, gave the modern world its most beloved invention. The TV remote, of course.
[DECEMBER 22, 2007]