11

REMEMBRANCE

Now all roads lead to France
And heavy is the tread
Of the living; but the dead
Returning lightly dance

EDWARD THOMAS, 1916

If poetry could truly tell it backwards,
then it would.

CAROL ANN DUFFY, 20091

Maya Lin would probably have had little chance if the competition hadn’t been “blind,” judged by number not name. A student at Yale—only twenty-one, female, from a Chinese American family—she would have had the cards stacked against her in an open contest with the big boys of the architectural profession. And she received abuse aplenty when it became known that her design was the unanimous choice of the judges assessing more than 1,400 proposals for a Vietnam Veterans Memorial in Washington, DC—intended for the heart of the nation’s sacred space, between the Washington Monument and the Lincoln Memorial.

Lin’s proposal diverged sharply from the heroic statuary of previous American war commemoration, such as the Iwo Jima Memorial of 1954. To appease traditionalists, one of the runner-up entries was later erected nearby—a life-size bronze of three soldiers, clearly distinguishable as a white, a black, and a Hispanic, all battle-weary but heavily armed and still ready for combat. By contrast Lin offered a stark, nonrepresentational memorial—two long walls gradually gouging into the earth to meet in an elongated V, ten feet high at its apex. Along the walls are listed, in chronological order, the names of all the US servicemen and -women who died during the Vietnam War. The walls are constructed of black granite, highly reflective, so that the visitor sees his or her own face when tracing the name of a buddy or loved one. The result is at once intensely abstract and yet deeply personal, the encounter of the living with the dead through the mystery of names—a memorial that, unlike heroic statuary, does not seek to direct one’s response. Although modernist in spirit, Lin’s conception had echoes of the past, evoking the vast walls naming the missing from Britain’s Great War on the Menin Gate at Ypres and the Thiepval Memorial on the Somme. In fact, Lutyens’s design for Thiepval, which Lin had studied for a course at Yale, was a major influence on her.2

Despite the initial controversy, since its inauguration in November 1982 the Vietnam Veterans Memorial has become one of Washington’s most popular spaces, attracting more than three million visitors a year. It is now one of America’s “sites of memory”—to render in English the untranslatable neologism les lieux de mémoire, coined by the French scholar Pierre Nora as the title of his herculean seven-volume project published between 1984 and 1992. Nora’s use of the word “memory” was problematic, suggesting an autonomous, almost metaphysical, force that he romanticized as living national spirit, in contrast with what he considered the arid science of “history.” Nora believed that in 1980s France national memory had been overshadowed by scientific history: hence his project, assisted by more than 120 authors, to document the monuments, rituals, texts, and images that evoke what, he claimed, his countrymen understood by “France.” Although Nora’s idée fixe was in many ways inscrutably French, his work, translated into English by the end of the twentieth century, popularized the term “sites of memory.” Not only did this become almost a cliché for cultural historians, it also fit the broader public fascination with places and artifacts that, like Maya Lin’s wall, left scope for private remembrance.3

Internationally, memorialization of the Great War became much more intense after the end of the Cold War. The fall of the Berlin Wall and the other revolutions of 1989 redrew the map of eastern Europe, ending the de facto settlement, in place since the demise of the Third Reich, that had entrenched Soviet power across half the Continent. Germany, divided since 1945, was united again in October 1990; at the end of 1991 the Soviet Union fell apart, leaving a ring of fractious national states around the periphery of the new Russian Republic. These dramatic events of 1989–91 reopened many issues from 1917–18—that earlier era of imperial collapse and popular revolutions—and also prompted a new interest in the First World War, now finally emerging from the long shadow of 1939–45.

The dramatic denouement of the Cold War ended the division of Germany, but it also reopened the historic question of Germany’s place in Europe. Although publicly committed to eventual German unification after 1945, the Western allies had actually been quite content with the solidification of the Federal Republic and the German Democratic Republic, especially once the flashpoint of Berlin had been defused by construction of the Wall in 1961. “I love Germany so much that I am glad there are two of them,” quipped the French intellectual François Mauriac. In 1990 the newly unified Germany finally accepted de jure the existence of Poland and renounced all territorial claims in eastern Europe, ending a revisionist project that had characterized German foreign policy in various forms since the Treaty of Versailles. But the decision to move the capital from Bonn to Berlin, passed in the Bundestag in 1991 by a mere seventeen votes, opened up old controversies. Critics argued that Bonn and the Rhineland symbolized the country’s new start in 1945 as a Western democracy, whereas Berlin bore the indelible stigma of Germany’s disastrous “Prussian,” militaristic heritage. Claims that Germany could now become a “normal” sovereign state seemed utopian: “our neighbours and partners will not look at us as a normal country,” warned state secretary Wolfgang Ischinger, “due to our particular German history.”4

Many commentators saw worrying parallels between the end of the twentieth century and its beginning: Germany as an independent nation-state was too strong for Europe’s balance of power yet too weak to exert stable continental leadership.5 The interface between Germany’s power and weakness had lain at the heart of thirty years of instability and conflict from 1914 to 1945. But optimists within Germany and outside pointed to the European Union as a novel framework that had been lacking after 1918. “Germany is our fatherland, Europe is our future,” insisted Chancellor Helmut Kohl. His foreign minister, Hans-Dietrich Genscher, echoed the aphorism of Thomas Mann: “We do not aspire to a German Europe but want to live in a European Germany.” An amendment to the country’s Constitution was passed, requiring the Federal Republic to seek a European Union that was “bound to democratic, legal, social and federal principles”—in other words, not a centralized super-state.6

In 1989 Germany’s neighbors were not initially persuaded by such integrationist rhetoric. French president François Mitterrand warned of a return to the Europe of 1913, with Britain, France, and Russia aligned against the German menace. When Mikhail Gorbachev failed to resist Kohl, the French leader talked darkly of another Munich—with France and Britain, as in 1938, lacking the means to defy Germany. But once Kohl secured American support and the momentum for unification became unstoppable, Mitterrand revived the post-1950 French strategy, declaring that “the German problem will be regulated by the magnetic force of Europe.”7 As a condition for accepting rapid German reunification, he obliged Kohl to accept European monetary union to contain the burgeoning economic power of a bigger Germany. This meant an end to the cherished deutschemark, talisman for many Germans of the FRG’s postwar prosperity. The Maastricht Treaty of February 1992 paved the way for a single currency, the euro, as of January 2002.

In this whole process Britain took a negative and increasingly isolated line. Although the fall of the Berlin Wall was welcomed as a sign of freedom, some British commentators soon talked of an impending “Fourth Reich.” Monetary union, fumed cabinet minister Nicholas Ridley, was “a German racket designed to take over the whole of Europe.” As for handing over sovereignty to the European Community, he exclaimed, “you might as well give it to Adolf Hitler.”8 After such a tirade Ridley had to be sacked, but Prime Minister Margaret Thatcher privately shared his views. Her opposition to German unification, though reflecting sensitivity to Gorbachev’s increasingly shaky position in Moscow, really derived from her reading of history as a child of the 1920s nurtured by a staunchly anticontinental father. As she made clear in her memoirs, Thatcher was sure that countries had distinct national characters. “Since the unification of Germany under Bismarck—perhaps partly because national unification came so late—Germany has veered unpredictably between aggression and self-doubt.” She doubted that 1945 had marked a fundamental change in the national character and dismissed the idea that German volatility could be contained within a European framework: a reunited Germany was “simply too big and powerful to be just another player within Europe.” The only answer, she argued, was a European balance of power based on a close Anglo-French accord backed by the United States: in other words, a Europe based on the Atlantic alliance rather than the European Union. To this end she turned all her persuasive powers on Mitterrand and President George H. W. Bush, often pulling out of her notorious handbag a map showing the various configurations of Germany in the past, which, she said, were “not altogether reassuring about the future.” But Bush (whose World War II service was in the Pacific, not Europe) did not share her historical hang-ups about Germany, while Mitterrand—like his predecessors—eventually opted for the European answer to the German question. The end result of Thatcher’s opposition to unification, reported the British ambassador, was that “Britain’s public standing in Germany is at its lowest for years.” In 1992, Thatcher’s successor, John Major, though less of a Euroskeptic, secured a British opt-out from the Maastricht Treaty and monetary union.9

Thatcher and Major demonstrated the enduring grip of the essential British narrative about the two world wars—established, as we have seen, in the 1940s. Bush and most of his countrymen took a much more positive view of 1989–91, viewing the lifting of the iron curtain and the collapse of the Soviet Union as triumphant vindication of American power and values. The pundit Francis Fukuyama asserted that the end of the Cold War marked nothing less than “the end of history as such.” Although “events” would continue to happen, he argued that we had now seen “the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.”10

One expression of this triumphalism was a new interest in what Americans saw as the origins of the Cold War—the 1917 clash of ideologies between Wilson and Lenin. Books and essays on Wilsonianism proliferated in America after 1991: studies about the “Wilsonian century,” “impulse,” “moment,” or “persuasion.” Their authors discerned in Wilson’s inchoate ideas for replacing power politics with a liberal international order the fundamentals of American foreign policy for much of the twentieth century, which had guided Cold War diplomacy and even the shaping of German reunification. It was claimed that Wilsonianism “launched the transformation of the norms and standards of international relations” that led eventually to global decolonization; even that the president’s ideas had been the main instrument of the march of freedom, so that by the 1990s “the world has been made, if not fully democratic, then safe for democracy.”11

According to historian Frank Ninkovich in 1999, the “Wilsonian century” was over: “the post-Cold-War world” was not another opportunity to “institutionalize Wilsonian policies” but rather “the occasion for dispensing with them altogether.”12 Some older neoconservatives agreed with Ninkovich. “It is not within the United States’ power to democratize the world,” declared former Reagan adviser Jeane Kirkpatrick. “With a return to ‘normal times,’ we can again become a normal nation.” But Wilsonianism was given a new lease on life in Washington by the advent of the George W. Bush administration in 2001. Younger neoconservatives such as Charles Krauthammer insisted that the promotion of democracy must be “the touchstone of a new ideological American foreign policy”—a policy of what he grandiloquently called “universal dominion” in a “unipolar world.” Some neocons, such as Max Boot, described themselves as “hard Wilsonians,” meaning advocates of American power to enforce liberal values. Their favored target was the Middle East. “I think there is a potential civic culture in Arab countries that can lead to democratic institutions,” Richard Perle asserted in March 2001, “and I think that Iraq is probably the best place to put that proposition to the test.” Neocons exploited the al-Qaeda attacks on America on September 11, 2001, as a pretext for an operation to topple Saddam Hussein. “In word, if not yet in deed, Bush is becoming the most Wilsonian president since Wilson himself,” declared Lawrence F. Kaplan on the eve of the Iraq war, urging the president to “complete the job Wilson began.” But the botched campaign in Iraq, predicated on simplistic assumptions that removing a “tyrant” would produce “freedom” and “democracy,” was a painful reminder that “hard Wilsonianism” did not bring about easy solutions. By the twilight of his presidency Bush was being denounced as “Woodrow Wilson on steroids, a grotesquely exaggerated and pridefully assertive version of the original.”13

While 1991 strengthened the Wilsonian grip on the discourse of American foreign policy, it finally loosened the stranglehold of Lenin over historical memory in Russia and across eastern Europe. As we have seen, in contrast to the commemoration of the Great Patriotic War of 1941–45, Russia’s First World War remained in the shade. There had been no Soviet equivalent of the Cenotaph in London, the Douaumont Ossuary at Verdun, the War Memorial in Canberra, or even the Neue Wache in Berlin, even though the Russian death toll from 1914–17 was roughly two million. Nor were there memorials to the war dead in towns and villages, again in marked contrast with France, Britain, and the countries of the British Empire—especially Australia, where roughly 1,500 public memorials were erected to express a “distant grief” for the bodies of loved ones interred in far-off foreign fields. For the Russians, however, 1914–17 was no distant conflict but a brutal struggle on home soil; yet the grief of the bereaved was as “distant” as any Australian’s—perhaps more so, given the chilling official silence.14

But the demise of the Soviet Union made it possible for Russian and Western historians to research the war of 1914–17—both in its own right and as part of “Russia’s continuum of crisis” from 1914 to 1921, with the revolutions of 1917 as “fulcrum.” This research often followed themes from recent Western scholarship, such as the extent of “war enthusiasm” in 1914 or the mobilization of ethnic nationalism.15 Russians were also finally able to memorialize the conflict in public. On the site of the projected All-Russian War Cemetery of 1915 there now stands a Memorial Park Complex of the Heroes of the First World War. Much of its iconography is religious, with numerous Russian Orthodox symbols and a small chapel of the Transfiguration on the site of the original cemetery church. Prominent, too, are national symbols, particularly the double-headed eagle of the Russian Federation. The park was formally opened on August 1, 2004, the ninetieth anniversary of the outbreak of war. But it has been a highly contested site, with continuing arguments about how far to commemorate “all the defenders of Russia who fell during wars for the Fatherland,” including the “Whites” who fought against the Reds in the Civil War. Such disputes reflect the underlying controversy in post-Soviet Russia over whether the Bolshevik Revolution was a triumph or a disaster.16

In parts of eastern Europe, the 1990s also saw the collapse of the post-1918 settlement. Yugoslavia, the contorted South Slav state forged by the Serbs after the Great War, had been held together as a federal polity during the Cold War largely by the will and skill of Josip Broz Tito, half Croat and half Slovene. But economic and ethnic strains intensified after his death in 1980, and the end of communist rule in 1989 precipitated total dissolution. Slovenia seceded quickly in 1991 and Croatia the following year, but the struggles in Bosnia-Herzegovina (1992–95) and Kosovo (1998–99) saw ethnic violence comparable to that of eastern Europe after the Great War. Sarajevo, infamous as the trigger of the July crisis in 1914, gained new notoriety during the Bosnian war for one of the longest sieges in modern history. Czechoslovakia, fabricated by Tomáš Masaryk in 1918, was another casualty of the end of the communist era. Tensions between the Czech lands and Slovakia had been recurrent during the country’s troubled history, and they reached a crisis point after communist rule collapsed. Czechoslovakia ceased to exist on New Year’s Eve 1992. Its breakup was far from harmonious, but compared with the horrors of Yugoslavia it was aptly described as a “velvet divorce” to follow the country’s “velvet revolution” of 1989.

Along the Soviet borderlands, however, the dynamic of the 1990s was rather different—a return to the post-1918 order rather than its rejection. The Baltic states of Estonia, Latvia, and Lithuania had a particularly checkered twentieth-century history. Although they had been under Russian control since the eighteenth century, the Great War was their “freedom moment”: each gained independence in 1920 after savage battles with both Germans and Russians. But then they were gobbled up again in 1940 by the Soviet Union under the Nazi-Soviet pact, before being conquered by Nazi Germany in 1941 and then recaptured in 1944 by the Red Army. During this “double occupation” local people fought for both sides, and some collaborated in the Nazi extermination of the Jews. When anti-Soviet protest escalated in the late 1980s, Estonia, Latvia, and Lithuania took the lead—the human chain across all three states to mark the fiftieth anniversary of the Nazi-Soviet pact in August 1989 being a graphic example. In 1991 Ukraine also gained its independence from the USSR but, unlike the Baltics, it had never enjoyed freedom between the world wars. The short-lived Ukrainian People’s Republic had desperately sought international recognition at the Paris peace conference, only to be partitioned between Poland and the Soviet Union in 1921. During the Second World War thousands of Ukrainian partisans fought alongside the Germans against the Red Army before the region was again brought under Soviet control in 1943–44.

As Western historians began to recognize in the 1990s, the Baltic states and Ukraine had been the “shatter zones” of twentieth-century Europe, where Germany and Russia kept colliding. They were the “bloodlands” where ethnic conflict, political brutalization, and paramilitary violence had been especially ferocious during and after the two world wars.17

Coming to terms with such tangled and painful histories was not going to be easy. In the West, especially America, it seemed that the thawing of communist repression had simply revived historic nationalisms frozen during the Cold War—confirming a rather Wilsonian sense of the perpetually feuding Old World. Popular books gave the impression that, especially in the Balkans, ancient “ghosts” were emerging from the historical closet, from “a time-capsule world,” a “dim stage upon which people raged, spilled blood, experienced visions and ecstasies.” It was claimed that we were witnessing a “Rebirth of History” across eastern Europe: “it has emerged dramatically from its artificial hibernation after forty years, and it has much to catch up on.”18 But “history” was not an autonomous force: it was being exploited by contemporary politicians for their own ends. The most egregious example was Slobodan Milošević in Yugoslavia. Seeking a new patriotic legitimation for his leadership when communism collapsed, Milošević revived old Serbian folk-memories, especially of the battle of Kosovo Polje against the Muslim Turks in 1389—parading the coffin of the defeated hero, Prince Lazar, through every town and village in Serbia to arouse feeling ahead of the six-hundredth anniversary in 1989. Just as in the era of Masaryk and Piłsudski, nationalism was being whipped up by nationalists as much as the other way around.

The so-called “memory wars” across former communist Europe were an extension of this process, as political groups used rival versions of the past to critique the present and shape the future. In the Baltic states, public monuments became particularly contentious. The “Bronze Soldier” in the center of Tallinn, erected in 1947 to honor the Soviet “liberators” of Estonia from Nazi rule, was a focus for riots in 2007, prompting the government to move it to a military cemetery on the outskirts of the city. To mark the “true” liberation of the country, a War of Independence Victory Column was raised in Tallinn’s Freedom Square in 2009 to honor the 4,000 Estonian dead from the struggle against Russia in 1918–20. This finally realized a project planned in 1919 and started in the mid-1930s but then suppressed during the Soviet years. Yet such assertions of a nationalist narrative going back to the First World War are deeply contentious. Estonia’s Russian community, a quarter of the population, sees this “war of monuments” as an identity issue about its place in society—one that marginalizes Russians in a newly nationalist state. This is a pattern that recurs right across the ethnically diverse states of post-Soviet eastern Europe and recalls the struggles of the 1920s and 1930s.19

Nationalist insistence across eastern Europe that the Soviet regime was on a par with that of the Nazis has also proved contentious because it questions the centrality of the Holocaust for western European memorialization of the twentieth century: “Whoever says memory, says Shoah,” to quote the aphorism of Pierre Nora.20 For the EU, by the new millennium the Holocaust had become a vital element of “being European” in an era when fear of communism was no longer a unifying force. The Jewish genocide was represented “as an absolute moral evil against which to define those values—tolerance and diversity—that were seen as essential characteristics of modern western civilization.” During the 1990s the EU encouraged member states to adopt the anniversary of the liberation of Auschwitz, January 27, as Holocaust Memorial Day. Endorsing the proposal in Britain in 1999, Prime Minister Tony Blair avowed his determination to “ensure that the horrendous crimes against humanity committed during the Holocaust are never forgotten”—citing the recent wave of “ethnic cleansing” in Kosovo as “a stark example of the need for vigilance.”21

In America the “universalization” of the Holocaust was fostered by Schindler’s List—a blockbuster 1993 movie from Steven Spielberg, which won seven Academy Awards. This reduced the tangled complexities to a morality play of good versus evil, embodied in two men—a German Nazi who helps Jews to escape and a sadistic SS camp commandant. Even more important was the opening that same year of the Holocaust Museum in Washington, DC. This project had been a long-standing goal of Jewish organizations, but its realization involved a deliberate “Americanization” of the Holocaust, which “allocated the Jews a privileged role as victims” while giving Americans “a privileged role as witness by emphasizing the moral failures of passive by-standerdom” on the part of Europeans. The success of this pioneering venture in the nation’s capital spawned similar museums or memorials in most major American cities. And so the Holocaust came to be regarded in the West as “unique with reference to the past and universal for the future”—in other words “the Holocaust past is something that happened predominantly to the Jews, while the Holocaust future might happen to anyone.”22

The putative uniqueness of the historical Holocaust was, however, rejected by many in post-Soviet eastern Europe, who asserted the moral equivalence of Nazi and Soviet terror. They questioned the distinction made in 2002 by one American historian of Jewish descent between the “hot” memory of fascist crimes, still a burning issue, and what he considered the increasingly “cold” memory of communist iniquities, whose embers were dying down as the Soviet era receded into history.23 Instead, the 2008 Prague Declaration on European Conscience and Communism, promoted by the Czechs but widely supported across eastern Europe, demanded “recognition that many crimes committed in the name of Communism should be assessed as crimes against humanity serving as a warning for future generations, in the same way Nazi crimes were assessed by the Nuremberg Tribunal.” This call was taken up by the European Parliament. It was, however, denounced by Russia and various Jewish groups, who pointed out that many nationalists in Ukraine and the Baltics had collaborated in Nazi killings of the Jews—a story rarely commemorated in the new national museums. This fraught debate served to direct renewed attention to the legacies of the Great War in the “bloodlands” of eastern Europe, where perhaps “fourteen million people were deliberately murdered by two regimes over twelve years” between 1933 and 1945, at least a third of them by the Soviets through starvation and shootings. Belatedly recognizing this crime and remembering its victims was a major demand in twenty-first-century eastern Europe, one that Holocaust memorialization was not allowed to overshadow. As the Polish scholar Maria Janion said when her country joined the EU: “To Europe, yes, but with our dead.”24

In eastern Europe, as in Russia, the end of Soviet repression had finally unleashed real historical debate. And this was a region with many skeletons in the closet—relics of its role as the prime battlefield in two world wars, of the double Nazi-Soviet occupation, and of the Holocaust, reaching right back to the bloody tangle of nationalism and revolution in 1917–18. After 1989 the closet was ransacked and its contents appropriated selectively by rival political and ethnic groups. The bitter, chaotic “memory wars” that ensued were a far cry from the steady, layered process of reflection and then refraction that had characterized Great War remembrance in Britain since 1918.

Beyond the Russo-German borderlands, in areas where the Great War was not so contentious for current politics, the post–Cold War era opened opportunities to move beyond narrowly nationalist history toward reconciliation. A few examples are particularly striking.

A moving example in east-central Europe is the First World War Museum at Kobarid, better known in Britain and America as Caporetto, scene of the great rout of the Italian Army in October 1917. The battlefield now lies in Slovenia, an ironic commentary on the twelve futile battles along the Isonzo River in which hundreds of thousands of Italians died. The Alpine foothills are still littered with the debris of the Great War and this museum began in 1990 as a project by enthusiastic local collectors. Slovenia’s early entry into the EU enabled them to secure European money to develop a small but significant international museum, with text in four languages (Slovenian, Italian, English, and German). The declared intent is not to score national points but to document the suffering of the soldiers on all sides during twenty-nine months of fighting, paying special attention in displays and film to the climactic battle in October 1917. Kobarid “is not a museum about victory and glory . . . about conquest and revenge, about revanchism and national pride,” the guidebook explains. “It is men that are at the forefront, the men who—aloud or silently to themselves, for themselves or for their fellow sufferers—in various languages of the world endlessly shouted: ‘Damn all war!’”25

The most sophisticated of these projects of transnational memorialization was the Historial de la Grande Guerre at Péronne—a joint French, German, and British commemoration of the Somme in a new purpose-built museum next to the medieval castle used by the Germans as their headquarters during the battle. The Historial had complicated origins. It was in part a product of the 1980s passion for family history, the brainchild of Max Lejeune, a powerful regional politician whose father had fought on the Somme in 1916. His father returned home a broken man and a difficult parent, and by the 1980s Lejeune wished to come to terms in a practical way with the shadow the Somme had cast over his own life. His political clout secured government funding: at Péronne, as at Kobarid, the prospect of battlefield tourism rejuvenating an economically depressed area was an added incentive.

So Lejeune’s idea for a museum “originated in family history, his family history,” observes Professor Jay Winter, one of the academics who helped establish Péronne, but Lejeune’s particular vision was “seeing that such a museum was a way of turning national narratives into family narratives, resonant to a very wide public of several nationalities.” His conception caught the mood of European cooperation in the late 1980s, especially after Germany was reunified: Péronne opened in 1992, the year of the Maastricht Treaty. By stressing the French role in the battle of the Somme as well as the extent of the British losses, the museum challenged the rooted national paradigms for 1916—the French transfixed by Verdun, the British mired in the Somme. The portmanteau “Historial” was intended to convey a blend of history and memorial and, at Winter’s insistence, the museum included a research center to promote scholarship and conferences. So the project reflected the growing transnational cooperation among scholars of the Great War.26

The museum’s internal design set precedents in many ways. It was a genuinely tri-national presentation, taking equally seriously the French, British, and German stories, with text in all three languages. The way the objects were displayed was also distinctive, largely in the form of fosses or shallow rectangular pits in the floor to evoke the trenches inhabited by all three troglodyte armies. But the exhibits paid as much attention to civilians as to soldiers, again breaking new ground. Visitors move from a room depicting prewar origins, through the deepening war of 1914–16, and then into total war in 1916–18. Yet there is a strange, gaping hole at the center of the museum—the battle of the Somme itself. Quite deliberately no attempt was made to explain or even chronicle it as a historical event, unlike the extensive examination of the buildup to the July crisis. Instead a blank white wall between the 1914–16 and 1916–18 rooms was intended to convey “the impossibility of representing battle in a direct, figurative manner” or of “conveying the physical and moral suffering of the soldiers”: all this, we are told, is “the realm of the ineffable.” Accepting the “invisibility” of the battle’s reality, a specially made film screens a montage of contemporary images, documents, and sounds, often using a medieval triptych form. Most of the material comes from soldiers, though it is interspersed periodically with official communiqués such as Haig’s dispatch of December 23, 1916, stating that the aims of the battle had been achieved. The overall intention was to convey how the Somme “was perceived by those involved in it,” leaving the viewer to respond, but the impression strongly conveyed is one of an indescribable human tragedy. That impression is prefigured earlier in the museum by extensive use of Otto Dix etchings to summon up the bestiality of war and then underlined by the final room, which seeks to “show that the First World War was the great catastrophe determining the flow of the entire century.” So, although the Historial was innovative in both form and presentation, its content conveyed what had become familiar themes about 1914–18 as the modern Urkatastrophe expressed, in this case almost metaphysically, by the Somme.27

Moving outside Europe, the evolution of Anzac Day in Australia since the 1990s has been a striking hybrid of nationalism and reconciliation. Successive governments have continued to foster public interest, with the efforts of Labor leaders Bob Hawke and Paul Keating in the late 1980s and early 1990s given new momentum by John Howard, leader of the Liberal-National Government from 1996 to 2007. For Howard, Anzac Day was about celebration as much as commemoration—as he put it in 2003, “the celebration of some wonderful values, of courage, of valour, of mateship, of decency, of a willingness as a nation to do the right thing, whatever the cost.” Such Australian values, he argued, were as essential in the War on Terror after 9/11 as they had been in past struggles against dictators.28 Howard’s government provided generous funding for the Department of Veterans’ Affairs to develop its educational section and promote Anzac Day through resource packs for schools. In collaboration with the Australian War Memorial, the DVA financed completion of the Nominal Rolls of all Australians who served in war, making them available online. These databases, with the records of some 300,000 personnel from 1914–18 and more than a million from 1939–45, are an invaluable resource for scholars and genealogists. But critics argue that this encourages a “militarization” of family history, because records of Australians in peacetime are far less accessible, and indeed the “militarization” of Australian history as a whole, featuring twentieth-century foreign wars as the making of the nation in order to distract attention from the first century of white settlement and native dispossession.29

More than for New Zealanders, their partners in the original Australian and New Zealand Army Corps, April 25 became Australia’s national day. Attendance at the dawn service at the War Memorial in Canberra rose from a mere 2,000 in 1977 and 6,000 in 1989 to 12,000 the following year, the seventy-fifth anniversary of the Gallipoli landings, when the venue had to be moved from the forecourt to the more spacious esplanade. By 2007 the total was estimated at 28,000.30 There was also a surge in “pilgrimages” to the Dardanelles, now amounting to more than 60,000 Australians a year. Some critics have decried the pilgrimages as “sentimental nationalism,” arguing that if Aussies find Gallipoli “charged with meanings” it is because those meanings were “made in Australia and unpacked in Turkey” rather than being embedded in the local landscape: “pilgrims carry the sacred with them” rather than discovering it at their destination.31

Yet despite the strident and sometimes crass patriotism, there is now a genuinely transnational dimension to Australian remembrance. Since Prime Minister Bob Hawke’s pioneering visit in 1990, every Anzac Day service at Gallipoli has involved Turkish government participation. Australian memorialization now recognizes the importance of the campaign for the Turks, who were, after all, fighting to repel invaders. What they call the Battle of Çanakkale (the town on the Turkish side of the Dardanelles) is celebrated as a great victory, one that also helped make the Ottoman commander Mustafa Kemal into Atatürk (“Father of the Turks”) and architect of modern Turkey. In the words of one recent transnational history: “For the British, French, Canadians, Indians and Germans, the Gallipoli campaign is remembered as just another name in a long, tragic list of World War I battles. For Turks, Australians and New Zealanders, Gallipoli is something apart—a significant event in the self-development of their individual nations.”32

This more inclusive view of Gallipoli owed a good deal to domestic pressure from Australia’s Turkish community—a tiny fraction of the post-1945 Turkish diaspora compared with its presence in West Germany, but still politically significant in cities such as Melbourne. Arriving in growing numbers from 1968 under Australia’s assisted migration scheme, Turks often bristled at a national commemoration that cast them as the primal enemy. Their first attempts to join the Melbourne Anzac Day parade were resisted by veterans’ leaders, one of whom warned: “Anyone that was shooting at us doesn’t get in.” But attitudes gradually changed, and Turks now march every year in most of the major parades on April 25. In 1985 the Turkish government renamed Ari Burnu Beach, where Kemal commanded, as Anzac Cove; in response an Atatürk Memorial Garden was created in Canberra, just across the road from the Australian War Memorial—all signs of a new approach to remembrance as reconciliation.33

Where do the “British Isles” fit into this story? How far have Britain and Ireland been affected by post–Cold War reconfigurations of the Great War and of twentieth-century history? In terms of national identity, the effect has been very profound, as the huge forces of attraction and division generated in 1914–18 finally began to weaken.

In the summer of 1914 the United Kingdom seemed close to disintegration—on the verge of civil war in Ireland but also challenged by campaigns for Home Rule in Scotland and for disestablishment of the Anglican Church in Wales. Yet as we have seen, the Great War kindled a new feeling of Britishness in England, Scotland, and Wales, but it split Ireland into two rival states—one reliant on Britain to safeguard its Protestant identity, the other winning independence only after a brutal war against the British and an even more savage internecine conflict. The events of 1914–18 redefined both the United Kingdom and Ireland for most of the twentieth century and it was only in the 1990s that what we might call the Great War settlement finally came apart.

It may seem ironic that while Scots and Welsh regiments were “fighting for the rights of small nations, the cause of Home Rule was one of the casualties of the First World War.”34 But general pride at Britain’s victory in the Great War, coupled with numerous memorials to its human cost, fostered a new acceptance of British identity. Although Plaid Cymru and the Scottish National Party (SNP) were both founded in the interwar years, they had limited impact, and the sense of Britishness was reinvigorated by the Second World War. The Scots and the Welsh shared in the national narrative about Britain’s “finest hour,” perpetuated in films through the 1950s. This was a period of strong economic growth, in contrast to the interwar slump that had hit Scotland and Wales especially hard. The interventionist economics of both Labour and Conservative governments for a quarter century after 1945 also made the Union seem directly beneficial through a nexus of state subsidies, welfare benefits, and public housing. A third of the Scottish workforce was employed in local or central government as late as the 1980s. Even rural areas benefited: by the 1950s the Forestry Commission had become Scotland’s largest landowner. Not until the 1960s and 1970s, when Britain’s defeated rivals Germany and Japan bounced back as economic powers, did the war dividend run out both economically and psychologically. The Scottish and Welsh economies—dependent on heavy industries such as coal, steel, and shipbuilding that had been sustained by Clement Attlee’s nationalization—became seriously uncompetitive. In this harsher climate nationalist politics had more appeal: in 1967–68 the SNP finally won a seat at Westminster and Plaid Cymru dramatically cut Labour majorities in hitherto safe constituencies.35

The nationalist resurgence took different forms in the two countries, however. In Wales the dominant theme was culture, especially the survival of the Welsh language. In 1900 more than half the population spoke Welsh; by the 1960s barely a quarter did. But the Welsh Language Act of 1967 gave the language equal official status with English. Nationalist feeling in Wales was mainly concerned with “the preservation of a disappearing way of life,” whereas Scottish nationalism was more aggressively about “building on to recognized institutions new ways of asserting distinctiveness from England.” The separate systems of law and education that had survived from 1707 were important foundations.36

It was in Scotland that pressures for devolution became particularly insistent, aided by the rapid demise of the British Empire in which the Scottish contribution in manpower, finance, and trade had been hugely disproportionate to the country’s size and population. The tartan-clad Scottish regiments, now being steadily disbanded, had enjoyed “unchallenged prominence in Scottish society as symbols of national self-image.”37 Anxious to head off the SNP, in 1979 a weak Labour government arranged a referendum on devolution in Scotland and Wales, which failed to win the necessary majorities. But the Thatcher government of the 1980s, with its centralizing tendencies, sale of nationalized industries, and drastic cuts in public spending, virtually killed off Scottish Conservatism and persuaded many Scots that the Union as currently constituted was no longer working to their advantage. Her manner did not help: in the words of one Scottish Tory, “the problem that Margaret had was that she was a woman; an English woman and a bossy English woman.” Thatcher’s reform of local taxation (the notorious “Poll Tax”) was the last straw: Tory support north of the border collapsed from 22 seats in 1979 to none by 1997. On a broader plane, the changing international scene by the 1990s also affected attitudes. With not only two world wars but now the Cold War receding into history, the UK had lost a clear Other, an external enemy to “help to sustain British national identity against a common foe.”38

When the Labour government of Tony Blair offered new referenda on devolution in 1997, Scots voted decisively in favor, while in Wales devolution passed by only a bare majority. Nevertheless the new executives and elected assemblies established in Edinburgh and Cardiff in 1999 gradually acquired more and more devolved powers from Westminster. In Scotland the SNP, in office from 2007, maneuvered its way to holding a full-scale referendum on independence in 2014. This was a year with special resonance for Scottish nationalists—exactly seven centuries since the great victory over the English at Bannockburn. But 2014 also marked the centenary of the outbreak of the Great War. The recent devolution-independence debate is a reminder of how 1914 helped to freeze British constitutional development for much of the twentieth century.

If Britain has been revisiting debates from before the Great War, Ireland in the 1990s finally began to move beyond its great divide of 1916. That divide had been entrenched more deeply in 1966 by rival fiftieth-anniversary commemorations of the Easter Rising and the first day of the Somme, which were catalysts for “The Troubles.” For Irish nationalists in the north and most people in the Irish Republic, the Great War as a whole had become a closed book—the service of Irish Catholic soldiers largely forgotten. The 1914–18 Memorial at Islandbridge, on the edge of Dublin, was closed during most of the Troubles for fear of violence. And in 1987 the IRA deliberately chose Remembrance Sunday to blow up the war memorial at Enniskillen, in Northern Ireland, killing eleven people.

But attitudes changed dramatically during the 1990s. The new post–Cold War interest in the Great War as what historian John Horne called “the seminal event in the cycle of violence and ideological extremism that marked the twentieth century” directed attention to Ireland’s place in that larger story.39 Even more important, the intense efforts of John Major and Tony Blair to promote the peace process in Northern Ireland and involve the Irish government gradually paid off, culminating in the Good Friday Agreement of 1998. Not only did this ease communal tensions and provide a framework for loyalists and republicans to work together in a new devolved government, it also removed British troops from the streets of Ulster, allowing the past involvement of Irishmen in Britain’s wars to reemerge less contentiously. Some community leaders in Belfast, recognizing that rival versions of history had been a root cause of the sectarian divide, tried to retrieve the Western Front as a common site of memory. The growing passion for family history often provided a point of entry: meetings to discuss photos and memorabilia from an ancestor’s war service helped develop contacts and networks that would have been unimaginable during the Troubles. The Connaught Rangers proved particularly useful for this purpose because the regiment, lacking the word “Royal” in its title, was less problematic for Catholics and nationalists.40

This theme of remembrance as reconciliation was picked up officially in the Island of Ireland Peace Tower in Belgium. The chosen site was near Mesen/Messines, where the 36th (Ulster) Division and the 16th (Irish) Division went into battle almost alongside each other in June 1917. The 110-foot tower was a public acknowledgment that Protestants and Catholics, loyalists and nationalists, had fought together as volunteers in the British Army in 1914–18—more than 210,000 in all, of whom some 25,000 lost their lives.41 Previous monuments, notably the 1921 Ulster Tower on the Somme, had been effectively loyalist memorials. The Peace Tower was dedicated by President Mary McAleese and Queen Elizabeth II on November 11, 1998, after an eleven a.m. remembrance service. This was the first time the heads of state of Ireland and the United Kingdom had appeared together in a public ceremony (plates 26–28).

The tower and surrounding Peace Park were initiated as a project in community reconciliation by Paddy Harte, an Irish Fine Gael politician, and Glenn Barr, a former loyalist paramilitary from Belfast. The “Peace Pledge” on a stone in the park declared: “From this sacred shrine of remembrance, where soldiers of all nationalities, creeds and political allegiances were united in death, we appeal to all people in Ireland to help build a peaceful and tolerant society. Let us remember the solidarity and trust that developed between Protestant and Catholic soldiers when they served together in these trenches.” This new emphasis on “equality of sacrifice” was somewhat contrived. Although some Irishmen who fought in the war, such as Tom Kettle, did hope that common service in the trenches might bridge the sectarian divide, probably “most Irish soldiers could not have cared one way or the other.” But their intentions no longer mattered. The Irish dead of the conflict have “been conscripted (as the living Irish of the war years never were) to serve in a very political, if well-meaning, project of mutual communal understanding and reconciliation.”42

Although the United Kingdom was moving on in complex ways from the Great War settlement, British images of the war itself continued to be those established in the 1960s and 1970s, much to the dismay of some military historians, who complained that there were virtually two Western Fronts, the literary and the historical, each self-contained, with the former still dominating the public imagination. These historians countered that the Western Front, although “a place of horror and violence,” was also “a place of learning and technological advance” which “ultimately marked the greatest military victory—at least in terms of scale—in British history.”43

The most significant of these revisionist works, Forgotten Victory by Gary Sheffield, published in 2001, asserted that “the First World War was a tragic conflict, but it was neither futile nor meaningless. Just as in the struggles against Napoleon and, later, Hitler, it was a war that Britain had to fight and had to win,” another round in “a long struggle to prevent one continental state from dominating the rest.” As for the cliché that the British Army were “lions led by donkeys,” Sheffield argued that “against a background of revolutionary changes in the nature of war, the British army underwent a bloody learning curve and emerged as a formidable fighting force.” Situating the first day of the Somme within that process as “an important point” on the “learning curve,” he highlighted the improvement in operational effectiveness, built around a precise and effective creeping barrage, flexible infantry tactics, and all-arms cooperation, which reached its apogee in the last “Hundred Days” of 1918. That phrase, redolent of 1815, was intended to shift attention from clichéd moments of 1918 such as the near rout on March 21 or Wilfred Owen’s death a week before the Armistice. Sheffield insisted that in the autumn of 1918 Haig’s army—the largest ever deployed in battle by the British Empire—achieved “by far the greatest victory in British military history.” Although the psychological impact of fresh American troops was immense, Sheffield questioned their effect on the battles of 1918 and blamed their heavy losses on crude, gung-ho infantry tactics reminiscent of the British on the Somme in 1916. In other words, the Doughboys of 1918 were way back on the learning curve.44

The thesis of Forgotten Victory was echoed by other military historians, for instance William Philpott in his massive study of the Somme, pointedly entitled Bloody Victory (2009), which surveyed the five-month battle in its entirety and from the perspective of the Germans as much as the British and French. For Philpott the attrition of Germany on the Somme was “the military turning-point of the war,” even though the denouement came only two years later. This battle was, he argued, the equivalent of Stalingrad in the Second World War, where the appalling human cost has never been used to deny the fact of victory. Why, then, the British resistance to a similar proposition about the Somme? Partly because at Stalingrad the Germans were clearly defeated, indeed humiliated, whereas nothing so dramatic was evident by the time the Somme battle petered out. Also because the dead of 1942–43 were Russians, whereas in 1916 they were British, from a nation totally unused to attritional war on that scale. The term “learning curve,” borrowed from business psychology, sticks in the throat of many people in Britain because the curve was lubricated so plentifully with soldiers’ blood. The intent of historians such as Sheffield and Philpott was to rescue the British Army from the mud, both literally and metaphorically. They downplayed the contrary argument, propounded decades earlier by Basil Liddell Hart, that it was the maritime blockade, felt on the Western Front and the home front alike, that starved Germany into submission. They also had difficulty addressing the fact that the eventual “victory” was far less clear-cut in 1918 than in 1945. The best Sheffield could claim was that the Great War produced “negative gains”—in other words, stopping something worse from happening, namely German domination of the Continent, even though there had to be a second round, at greater human cost, in 1939–45.45

The revisionists shifted the terms of debate among specialists, but they did not alter public perceptions of the Great War. This is evident from three best-selling histories of the war, published around the eightieth anniversary.

John Keegan had certainly not changed his mind since writing The Face of Battle in 1976. In The First World War (1998) he caustically dismissed the “learning curve” as “an argument akin to the thought that Dunkirk was a valuable rehearsal in amphibious operations for D-Day.” For Keegan the technology of warfare in 1914–18, however much generals refined the tactics, simply added up to mass slaughter. “Only a very different technology” based on tanks and aircraft, which was not available till a generation later, could have averted such an outcome. Keegan regarded the First World War as “a tragic and unnecessary conflict”—unnecessary because better diplomacy could have arrested the slide to war in 1914, and tragic because of the ten million dead and the poisonous legacies which led to a Second World War that was “the direct outcome of the First.” By the final page of his book Keegan had moved beyond the realm of historical explanation, asserting that the First World War was “a mystery,” both in its origins and its course. “Why,” he asked almost imploringly, “did a prosperous continent, at the height of its success . . . choose to risk all it had won for itself and all it offered to the world on the lottery of a vicious and local internecine conflict?” The only positive Keegan could discern was another “mystery”—the tenacious courage of the ordinary soldiers and the comradeship forged in what he called “the earthwork cities of the Western and Eastern Fronts.”46

Here was the now familiar post-1960s British narrative repackaged for the latest anniversary. Keegan’s book was not a work of original research, and many of its sources were somewhat dated. By contrast Niall Ferguson based his eightieth-anniversary offering The Pity of War (1998) on substantial research by a team of assistants in British and German archives and on a wide range of recent books and journal articles. The result was a 600-page tome, detailed and analytical yet highly readable and full of provocative arguments. An economic historian by training, Ferguson calculated that despite their substantial superiority in resources, Britain, America, and France waged war far less effectively than their enemies. In accountancy terms it cost the Allies $36,485 to kill an enemy soldier, more than three times the per capita cost for the Central Powers, who also “killed at least 35 per cent more men than they lost.” So the learning curve was appallingly expensive in cash as well as corpses. Intrigued by counterfactual history and viewing 1914–18 from the 1990s vantage point of German reunification and the impending euro, Ferguson also offered the tendentious opinion that, if the British had not gone to war in 1914, Germany would have won but both Britain and Europe would have been better off: “it would have been infinitely preferable if Germany could have achieved its hegemonic position on the continent without two world wars.” The kaiser’s Reich, he insisted, was not like Hitler’s: it was driven by insecurity and weakness rather than the lust for power depicted by Fritz Fischer. Had Britain stood aside in 1914, exclaimed Ferguson, his imagination surging into overdrive, “Hitler could have eked out his living as a mediocre postcard painter” and “Lenin could have carried on his splenetic scribbling in Zurich.” Meanwhile, continental Europe could have been “transformed into something not wholly unlike the European Union we know today—but without the massive contraction in British overseas power entailed by fighting two world wars.”47

Most of this was pure speculation but, like A. J. P. Taylor, Ferguson liked to stir things up: this was the Great War recooked in a hot sauce to offend traditional British palates. But for all its piquant novelties, at root The Pity of War was a conventional view of the conflict, pivoting around the Western Front and informed by its poetry. The title came from Wilfred Owen, with whom the book began and ended. For Ferguson the war was indeed “piteous,” but he would not invoke the oft-used term “tragedy” because that had Shakespearean undertones of inevitability. No, he concluded, it was “nothing less than the greatest error of modern history.”48

The third major anniversary offering was 1914–18: The Great War and the Shaping of the Twentieth Century (1996) by Jay Winter and Blaine Baggett—a book but also a major TV series, shown in America and Britain. Thirty years on, this was a very different spectacle from the BBC’s Great War of 1964. Winter viewed the conflict as “cultural history”: he wanted to explore how leaders and led “made sense of the war and its consequences” via images, language, and artistic forms. Baggett, a TV producer, had been inspired by reading Paul Fussell’s The Great War and Modern Memory just as the Cold War ended—what interested him, like Winter, were legacies as much as 1914–18 itself. Personal factors also played a part in shaping the series: both authors were Americans whose attitudes toward war and its futility were colored by Vietnam, while Winter was a descendant of survivors from the Nazi death camps. For him, studying “what contemporaries of the time called the Great War” was “as close to the great horror of the twentieth century”—the Holocaust—as he could “bear to stand.” In the interpretation of Winter and Baggett the First World War started “a descent into darkness” which “normalized collective violence” as the “signature” of the twentieth century from Sarajevo in 1914 to Sarajevo in 1994. It was also the progenitor of “an industrial killing machine” that reached hellish perfection at Auschwitz.49

The television series was a co-production between the Public Broadcasting Service (PBS) in America and the BBC in Britain. Reactions to it in the two countries were strikingly different. In America, where PBS is very much a minority interest, the response was huge and overwhelmingly positive: the series was watched by about five million households, extensively reviewed in major newspapers and periodicals, and won coveted Emmy and Peabody Awards. For many Americans—whose interest in their own Civil War had been kindled by Ken Burns’s immensely successful PBS series in 1990–91—this was their first serious exposure to Europe’s “civil war” of 1914–18. But in Britain, although audiences were relatively large (averaging 2.5 million), the response was more mixed. Not only was this for the British an unfamiliar take on the Great War, with much continental material and a cultural bias, but its relentless sense of futility offended military historians such as John Terraine, Haig’s great apologist. Winter had to fight very hard to keep the title “Slaughter” for the film about the Somme, Verdun, and Passchendaele. Cautious figures in the BBC urged “Sacrifice,” but he was insistent: “Sacrifices redeem; slaughter does not” and “the loss of life of three quarters of a million men had no redemptive ‘meaning.’ ” Correlli Barnett, like Terraine a consultant for the BBC’s Great War, was particularly incensed. In an article entitled “Oh What a Whingeing War” he lamented the spotty discussion of strategy and politics, without which, he argued, the fighting was bound to lack meaning. Barnett became so irritated by Jay Winter “honking away in academic’s American” and by “the cocksureness of his expression and pronouncements” that, he finally declared, “I longed to hang one on his hooter.”50

These big new histories of the war—presenting it as tragic “mystery,” supreme “error,” or horrific “slaughter”—served to confirm the familiar British narrative and attracted far more attention than the revisionist views of military historians. But the main way the conflict came alive for a general audience was still through literature. And in the 1990s the war poets were enfolded for the first time into popular fiction.

Novelist Pat Barker wrote a trilogy—Regeneration (1991), The Eye in the Door (1993), and The Ghost Road (1995)—around the story of Siegfried Sassoon and Wilfred Owen being treated for “shell-shock” in the Craiglockhart Hospital in Edinburgh. The first novel opens with the public declaration of protest in July 1917 by Sassoon, a veteran who has won the Military Cross, against a conflict “which I entered as a war of defence and liberation” but which “has now become a war of aggression and conquest . . . deliberately prolonged by those who have the power to end it.” Barker’s protagonist is the neurologist Dr. William Rivers, whose task is to “cure” shell-shocked men so that they can get back to the front. The trilogy revolves around the themes of madness—who exactly is sane in this situation of nightmare war?—and comradeship, not least the friendship between Sassoon and Owen that transforms the stuttering talents of the latter into a major poetic voice able to articulate for all time war’s insanity. Although Sassoon, Owen, and Rivers are historical characters, Barker creates Billy Prior, a rather anachronistic bisexual working-class junior officer, through whom she explores the issues of wartime homosexuality and the class structure of Britain and its army.51

Prior’s exploits in London in 1917–18 take center stage in the second novel, but The Ghost Road returns to the war during its final months. The novel’s title comes from lines by the poet Edward Thomas, quoted in the epigraph to this chapter, about the living soldiers treading heavily along the road to France, while “the dead / Returning lightly dance.” Prior has been sent back to the front and is serving in the 2nd Manchester Regiment, the same unit as Owen, whom he fancies. Both men are now inured to death and killing—to machine-gunning Germans “like killing fish in a bucket” and being splattered with the blood and brains of comrades. “We are Craiglockhart’s success stories,” Prior scribbles sardonically in his diary on October 6, 1918. “By any civilized standards (but what does that mean now?) we are objects of horror. But our nerves are completely steady and we are still alive.”52

Only for a few more weeks, however. Although armistice negotiations have become common knowledge (“nobody here sees the point of going on now”), the 2nd Manchester is thrown into a canal crossing, over sodden, open ground covered by enemy machine guns. “The whole operation’s insane,” one decorated officer exclaims. “The chances of success are zero.” But they are “told flatly, a simple, unsupported assertion, that the weight of the artillery would overcome all opposition.” Just in case the allusion is not clear, Barker has Prior note in his diary: “I think those words sent a chill down the spine of every man there who remembered the Somme.” (No learning curve here.) As Prior and Owen breathe their last on the Sambre Canal, Rivers, at a hospital back in London, endures the final moments of one of their comrades, a young officer who has had half his face blown away. “Shotvarfet,” the lad keeps crying. “What’s he saying?” asks his anguished father, a retired officer and till now a staunch, unthinking patriot. Suddenly Rivers realizes: “It’s not worth it.” Just as he does so, the cry is picked up around the ward, “a wordless murmur from damaged brains and drooping mouths. Shotvarfet. Shotvarfet.” This, fumed military historian Brian Bond, “is the authentic whingeing note of the 1990s transposed unconvincingly to 1918.”53

The other bestselling Great War novel of the 1990s was Sebastian Faulks’s Birdsong (1993). The war had been rubbing away at the back of his mind for years: on November 11, 1965, as a twelve-year-old, he grew hoarse reading out the almost endless list of names of “old boys” from his school who had died in two world wars; and in November 1988, as a journalist, he covered a tour organized by Lyn Macdonald along the Western Front, where old men talked of lost friends at their graves in immaculate English-garden cemeteries. Faulks began to feel that “the experience of this war had somehow slipped from public understanding,” overshadowed by “a second frenzy” twenty years later, “one aspect of which had been so well memorialized at the insistence of its victims that it seemed to leave no room in the public memory for earlier holocausts.”54

Birdsong tells the story of Stephen Wraysford, a battle-hardened junior officer who has become, like Owen and Prior in the Regeneration trilogy, almost addicted to the war that he hates. The centerpiece of the novel’s war narrative is what Faulks calls “the most infamous day in British military history,” July 1, 1916. His thirty-page account, which drew on Middlebrook’s First Day of the Somme, is told with familiar tropes such as the “comic opera” colonel assuring the men that “the enemy will be utterly demoralized” by the artillery barrage and that “only a handful of shots will be fired at you.” But also with vividly imaginative writing, such as Faulks’s evocation of darkness finally descending upon the battlefield: “The earth began to move. . . . It was like a resurrection in a cemetery twelve miles long” as the “bent, agonized shapes” of the wounded crawled back into their trenches “to reclaim their life.”55

Our sense of the soldiers’ troglodyte existence is accentuated by Faulks’s subplot—about the “sewer rats” who tunnel below no-man’s-land to explode mines under the enemy trenches, while the Germans try to do the same to them. Tunneling, declares Faulks, constituted “a hell within a hell” and he captures vividly its claustrophobia. Near the end of the book, set in the closing hours of the war, Stephen is freed from incarceration in a ruined tunnel, only to see his rescuer clothed in feldgrau, “the colour of his darkest dreams.” Wild-eyed, he raises his arms, ready to fight, and so does the German. But then the two men fall on each other’s shoulders weeping at what Faulks calls “the bitter strangeness of their human lives.” Here, for cognoscenti, was an echo of one of Owen’s most haunting poems, “Strange Meeting,” in which the poet dreams he has died and slipped “down some profound dull tunnel” into hell, where one of the “encumbered sleepers” springs up with “piteous recognition” in his eyes. “I am the enemy you killed, my friend.” But that was yesterday. “Let us sleep now . . .” Here, in Birdsong, life was imitating art, all within the realm of fiction.56

Around Stephen’s war story Faulks spins two other narratives. The first is Stephen’s passionate affair in 1910 with a married Frenchwoman in Amiens from which, unknown to him, a child is born. This prelude also enables Faulks to create ironic foretastes of things to come. There is, for instance, talk of a fishing trip on the River Ancre. “You must come,” Stephen is told: “They have famous ‘English teas’ at Thiepval.” But in order to address the reader’s anticipated query “What has all this remote horror to do with my modern life?” Faulks invented a more contemporary character to “pose just such questions.” This is Stephen’s granddaughter Elizabeth who, in the late 1970s, tries to find out about him—gradually deciphering his diary, discovered in the family attic, and meeting frail survivors of his forgotten war. Here, woven into the novel, is another thread of 1990s remembrance, the passion for family history. Elizabeth’s interest takes hold as she faces the prospect, aged thirty-eight and single, that she might die childless. “In the absence of her own children she had started to look back and wonder at the fate of a different generation,” feeling “almost maternal to them,” especially to the man who had been “her own flesh and blood.”57

In contrast to the dark, savage conclusion of Barker’s Ghost Road, Faulks offers redemption from the past. Emulating Stephen’s “strange meeting” with the enemy at the end of his war, Elizabeth finally does find a partner and she gives birth to a new generation. Honoring a promise that she now knows her grandfather made to Jack Firebrace, a “sewer rat” who once saved his life, she names the child John—in memory of Jack’s son, who had died of diphtheria. Thus, the past is somehow redeemed in the present. This was what Faulks aspired to do through his book, and what he hoped his readers would do as well: make what he called “gestures of love and redemption towards the past.”58

In different, sometimes contrived, ways both Pat Barker and Sebastian Faulks drew on 1990s patterns of British remembrance—the axiomatic futility of the Great War, the cruciform centrality of the Somme, and the dominant voice of poets such as Thomas and Owen. Their novels became bestsellers—The Ghost Road winning the Booker Prize, Britain’s highest award for fiction—and clearly left many people with a vivid and enduring impression of the Great War. “People should read this book,” one fan of Birdsong commented, “to be aware, without reading a dull, factual history book, how dreadful things were in the Great War.” Another described Birdsong as “the book that brought me an understanding of a time in history that before I couldn’t identify with.” Fiction, in other words, seemed truer than fact. The influence of the novels was further enlarged via the screen. Barker’s trilogy was condensed into a film, Regeneration (1997), distributed in the United States under the title Behind the Lines, while Birdsong was finally adapted into a two-part television series in 2012 by the BBC and PBS in America (though removing the Elizabeth subplot entirely). In Britain the two episodes attracted viewing figures of seven million and six million respectively—attention on a scale that no history book could match. The American exposure was also significant. When Faulks first hawked his manuscript fruitlessly around Manhattan in 1993, one editor advised him to set the story “in a more recent conflict.” But by the 2000s, thanks to film and television as well as novels, the British narrative of the Great War was becoming entrenched on the other side of the Atlantic.59

For these novelists, sites of memory are central to the plots, inspiring some of their most lyrical writing. In Faulks’s Birdsong, Elizabeth’s quest turns into an obsession after she visits Lutyens’s vast war memorial at Thiepval on the Somme—its multiple, soaring arches chiseled with 73,000 British names “as though the surface of the sky had been papered in footnotes.” Are these “men who died in this battle?” she asks. “No,” a custodian replies: “The lost, the ones they did not find. The others are in the cemeteries.” So these are “just the . . . unfound,” she gasps: “From the whole war?” The man shakes his head: “Just these fields.” Elizabeth slumps on the steps of the monument: “My God, nobody told me.” In this personal encounter with a site of memory—her “strange meeting,” one might say—the past suddenly becomes present. In Another World (1998), Pat Barker’s grisly exploration of the haunted mind of a Somme veteran, Thiepval seems truly repulsive to Nick, the old man’s grandson. It reminds him of “a warrior’s helmet with no head inside. No, worse than that: Golgotha, the place of a skull.” Thiepval is a place of “annihilating abstractions”—not “a triumph over death but the triumph of death”—all very different from what Lutyens and Kipling intended in 1932 when raising the arches and etching into the stone “Their Name Liveth for Evermore.” In other words, sites of memory are also sights of memory—dependent on the eyes of the beholder.60

All of these novels revolved around individual soldiers, explored both in body and in mind, in ways that sum up the British identification of the Great War with the experience of the Tommies. This obsession became yet more marked during the 1990s. The British Legion mounted a determined campaign to revive the two-minute silence on November 11 itself, rather than simply on the nearest Remembrance Sunday. This reversion to the practice of the 1920s and 1930s became established in 1995, the fiftieth anniversary of the ending of the Second World War. In the new millennium the development of the Internet gave further impetus to family history of the Great War, as soldiers’ records became available online. This made it easier to undertake research from the comfort of home rather than by visiting the National Archives in southwest London.

Public fascination with the Tommies reached a peak early in the new millennium, as the last surviving veterans gradually passed away. This elegiac moment was not narrowly British but transnational. In all the former belligerent countries the surviving veterans were officially identified and their final years observed by officialdom and the media with almost macabre anticipation. Australia, for instance, witnessed what has been called “an increasingly hysterical countdown” to the death of the last Anzac in 2002.61 This man was Alec Campbell from Tasmania, who had served for a couple of months at Gallipoli as an ammunition carrier when only sixteen. By the end of his life he was lauded by politicians and the media as an “Australian Legend” and the country’s “last living link” with Gallipoli. After his death, at 103, Campbell was accorded a state funeral. In the United States “The Last Doughboy” from 1917–18, Frank W. Buckles, died in February 2011, at 110, and was interred in Arlington National Cemetery. Spry and articulate almost to the end, he had become the figurehead of a campaign to establish a proper National World War I Memorial on the Mall in Washington. In France, the last soldier of the Great War (“le dernier poilu”) was identified as Lazare Ponticelli, who, somewhat inconveniently, had fought in both the French and Italian armies during the war. He resisted official demands that he be interred among the nation’s great and good in the Panthéon, preferring to rest in the family grave in the Paris suburbs. But Ponticelli did consent to a state funeral which, after his death at the age of 110, was duly held in Les Invalides in March 2008, attended by President Nicolas Sarkozy, who then unveiled, near the tomb of Marshal Foch, a plaque to all who had fought.62

From these examples one can discern common threads. Awe at people who live to such a great age—their present frailties contrasted visually with photographs of them in full manhood in 1914–18. Ordinary men made extraordinary by their longevity. Repeated invocation of their status as “heroes,” or at least representatives of a heroic generation. And a feeling that the last frail threads of “communicative memory” back to the Great War were being severed.

All of these sentiments and more were evident at the “Service to Mark the Passing of the World War One Generation” at Westminster Abbey on November 11, 2009. Its cue was the death of the three remaining British veterans of 1914–18 earlier that year. The man officially designated as “The Last Tommy,” Harry Patch, born in 1898, had fought and killed at Passchendaele in 1917. After 1918 he lived an active life, becoming a plumber in peacetime and a volunteer fireman during World War II, and enjoyed a long retirement. Only in the twenty-first century did he start speaking about the Great War, returning to Passchendaele in 2005, and then being memorialized in a film, a book, and a poem.63

During the November 2009 service to mark “the passing of this remarkable generation,” the war poets were very audible. The choir sang the “Agnus Dei” from Benjamin Britten’s War Requiem, with its setting of words by Wilfred Owen, and the actor Jeremy Irons, standing by the memorial to the World War I poets in Poets’ Corner, read modern verses composed by the Poet Laureate Carol Ann Duffy. Her “Last Post” starts with the famous lines from Owen’s “Dulce et Decorum Est,” his nightmare vision of a soldier too slow in putting on his gas mask and now floundering “as under a green sea.”

In all my dreams, before my helpless sight,
He plunges at me, guttering, choking, drowning.

But Duffy’s dying soldier is the victim of shrapnel, not gas. She imagines the poet telling his story backward, willing him to rewrite the past so we watch the Tommy rise up, amazed, from the “stinking mud” as blood spurts back into his body—and into thousands more men as “lines and lines of British boys rewind / back to their trenches.” And back further, dropping their guns, back into the town square for coffee and “warm French bread,” and out of the war itself, “released from History” with “several million lives still possible”—lives full of love and hope instead of “entering the story now / to die and die and die” in muddy oblivion.64

“Last Post” is a quick but sharp piece of writing, playing off a poetic classic from 1917 and using the familiar tropes of mud and death, yet mingling them with images from the contemporary world, such as pressing the Rewind button and breakfasting in a French patisserie. Duffy is imaginatively pitting poetry against history, longing to reverse the horrors and redeem the past—also Faulks’s aspiration in Birdsong. Yet the poignancy of “Last Post” derives from the fact that she knows, as do we, that poetry cannot “tell it backwards.”

History can do that, however—if we understand “history” to be a process of interpretation and reinterpretation rather than the recitation of immutable facts. Even if historians write forward, telling a sequential narrative, they think backward from the present into the past. Such a dialogue between past and present has been the dynamic pivot of The Long Shadow. In the concluding chapter I shall try to “tell it backwards”—setting the complex legacies of 1914–18 against the current, constricted British view of the Great War and asking, “Why?”