Lessons of the Holocaust call to mind lessons of history, a subject which luminaries have pondered for centuries. Put otherwise, when we commit to the existence of lessons of the Holocaust we are acknowledging a more general capacity to formulate lessons from the past. For whatever else it was, the Holocaust was part of the past about which histories are written, and there should be no reason in principle why some histories should yield lessons and others not. Over the years thinkers have hardly been unanimous that discerning such lessons is possible and that individuals can shape the course of history by applying them. During the Enlightenment some thought deeply about this question because of its radical religious implications. If people could control how history unfolded, some thought, this precluded divine planning or intervention. As late as a century and a half ago, the idea of human agency in shaping historical events was something of a novelty and a contention sharply disputed. Many in the premodern era continued to believe that the course of history was set by Divine Providence, and that individuals could do little to effect particular outcomes. Against this view others sought to formulate a non-theological notion of history, “an ever more complex narrative of secular circumstances, contingencies and changes, [that] has been a principal instrument of the reduction of the divine to the human,” as the historian of political thought John Pocock puts it. People could shape the flow of history independently of divine intention, many came to believe, and one way of doing so was to understand and apply lessons of the past.
Serious debate turned upon how this might happen, and whether human agency operated with a clean slate, or within fixed parameters. In one tradition, intellectuals invested great energy in delineating predetermined directions of history and speculating about patterns that they discerned. During the nineteenth and twentieth centuries, writers such as Oswald Spengler and Arnold Toynbee produced imposing narratives of the human past; they charted the rise and fall of empires and regimes, examining cycles they followed or trajectories they traversed, often in competition with each other. One of the purposes for constructing these schemas was that people could thereby learn where things were headed – that is, what the future held in store. Other thinkers despaired entirely of being able to deduce such patterns. To them, such grand visions were mythical constructs, and so were claims that one could tease out lessons from what had gone before. Their thoughts on History, with a capital H, could be unpretentiously short, much as those of the great theorists were grand and often portentous. Pressed at the end of his days to tell the world what he understood to be the lessons of history, the great polymath and visionary Aldous Huxley responded with humane simplicity: “It is a bit embarrassing to have been concerned with the human problem all one’s life and find at the end that one has no more to offer than ‘try to be a little kinder.’” Huxley’s comment has a disarming, appealing character. Perhaps we are relieved at the thought that what is asked of us may be simpler than we feared. And that we need not bother to chart the rise and fall of civilizations. Similarly appealing may be the conclusions of the Israeli statesman Abba Eban, who once observed, “history teaches us that men and nations behave wisely once they have exhausted all other alternatives.” We like to hear that even the great and famous may lose their way, just like us. 
Or perhaps, more accurately, that there is no way to lose.
Santayana’s famous dictum that “those who cannot remember the past are condemned to repeat it” enjoys, I would say, almost canonical authority, ritually intoned as a warning about not learning from the past. Less frequently appreciated is that Santayana’s phrase has generally been torn from its context in his multi-volume The Life of Reason, and that his intended notion has been challenged by those who have thought deeply about how history can be used. Understood popularly as an admonition to study history so as not to make mistakes in the present, Santayana’s observation should rather be taken, say his critics, as an explanation of how we acquire knowledge by retaining past experience, not as an injunction to make use of history for particular purposes. Santayana’s choice of the term “remember” should stand as a warning sign.
As observers have frequently noted, history is quite different from memory, or even collective memory. Pierre Nora, the French authority on the subject, points out that far from being synonymous, the two are frequently at odds. “Memory is life, borne by living societies founded in its name. It remains in permanent evolution, open to the dialectic of remembering and forgetting, unconscious of its successive deformations, vulnerable to manipulation and appropriation, susceptible to being long dormant and periodically revived. History, on the other hand, is the reconstruction, always problematic and incomplete, of what is no longer.” The study of history requires objectivity and involves a quest for understanding. What results is a reasoned reconstruction of the past that may well have disappeared from popular memory. It involves the sifting of evidence, comparison, and analysis. Professionally, it depends upon peer review and publication. And it is part of a discourse, among both specialists and amateurs, in which there is a common commitment to truth. Memory is subjective, partial, and is itself subject to changes over time. Memory does not seek so much to understand the past as to recover parts of it, and for particular purposes. Memory is cultivated, not constructed. It intrudes, rather than being deduced. It is part of identity, instead of an assessment emerging from debate, comparison, and analysis.
In recent times there has been a great vogue among historians for the study of memory as a means to understand those who remember. Memory, says the French scholar Henry Rousso, “is currently the predominant term for designating the past, not in an objective, rational manner, but with the implicit idea that one must preserve this past and keep it alive by attributing to it a role without ever specifying which role it should be given.” Memory “has only a partial rapport with that past,” he writes. It is “a reconstructed or reconstituted presence that organizes itself in the psyche of individuals around a complex maze of images, words, and sensations.” The psychologist Daniel Schacter adds a further point: In reconstructing the past, “sometimes we add feelings, beliefs, or even knowledge we obtained after the experience. In other words, we bias our memories of the past by attributing to them emotions we obtained after the experience.” And finally, memory, the historian Tony Judt once noted, “is inherently contentious and partisan: one man’s acknowledgement is another’s omission.” That is why it is such a poor guide to the past.
It is not really remembering things that happened, therefore, that those who commend Santayana to us have in mind when they see history as a source of lessons; it is rather ruminating on the conclusions that historians draw from their study of the past. And this is, of course, where the problems begin. For as every historian knows, history is subject to interpretation, and the effort to derive universally accepted lessons from it turns out to be a hazardous enterprise. This was the earliest lesson I had about the discipline of history, coming from an experience I still recall from my first year at the University of Toronto. One of the glories of the “honours courses” in the first year of Soc and Phil, the program described in the last chapter, was a grand sweep of European history that included a series of lectures given by a distinguished medievalist, Bertie Wilkinson. An eminent constitutional historian who had come to Toronto from Manchester, Wilkinson had a flair for popular history, once broadcasting a series of talks on great personalities in history – Simon de Montfort, Joan of Arc, John Wycliffe, and Oliver Cromwell, among others – over a popular Toronto radio station, CFRB, on Sunday afternoons. I can still recall, after so many years, how impressed I was to hear Wilkinson tell us, in a large lecture theatre, how he had been wrongly criticized in a review of one of his books in the New York Times. “He accused me of being an anachronistic liberal,” he said. I had only the faintest idea of what being a liberal meant in medieval constitutional terms, let alone an anachronistic liberal – but what so impressed me at the time was that my professor could be challenged in the New York Times, and that historians had disagreements worthy of debate in venues such as that.
Santayana’s dictum takes us to the heart of disputations about what can be done with historians’ findings. How do we understand the history that is supposed to give rise to lessons? Which interpretations do we choose? And which lessons are applicable to which situations? Have we overlooked some lessons by not asking the right questions? And do we sometimes get the wrong answers? The American historian Arthur Schlesinger, Jr once called the past “an enormous grab bag with a prize for everybody.” By this he meant that, as with much intellectual inquiry, if one looks hard enough at evidence one can come up with the answer – or the lesson – that one wants. That is probably why he once insisted that Santayana’s oft-quoted maxim is in fact quite unhelpful: “the historian can never be sure … to what extent the invocation of history is no more than a means of dignifying a conclusion already reached on other grounds.” Even worse, accepting the authority of some interpretations as a civic duty can close down critical reflection. That is why the popular historian Otto Friedrich once quipped, “Those who cannot forget the past are condemned to misunderstand it.”
Examining lessons that used to be drawn from the past reminds us of how, with the best of intentions, our predecessors erred, missing the right cues or shrinking from responses that were thought unpalatable. Geoffrey Elton, the most famous Tudor historian of his day, once provided a striking example. To someone of Elton’s background and generation (he was born in 1921 into a family of Jewish scholars named Ehrenberg who fled to Britain from Germany in 1939), no historical issue was more important than the warlike dominance of Prussia, and then Germany, over the European continent. In his reflections on the historical past, Elton meditated on “that dangerous little phrase, learning from the past.” “Owing to two World Wars largely unleashed by imperialist ambitions entertained by successive German governments,” he wrote after 1945,
it is now widely felt that the resurgence of Germany once again threatens the peace of the world. But when, in 1870, Germany was united under the leadership of that (nowadays) notoriously warlike country, Prussia, the common reaction was very different. That event came after some two centuries of almost uninterrupted aggression by France, and German unification came as the result of a victorious war against that established disturber of the peace. Therefore, at the time and for some time afterwards, the unification of Germany was very widely regarded (outside France) as a most fortunate event, and Bismarck, later converted into a menace, received much praise and admiration.
With the passage of time, therefore, some lessons lose their salience, and other lessons appear. To many observers, what emerges strikingly from history is its inability to predict. “What can history teach us?” asks the eminent French thinker Jacques Ellul. “Only the vanity of believing we can impose our own theories on history.”
Historical recollections and the lessons drawn from them are not necessarily salutary appeals to humane outcomes. Aware of this, Margaret MacMillan warns that “there are … many lessons and much advice offered by history, and it is easy to pick and choose what you want.” To many in the world today, the lessons of history are nationalist calls to action – calls to “remember” ancient myths of battles lost or won, martyrdom suffered, or miraculous deliverances that keep old enmities alive. And sometimes, even, lessons of history are calls for revenge – with accompanying, sometimes constant fears of imminent annihilation. To this day the Polish inhabitants of Kraków hear from a spire in St Mary’s Church in the city’s Market Square the haunting sound of the “Hejnał,” a sentry’s trumpeted call, originating in the thirteenth century, to the citizens of that city warning them that the Tatars were about to attack, and calling upon them to defend their lives by holding off the infidels. (Broadcast at noon on Polish radio, the haunting five notes of this anthem – abruptly interrupted when, according to legend, an arrow pierced the trumpeter’s throat – communicate a message to the entire country, and indeed to the world: Poland is in danger! Understanding these five notes as an emblem of Polish national identity, some even consider the Hejnał as a reminder to the Poles that they are forever threatened by being overwhelmed by some horde or other of aliens. I heard it first with a chill down my spine; Poles hear it more casually now, like the chimes that announce the arrival of a subway train.) 
Similar nationalist rallying cries are Irish Protestants’ appeals to “remember” the Battle of the Boyne in 1690, when Protestants under King William of Orange held back the forces of the Catholic King James, ensuring the Protestant ascendancy in Ireland; or the cries to “remember” the Alamo in Texas, harking back to the slaughter of the American defenders of a fortified mission by Mexican attackers near San Antonio in 1836. As historian Max Hastings writes, “the vast majority of people of all nations, even the most liberal democracies, cherish their national myths too much to want mere facts, or even assertions of historical doubt, to besmirch them. They prefer a nursery view of their past to an adult one, and a host of authors and television producers is happy to indulge them.” Those who commend the remembrance of history as a universal norm usually have something better in mind than the cultivation of war cries of centuries past.
History is littered with particular instances in which Santayana’s maxim is likely to seem misplaced to historians because the analogies themselves are unable to withstand scholarly scrutiny. The famous Maginot Line, designed in the interwar period to defend France against a German and Italian attack, is a classic example. Named after a French minister of defence, André Maginot, it was constructed by the French, according to the latest military technology of the day, as a network of fortifications, tank obstacles, and other obstructions stretching from Belgium to the Swiss border, behind which their military hoped to mount an impregnable defence along France’s eastern frontiers. The lessons of history, in this case the war of 1914–18, so it seemed to the builders of these great structures, were that defence trumped offence in modern warfare. The terrible bloodletting of the Western Front of the First World War, they believed, yielded this conclusion from a careful study of the recent military past – with its trench warfare, crippling artillery barrages, and rail transport of reinforcements. In the event, of course, the Maginot Line was built to no avail: in 1940 the Germans’ innovations in the use of armoured columns and their daring penetration of French territory through the Ardennes forest proved decisive and the French defences were circumvented. The French mistake was not a failure to attend to the lessons of history, at least as those lessons were understood at the time; it was rather that French decision makers proved unable to grasp the innovations in war making that upset the conventional paradigms. To generalize: people may be too disposed to draw lessons from the past, rather than the reverse.
One of the most commonly cited lessons of history is that dictators should not be appeased – that is, that those threatened with aggression should respond with a credible threat of force, rather than with concessions or peacemaking overtures. This is what is sometimes called “the lesson of Munich,” after the 1938 conference in which British Prime Minister Neville Chamberlain gave way to the Germans in their demands over Czechoslovakia, helping to prepare the way for the Second World War. To the public at large there is probably no more powerful set of lessons than those about Hitler and Czechoslovakia – pearls of wisdom deemed unassailable recommendations for statesmen in practically every conflict when war threatens. Historians, I submit, are more cautious than most in defining a “lesson” from this historical episode, knowing as they do that the very policy on which it is based has a history of its own. Appeasement did not always have the pejorative connotation that we assign to it now, and indeed, from having been understood in the 1930s as a policy of active peace seeking, it only gradually came to be popularly understood as a cringing, delusionary attempt to satisfy aggressors by giving them what they demand. As historians have probed the roots of appeasement, they have deepened our understanding of its original dynamics, which now appear more complex in origin and strategic vision than once was thought. Similarly, our more sophisticated understanding of what happened in 1938 has also made it possible to challenge the idea that there were simple alternatives to policies adopted by those who faced fascist dictators in the 1930s. And so while the “lessons” of Munich are relatively clear about what not to do, they are far less clear than one might like when it comes to policy prescriptions for analogous situations.
As an analogy, “Munich” remains constantly open to the question of the extent to which it is applicable, or partially applicable, or not applicable at all.
Historians, who should be familiar with the history upon which historical analogies rest, are generally the least comfortable when it comes to asserting their applicability. The more they refine and deepen their understanding of the particular situations historical actors faced, the more they become aware of the difficulties in seeing analogies to other situations, most notably those in the imperfectly understood present. And similarly, the more aware historians are of the conditions under which their fellow scholars write, the more aware they are of how they may miss vital elements of the past. “Events,” writes the theorist William Bain, “are transformed into icons of a didactic past that announce our desires, our purposes, and our intentions. They communicate who we are and what we believe.” As the present-day idiom will have it, “it’s all about us,” rather than about an objective, timeless portrait of the past. In consequence, the lessons of history are more often matters of dispute over clashing perspectives, rather than being taken as self-evident.
An additional problem with the use of historical analogies to shape perspectives on present-day problems is the way in which so-called lessons become part of “the unspoken and spoken lore” about a particular kind of problem, a “consensual interpretation,” as the political scientist Yuen Foong Khong puts it in his book Analogies at War. “At that point,” he notes, “analogies step beyond their roles as heuristic devices for discovering new explanations and assume the roles of explanations and facts themselves.” Margaret MacMillan refers to one frequently contested American case in the mid-1960s in which this very thing seems to have happened, the Vietnam War. The issue was whether the United States should commit ground troops to Vietnam or withdraw. (In present-day circumstances, this translates into seemingly endless disputes over “boots on the ground.”) Lessons, in the Vietnamese case, operated differently on each side of the debate. To the Left, the lesson was that the United States should avoid getting bogged down in a war against a popular insurrection, and indeed that the kind of counter-insurgency such a struggle required was doomed to failure. Their analogy was with anti-imperialist uprisings against the French in Indo-China. But for those on the Right, Vietnam suggested precisely the opposite. In their view what was necessary was to go all out – to bomb North Vietnam and put even more troops into the field. Their preferred analogy was appeasement in the 1930s. “Can’t you see the similarity to our own indolence at Munich?” asked American president Lyndon Johnson’s ambassador to Vietnam, Henry Cabot Lodge, of his opponents on the left. 
Sharing this view, President Johnson confessed, “Everything I knew about history told me that if I got out of Vietnam and let Ho Chi Minh run through the streets of Saigon, then I’d be doing what Chamberlain did in World War II.” This kind of dialogue of the deaf demonstrates that what is really at issue in such exchanges is not how to handle future conflicts, but rather how to understand the past in order to appreciate the applicability of the “lesson.”
One answer to scepticism about the ability to draw grand deductions from history has been to focus on the value of practical experience, keeping in mind the limits of what the past can teach us. Whether the accumulation of such experience constitutes full-fledged lessons is of course another matter. Geoffrey Elton seems to have thought of lessons as a matter of conditioning, a building up of experience, even while discounting explicit injunctions about large choices in human affairs. History’s lessons, Elton felt, “are not straightforward didactic precepts, either instructions for action (the search for parallels to a given situation) or universal norms (history teaches that everything progresses, history teaches the triumph – or futility – of moral principles); there is far too much variety about the past, far too much confused singularity about the event, to produce such simple results.” However, Elton did feel that “a sound acquaintance with the prehistory of a situation or problem does illumine them and does assist in making present decisions; and though history cannot prophesy, it can often make reasonable predictions. Historical knowledge gives solidity to the understanding of the present and may suggest guiding lines for the future.”
The famous American historian Carl Becker was even more modest about predictions, but perhaps more encouraging about values. In his view, the kind of guidance that history provided was not so much the gambler’s edge – insiders’ knowledge so as to bet on the right horse – as it was to instil a capacity to meet the unanticipated. “The value of history is … not scientific but moral: by liberalizing the mind, by deepening the sympathies, by fortifying the will, it enables us to control, not society but ourselves – a much more important thing; it prepares us to live more humanely in the present and to meet rather than to foretell the future.” A final thought: the deeper one delves into the working of analogies, the more one encounters differences among historians in understanding the past from which similarities are supposed to be divined. In short, thinking by analogy often quite wrongly assumes that the past is a given, and that the real problem is understanding the future. This way of looking at history calls to mind a Soviet-era joke repeated by Tony Judt: A listener telephones “Armenian Radio” to ask about predicting the future. “No problem,” is the reply. “Our problem is rather with the past. It keeps changing.”
When I studied history in graduate school in the mid-1960s, one of the most cited promoters of the lessons of history was the Pulitzer Prize–winning American historian Barbara Tuchman, whose books on the First World War and comparable international catastrophes drew widespread popular acclaim for their readability and their sober warnings, in the midst of the Cold War, that quite commonplace human blunders could trigger grave diplomatic miscalculations, with horrendous consequences. Of German-Jewish background, the daughter of Maurice Wertheim, who was president of the American Jewish Committee from 1941 to 1943, she was a probing journalist who reported from Madrid on the Spanish Civil War. Granddaughter of Henry Morgenthau, Sr, the American ambassador to the Ottoman Empire during the First World War, one of the earliest to call attention to the slaughter of Armenians by the Ottomans, and niece of Henry Morgenthau, Jr, secretary of the treasury under Franklin Delano Roosevelt, who campaigned during the war for aid to the Jews of Europe, Tuchman might be said to have been disposed by family connections to thinking out of the box when it came to man-made disasters.
Tuchman’s March of Folly: From Troy to Vietnam, published in 1984, in the wake of America’s failed adventure in Southeast Asia, detailed examples of the ravages of human fallibility that seemed particularly pertinent at the time. “Mankind,” she wrote, “makes a poorer performance of government than of almost any other human activity. In this sphere, wisdom, which may be defined as the exercise of judgment acting on experience, common sense and available information, is less operative and more frustrated than it should be.” “Can we learn the lessons of history?” she pointedly asked. She had her doubts, but she nevertheless wrote in order to help error-prone statesmen of her day manage better. The March of Folly, she told her readers, was intended to address “the ubiquity of this problem in our time.”
Tuchman’s popular volume The Guns of August, published in February 1962, a readable study of the blundering path to the First World War followed by statesmen and generals of the day, is said to have played a role in the resolution of the Cuban Missile Crisis in October of that year. Closeted with his aides as the crisis developed over Soviet missiles about to be placed in Cuba, less than three hundred miles from Miami, the American president, John F. Kennedy, cited Tuchman’s book as instrumental in his decision making. Having been given a copy by Tuchman’s friend, secretary of defence Robert McNamara, Kennedy told his aides that the book prompted him to offer the Soviets a way to avoid thermonuclear conflict. “I am not going to follow a course which will allow anyone to write a comparable book about this time, The Missiles of October,” the minutes of the meeting record him saying. “If anybody is around to write after this, they are going to understand that we made every effort to find peace and every effort to give our adversary room to move.” The goal was to present Kennedy’s opponent with a way back from confrontation – in this case a deal in which the Soviets would remove their missiles from Cuba and the Americans would remove theirs from Turkey. I note as well that even if Kennedy didn’t really learn from the past, invoking it helped him to explain his decision and lend authority to his actions.
Has Tuchman’s “lesson” worn well? One might think so if her lesson was the importance of leaving one’s adversary room to manoeuvre. However, one would have to add that the lesson would be banal. After all, few negotiators would dispute that principle today. Moreover, it is doubtful that communicating this bit of wisdom should require the authority of a historical lesson. Armies of lawyers who resolve civil suits could speak knowledgeably about such tactics without ever having known there was a Cuban Missile Crisis or for that matter Tuchman’s Guns of August. Moreover, it is not at all clear that very many students of history would have seen Tuchman’s book as a key to the Cuban Missile Crisis, or would even consider this as one of the lessons of the time. The truth is that we have “moved on,” as we are wont to say. During the 1960s, Americans had certain things in mind about the Cold War and how to avoid nuclear war, and we have other things in mind today. Blinkered, clumsy, blundering statesmen seemed fair targets to Tuchman in the early sixties, when Vietnam protests appeared so persuasive in the United States. More often than not, we see the Cold War in quite different terms today. Nowadays, American statesmen have a keener sense of Soviet intentions, and are perhaps less inclined to see the outcome of the crisis as being solely due to an American president having successfully stood up to his Soviet counterpart. Along with blunders, we have a sense, derived from much more information than was available in 1962, of a complex interaction among a variety of players, notably American president Kennedy, Soviet premier Nikita Khrushchev, and Cuban dictator Fidel Castro. To be fair, Tuchman herself seems to have understood that things she wrote in the sixties did not wear so well a half-century later. “History has a way of escaping attempts to imprison it in patterns,” she wrote, years after these events. 
By then, Tuchman was full of caution about defining the “lessons of history,” underscoring how “anticipation” seems to be beyond our capacities, and how “prejudgments” block historical actors from seeing the effective way forward.
What has changed since the 1960s is not only the context in which historians write about the First World War and the Cuban Missile Crisis, but our interpretations of these events themselves. These understandings have deepened as historical research has advanced, new evidence has been discovered, and new perspectives have been brought to bear. In both cases historians seem to be backing away from notions of blunders to an examination of habits of mind that imprisoned an entire generation of decision makers in “a complex interaction of deep-rooted cultures, patriotism and paranoia, sediments of history and folk memory, ambition and intrigue,” as Harold Evans, for example, writes about 1914.
Understood this way, one of the major impediments to applying lessons from history is the realization that those who made up history’s cast of characters were not made of the same stuff as those who act on stage at present, or for that matter in the future. World views change. Cultures operate differently. Leaders face new challenges. What moved some at one time might not in another. Historian Ernest May made this point in a book he once wrote on the application of lessons to the conduct of American foreign policy. During the eighteenth century, he observed, those who cited the past as a guide to contemporary policy thought of human nature as unchanging over time. But change occurred nevertheless. And with it thinkers became less sure about how much we can learn from how our predecessors acted. “Influenced by concepts of evolution, cultural relativism and the like, men have since become uneasy about citing Greeks, Romans, Saxons, or even contemporary foreigners as examples of how they themselves might behave. Also, education has changed, and ruling élites have become more heterogeneous, with the result that people in government are more hesitant about alluding to events or experience outside of living memory.” Thus, students of historical mentalities sometimes warn us, the bottom has fallen out of the way lessons function: the fact that people at the time of the French Revolution responded one way to specific situations does not necessarily indicate that people of our own time will do so. Drawing lessons for 2015, therefore, on the basis of what people did in 1789, 1914, or 1962 becomes a very complicated process indeed, not to mention an extremely hazardous one.
Years ago, in his mischievous way, the British historian A.J.P. Taylor took on the issue of the lessons of history, as his biographer, Adam Sisman, reports in a study of that great contrarian. A born academic troublemaker who influenced me and so many of my peers who studied international history in the 1960s, Taylor offered a somewhat flippant reason for studying history. “What’s it for?” asked one undergraduate, putting this chapter’s question to the great man. Taylor was then one of Oxford University’s leading and most controversial lights, a man who not only published in popular newspapers but also revolutionized academic communication by appearing regularly on television in its early days, famously delivering lectures, without notes, on a variety of historical subjects. Taylor’s reply to the student was classically Tayloresque: “Because it’s such fun,” he replied. “It’s fascinating. Men [sic] write history for the same reason that they write poetry, study the property of numbers, or play football – for the joy of creation; men read history for the same reason that they listen to music or watch cricket – for the joy of appreciation.” Taylor loved the irreverent riposte. Typical of the man, however, he left his questioner with something serious, something to think about: “Once [historians] abandon that firm ground,” Taylor continued, “once [they] plead that history has a ‘message’ or that history has a ‘social responsibility’ (to produce good Marxists or good Imperialists or good citizens) there is no logical escape from the censor and the Index, the OGPU and the Gestapo.” “He was what I often think is a dangerous thing for a statesman to be – a student of history,” Taylor once said about a famous architect of foreign policy, “and like most of those who study history, he learned from the mistakes of the past how to make new ones.”
Years later, another English scholar, the military historian Michael Howard, spoke similarly about the lessons of history at Oxford, although at a formal occasion much more in the idiom of academic respectability. The event was Howard’s inaugural lecture as Regius Professor of Modern History in March 1981, when he assumed a prestigious chair endowed by King George I in 1724. Howard’s audience – Oxford’s vice chancellor and the assembled fellows of the University, among others – may not have known precisely what to expect from him on the subject, but this would certainly not have been the case with his benefactor. In 1724 at least, the record shows that King George wanted an incumbent “of sober Conversation and prudent Conduct, skilled in Modern History and the Knowledge of modern languages,” who would groom “the Youth committed to their care for Several Stations, both in Church and State, to which they may be called.” History, the monarch believed, was supremely relevant for this task. Teaching it was how the Regius Professors were to serve the common good.
Present-day readers should note: Howard had no patience with the modern university’s preoccupation with propitiating donors, in this particular case faithfully serving the founder’s passions for a particular kind of “relevance.” (Admittedly, Howard’s donor was completely inaccessible, by 1981, to lawsuits or other protests about the uses of his endowment!) Like his Regius predecessor Hugh Trevor-Roper, Howard did not believe that the job of the incumbent was to familiarize his readers with arcane jargon or to train professionals; rather, his task was to address and to educate laymen. More specifically, Regius Professors were not to worry about imparting the lessons of history to those who would exercise authority. History, Howard told his audience, “whatever its value in educating the judgment, teaches no ‘lessons’, and the professional historians will be as sceptical of those who claim that it does as professional doctors are of their colleagues who peddle patent medicines guaranteeing instant cures. The past is infinitely various, an inexhaustible storehouse of events from which we can profess anything or its contrary.” No ambiguity there.
In words that might have been even more shocking had they been spoken before his donor, Howard said that there was no such thing as “history” as a body of truths comprising “what really happened.” Instead, “history is what historians write, and historians are part of the process they are writing about.” New evidence constantly appears, and even more importantly, the shape of “history” changes with shifts in the historian’s mentalité and what he happens to be interested in. And so what is known as history is continuously evolving. That is why we should never assume that we will ever settle, definitively, “what really happened,” and why the history on which the lessons supposedly rest constantly changes. What is made of this shifting terrain changes accordingly, as new perspectives emerge, new questions are put to the evidence, and new imaginations are enlisted in the effort to understand the past. Historians know these things well, Howard suggested, for the armies of their successors openly assemble; and historians know better than most that “revisions” of their favourite theories and descriptions are only a matter of time. Regrettably for the professors, Howard suggested, lay readers seldom appreciate this. Indeed, this time-limited approach, “for the layman, is maddening.” The layman, he continued,
looks for wise teachers who will use their knowledge of the past to explain the present and guide him as to the future. What does he find? Workmen, busily engaged in tearing up what he had regarded as a perfectly decent highway; doing their best to discourage him from proceeding along it at all; and warning him, if he does, that the surface is temporary, that they have no idea when it will be completed, nor where it leads, and that he proceeds at his own risk.
Quite unlike his Oxford colleague A.J.P. Taylor, Michael Howard was attuned to the expectations of the guardians of the public purse, and in 1981 he was prepared to placate those who expended public funds on universities and history chair holders – even those as august as Regius Professors. Howard acknowledged historians’ civic responsibility. But he cast that obligation in terms that George I would probably have found deplorable. Knowing that collective values and social aspirations, understood as emerging from a society’s past, shaped policy making in the future, Howard championed the historian’s role as guarding that public lore, keeping it safe from propagandists and promoters who would distort public knowledge in the interests of their particular cause or interest. It was the historians’ job to keep people honest about the past. “Our primary professional responsibility,” he insisted, “is to keep untainted those springs of knowledge that ultimately feed the great public reservoirs of popular histories and school text books and are now piped to every household in the country through the television screens.” And of course, had Howard been delivering his message thirty years later, he would certainly have added, perhaps even in a tone of alarm, “through the Internet.” What Howard would not have said, however, was that historians’ predictive capacities have improved with time. “As I always tell everybody,” he told an interviewer, “the historian’s the last person to ask about the future. We know the record: if we say it’s going to happen, it’s not going to happen.”
True to form, it was the French who elevated the issue of who gets to define an official history to the level of a national debate. While elements of the controversy could be found elsewhere, it was in France that the government attempted to cultivate particular historical memories as a civic duty – hence a phrase heard from time to time: le devoir de mémoire, the duty of memory. And it was in France that opposition mobilized against this effort. Emerging from anti-racism and concerns about xenophobia, and notably an effort to punish attacks on minorities, the “duty of memory” promoted the idea of a moral obligation for individuals, and by extension the state to which they belong, to keep alive the collective memory of particular victimizations. Thereby, the collectivity, sometimes being itself implicated in particular wrongdoings such as the wartime persecution and murder of Jews, undertook to express official contrition over these episodes, often held to be previously neglected parts of national histories. These efforts also illustrate the high stakes that many social groups have in seeing their own communal suffering, often experienced far in the past, validated by the official organs of the state. Associated with the efforts of many to solidify their identities as belonging to groups that have been victimized, members of such groups sometimes believe that they can achieve their communal goals by having their particular victimization codified as official parts of the historical record.
Conflict arose in France over the efforts of government, through legislation, not only to impose particular historical understandings of such matters as colonialism, slavery, the Armenian genocide, and the Holocaust, but even to criminalize “the denial, justification or gross trivialization of such crimes.” In 2005, in response to several “memory laws” that sought penal sanctions for abusive writings about particular genocide episodes, a group of public intellectuals launched an association called Liberté pour l’histoire – Freedom for History – and engaged hundreds of historians in France, including some of the most prestigious in the country. (In response to an international call for support, I added my name to the many on the list.) There followed vigorous debates and widespread disagreement between the government and these scholars both over the principle of the state assuming a public voice in such matters and over specific interpretations embedded in various memory laws. Such legislation was seen as a challenge to the independence of historians and an effort to politicize matters that should be left to academics and other writers of history to sort out. A Liberté pour l’histoire manifesto of 2008 put it this way: “In a free state, no political authority has the right to define historical truth and to restrain the freedom of the historian with the threat of penal sanctions.”
The historian Pierre Nora, president of Liberté pour l’histoire, denounced the kind of history the legislators seemed to prefer: “a tendency to reread and rewrite the whole of history exclusively from the victims’ point of view,” involving “a regrettable tendency to project onto the past moral judgments belonging exclusively to the present without taking into account the change in the times; this being history’s very purpose and the very reason for learning and teaching history in the first place.” A major flash point turned out to be the effort to mobilize the French parliament to recognize the genocidal character of the massacre of Armenians in the Ottoman Empire. Eventually, these matters were resolved in France, although by no means to everyone’s satisfaction, as a result of determinations by parliament and the French Conseil d’Etat, the country’s highest administrative court, to put a halt to memory laws. Henceforth the deputies will be able to pass “resolutions” on historical matters, but not laws that carry penal sanctions.
What is important here is not so much the legal status of genocide denial in France, but rather what many felt was the state’s usurping the place of the historian.
Quite simply, the memory laws suggested to opponents that, whatever the motivations of the legislators, they were mobilizing the history of great wrongdoing in order to define historical orthodoxies and to validate the grievances of particular groups. Historians do not generally like the state’s interference, through political or judicial proceedings, in matters of historical interpretation, and even less do they appreciate the definition of what can be said about the past through the instruments of criminal law – even if intended to thwart Holocaust deniers. Historians know that history is subject constantly to interpretation, that the focus of history constantly shifts depending upon what questions people choose to address, and they often take it as their role to demystify, rather than to promote, received wisdom – to say nothing of calling such verities lessons. Understandably enough, historians are vigorous champions of free expression in their chosen domains. Partly for that reason, they do not like official histories, histories that mask disagreement behind politically approved versions of the past, or histories that use the authority of government to define official truths. Historians, therefore, often feel nervous about how government and other institutions use the historical past in furthering institutionally defined courses of action.
Count on the pollsters and philanthropic foundations to attempt to define the mood of an entire generation on matters such as the lessons of history. As the millennium approached in 1996, some well-meaning promoters of good things internationally had this very idea in mind. The result was an ambitious and doubtless expensive project – and paltry results. Looking perhaps for a useful way of marking the arrival of the year 2000 (whatever that meant), an energetic team of future-oriented consultants founded the Millennium Project in 1996, following a three-year feasibility study undertaken by the United Nations University and its American Council, the Smithsonian Institution, and a global health and development consulting firm, Futures Group International. What emerged from these deliberations was “an independent non-profit global participatory futures research think tank of futurists, scholars, business planners and policy makers who work for international organizations, governments, corporations, NGOs and universities,” as an early project prospectus had it. This was part of a larger project committed to improving “humanity’s prospects for a better future,” outlined in a series of manifestos of millennial futurology entitled State of the Future. So far so good. A part of the 1996 report was an examination of “Lessons of History.” Indeed, those in charge of this high-minded project devoted a chapter of the 1998 project volume to this very subject.
“What are the lessons of history?” the independent non-profit global participatory futures research think tank asked – after starting with their own, somewhat tortured rendering of Santayana, “if the lessons of history are ignored, they are doomed to be repeated” (sic). To probe more deeply into the lessons of history, the project’s research directors conducted three rounds of inquiry, starting with questions to two dozen or so historians from several countries, chosen from a sample of one hundred names. In the first round, historians proposed lessons and questions about them. In a second round, a more select group of sixteen historians assessed these lessons, both as to their validity and as to their applicability to the future. Then the last word, in a third round, was given to a group of “futurists,” who presumably were eager to see what all this amounted to.
A short answer, I think it fair to say, is not much, which is perhaps why the results of this particular inquiry did not, to my knowledge at least, make much of a splash among day-to-day practitioners of Clio’s craft. So far as I can see, the Millennium Project, which carries on some of its work to this day, decided to abandon its labours on the lessons of history and has moved on to many other fields. The first two rounds of inquiry identified some lessons – quietly demoted in the text to the term “items” – deemed, in comparison with others, to have “high historic validity” and even “higher future applicability”: “Wars in some form will continue,” “Some large scale projects turn out to be inefficient,” “Communications capabilities are important to survival of political organizations,” and “Political systems can collapse suddenly.” Other items had “higher historic validity and lower future applicability”: “Water shortages lead to social change,” and “Epidemics play a great role in evolution of the modern age.” And so it went, largely vacuous, unexceptionable generalizations, down to those with lower historic validity and lower future applicability: “Economic innovations may benefit a society or group, but may have a negative impact on social structure,” “Climate changes lead to social changes,” and “Population stability may be impossible.” One finding deemed “remarkable” by the authors of the study involved the “greatest differences” between the historians and the futurists over a core proposition: “History cannot be used to predict the future.” Futurists thought highly of this contention, rating it between “extremely useful” and “useful most of the time.” The historians, however, had less use for the proposition, with an average rating close to “Will be true as often as it will be false.”
I have struggled with what to make of this finding, and indeed whether anything can be made of it at all. My two cents would go with the futurists, sympathizing with their apparent scepticism about whether historians have any greater predictive capabilities than anyone else. I have certainly never found this to be the case and I sense that many of my colleagues would agree – knowing how unwilling they are to trust the judgment of their fellow practitioners in many other matters. More to the point of this book, this scepticism casts a shadow over lessons, for without having a good idea about how things are likely to turn out, one is hardly in a position to recommend one thing or another – let alone advise others through such a grave set of instructions as lessons of the Holocaust.
The Millennium Project staggered on beyond the millennium itself to develop “a concrete action plan … to relieve the grinding poverty, hunger and disease affecting billions of people,” as its website declares. In 2005 it presented its final recommendations to UN secretary general Kofi Annan, and assigned work to ten thematic task forces, which in turn presented their own recommendations. (None of these had to do with the history of the Holocaust, I hasten to say.) I have no idea whether this project will ever be evaluated, or whether the project’s efforts, monitored only to the end of 2006, will have an afterlife. What I do know is that those of us who study the past have less and less of an affinity with the kind of concerted policy work undertaken in this vast endeavour. Historians have a very different professional culture from Millennium Project workers, and observers may perhaps agree that whatever valuable achievements may come out of the project, among them will not be a clarification of historiographical problems. Where the Millennium think tank came up short, however, on the uses of history, I believe that the simpler work of historians has been more productive. I turn now to the history of the Holocaust to see how historians have undertaken their own global efforts to understand and to deal with the question of lessons.