4

The demolition merchants of reality

You propose then, Philo, said Cleanthes, to erect religious faith on philosophical scepticism; and you think that if certainty or evidence be expelled from every other subject of inquiry, it will all retire to these theological doctrines, and there acquire a superior force and authority. Whether your scepticism be as absolute and sincere as you pretend, we shall learn by and by, when the company breaks up: We shall then see, whether you go out at the door or the window; and whether you really doubt if your body has gravity, or can be injured by its fall; according to popular opinion, derived from our fallacious senses, and more fallacious experience.

DAVID HUME, Dialogues Concerning Natural Religion (1779)

Colin MacCabe, an obscure young Fellow of King’s College, Cambridge, was denied a lectureship by the English faculty’s appointments board in January 1981. Not the sort of news that would usually merit a paragraph in the university newspaper, let alone the national press: yet the rebuff to MacCabe, an expert on the novels of James Joyce and the films of Jean-Luc Godard, was reported on the front page of the Guardian. When MacCabe returned to England from a trip abroad a couple of days later, he found himself mobbed by reporters and photographers at Heathrow airport. His failure to gain tenure at the university provoked demonstrations in the streets of Cambridge and earnest debate on current affairs programmes. Newsweek cleared a page for the story (under the inevitable headline, ‘Unquiet Flow the Dons’), which it described as ‘one of the most extraordinary debates in the [university’s] eight-century history’:

Dons who normally confine their disputes to sherry parties leak damaging rumours about each other and threaten libel suits. Charges of academic sleaziness and intellectual persecution fly back and forth. Television crews roam King’s Parade to catch the carping of talkative academicians … Angry students began seeking to have the entire English faculty board suspended, and MacCabe sympathisers spoke of breaking away to form their own department.

Even some of his enemies agreed that MacCabe was an excellent scholar and teacher; but he was also a ‘post-structuralist’ who believed in analysing literature through study of its linguistic rules and cultural assumptions. Although MacCabe argued that these methods were no great radical departure from the traditions established by earlier generations of Cambridge dons – I. A. Richards and William Empson both undertook close formal analysis of the language of literary texts, while F. R. Leavis and Raymond Williams attempted to place novels within the general cultural history of the country – he did admit that it was the ‘enormous explosion of work in the mid-Sixties in Paris’ by structuralist and deconstructionist pioneers such as Jacques Lacan, Jacques Derrida, Roland Barthes, Louis Althusser and Michel Foucault which had ‘galvanised me and many others’, thus confirming the suspicion among traditionalists that MacCabe was the carrier of a dangerous foreign germ which would infect the whole corpus of English teaching unless he were swiftly quarantined. In the words of the anti-structuralist don Christopher Ricks, ‘It is our job to teach and uphold the canon of English literature.’ Ricks’s colleague Ian Jack added that ‘one does want to keep the attention of students focused on the great writers’. On the other side of the barricades, Dr Tony Tanner described the treatment of MacCabe as ‘the most unjust thing I have ever seen in academic life’ and resigned from the faculty’s degree committee in protest. Raymond Williams, the grand old man of Marxist criticism, was voted off the appointments board for defending MacCabe. So, more surprisingly, was Professor Frank Kermode; though not a structuralist or semiologist himself, he argued that the university ought to accommodate a plurality of critical styles and techniques.

Having succeeded in forcing out Colin MacCabe, the Cambridge conservatives continued to guard the gates against foreign barbarians for many years. (As a young lecturer observed, ‘Cambridge is an island in some ways, cut off from the rest of the country. When I ran into a colleague in London once, he said: “Fancy seeing you in England.”’) At a degree-awarding ceremony in March 1992, three of them shocked the hundreds of proud parents assembled in Senate House by standing up and shouting ‘non placet’ – thus imposing a temporary veto on the proposal to give an honorary doctorate to Jacques Derrida, the sixty-two-year-old doyen of deconstructionism. But although Cambridge may have won the odd battle, it was the continental theorists who won the war. When Derrida came to speak in Oxford a few weeks before the Cantabrigian yell of ‘non placet’, he drew an audience of 1,800 – as against the 400 who turned up at the Oxford Union that month to hear the Hollywood star Warren Beatty. The success of the theorists’ long march through the institutions can also be gauged by Colin MacCabe’s career: immediately after his eviction from Cambridge a full-blown professorship was created for him at Strathclyde University; three years later he was appointed head of production at the British Film Institute and, for good measure, professor of English at the University of Pittsburgh.

By the end of the 1980s, deconstructionists and their allies – generically labelled ‘post-modernists’ – had established something of a hegemony (to use one of their own favourite terms) on campuses in the United States. They dominated the powerful American Modern Language Association, whose conferences were attended by up to 10,000 academic critics. They controlled the recruitment of lecturers in many universities, a power they exercised with the same Stalinist intolerance displayed a few years earlier by the crusty conservatives of Cambridge. This time, however, the victims were those who could not recite the post-modern shibboleths. Even a don sympathetic to Derrida admitted that ‘deconstruction, which began as a heresy, soon turned into a dogma, and hardened into a theology, sustained by a network of evangelists and high priests and inquisitors’. The Vatican of this new creed was Yale University, where the three ‘boa-deconstructors’ Jacques Derrida, Paul de Man and J. Hillis Miller reigned jointly as pontificating pontiffs, but the papal jurisdiction extended far beyond their own department of comparative literature. ‘Students taking courses in literature, film, “cultural studies”, and even, in some cases, anthropology and political science, were taught that the world is just a socially constructed “text” about which you can say just about anything you want, provided you say it murkily enough,’ the left-wing American author Barbara Ehrenreich complained. ‘One of my own children, whose college education cost about $25,000 a year, reported that in some classes, you could be marked down for using the word “reality” without the quotation marks.’ A critical theory that rejoiced in a multiplicity of meanings thus acquired the status of doctrine, excluding all viewpoints but its own. Ehrenreich described it as ‘one of the least lovable fads to hit American campuses since drinking-till-you-barf’.

Academic fashions, like literary texts, often have a greater ideological significance than is immediately apparent. English literature became a subject for study – not only in the universities, but in mechanics’ institutes and workingmen’s colleges – towards the end of the nineteenth century, at a time when scientific advance and social change were eroding the dominance of religion as a source of moral guidance and timeless truths. One of the first occupants of the Merton professorship of English at Oxford, George Gordon, announced in his inaugural lecture that ‘England is sick, and … English literature must save it. The Churches (as I understand) having failed, and social remedies being slow, English literature has now a triple function: still, I suppose, to delight and instruct us, but also, and above all, to save our souls and heal the State.’

The early structuralists and semioticians were therefore quite right to argue that literature couldn’t be evaluated purely aesthetically or impressionistically, regardless of all historical and social context. A statement of the obvious, one might think; but sometimes, as with the emperor’s new clothes, demystification can be achieved by pointing out what ought to be self-evident. As Terry Eagleton writes:

Loosely subjective talk was chastised by a criticism which recognised that the literary work, like any other product of language, is a construct, whose mechanisms could be classified and analysed like the objects of any other science. The Romantic prejudice that the poem, like a person, harboured a vital essence, a soul which it was discourteous to tamper with, was rudely unmasked as a bit of disguised theology, a superstitious fear of reasoned inquiry … Meaning was neither a private experience nor a divinely ordained occurrence: it was the product of certain shared systems of signification … Reality was not reflected by language but produced by it: it was a particular way of carving up the world which was deeply dependent on the sign-systems we had at our command, or more precisely which had us at theirs.

Yet structuralism, like its forebears and descendants, had its own ideological mission. The Russian Formalists, who emerged in the years before the 1917 revolution, were closely associated with the Bolsheviks, and their militant insistence on a ‘scientific criticism’ that would expose the ‘material reality’ of texts had a close kinship with the ‘scientific socialism’ of Marxism-Leninism. Similarly, post-structuralism – exemplified by Roland Barthes’ The Pleasure of the Text – came into vogue soon after the Parisian eruptions of May 1968 had been comprehensively thwarted. ‘Post-structuralism was a product of that blend of euphoria and disillusionment, liberation and dissipation, carnival and catastrophe, which was 1968,’ Eagleton suggests, persuasively. ‘Unable to break the structures of state power, post-structuralism found it possible instead to subvert the structures of language … Its enemies, as for the later Barthes, became coherent belief systems of any kind – in particular all forms of political theory and organisation which sought to analyse, and act upon, the structures of society as a whole. For it was precisely such politics which seemed to have failed.’ No systematic critique of monopoly capitalism was possible since capitalism was itself a fiction, like truth, justice, law and all other linguistic ‘constructs’.

As post-structuralism morphed into deconstruction and then post-modernism, it often seemed a way of evading politics altogether – even if many of its practitioners continued to style themselves as Marxists. The logic of their playful insistence that there were no certainties or realities, and their refusal to acknowledge the legitimacy of value-judgments, led to a free-floating relativism that could celebrate both American pop culture and medieval superstition without a qualm. Michel Foucault visited Tehran soon after the fall of the Shah, and came back to Paris enraptured by the ‘beauty’ of the Ayatollah Khomeini’s neanderthal regime. Asked about the suppression of all dissent, he replied:

They don’t have the same regime of truth as ours, which, it has to be said, is very special, even if it has become almost universal. The Greeks had their own. The Arabs of the Maghreb have another. And in Iran it is largely modelled on a religion that has an exoteric form and an esoteric content. That is to say, everything that is said under the explicit form of the law also refers to another meaning. So not only is saying one thing that means another not a condemnable ambiguity, it is, on the contrary, a necessary and highly prized additional level of meaning. It’s often the case that people say something that, at the factual level, isn’t true, but which refers to another, deeper meaning, which cannot be assimilated in terms of precision and observation.

This is a magnificently Parisian method of avoiding a straightforward question: with enough intellectual ingenuity, even the absence of free speech and promotion of mendacity can be admired as exercises in irony and textual ambiguity.

Despite their scorn for grand historical narratives, universalist ideologies and general laws of nature, many post-modernists seemed to accept the demise of socialism and the success of capitalism as immutable facts of life. Their subversive impulse therefore sought refuge in those marginal spaces where the victors’ dominance seemed less secure. Hence the celebration of almost anything exotic or unincorporable, from Iranian theocracy to sado-masochistic fetishes. A fascination with the pleasures of consumption (TV soap operas, shopping malls, mass-market kitsch) displaced the traditional radical emphasis on the conditions of production. ‘Culturalism’ supplanted materialism, dialectic was ousted by discontinuity, reason yielded to random reflexivity. The consequence was, in Terry Eagleton’s words, ‘an immense linguistic inflation, as what appeared no longer conceivable in political reality was still just about possible in the areas of discourse or signs or textuality. The freedom of text or language would come to compensate for the unfreedom of the system as a whole.’ One after another, academic disciplines took a ‘linguistic turn’ as the steering-wheel was grabbed by theorists who insisted that fact and fiction were indistinguishable. Everything from history to quantum physics was now a text, subject to the ‘infinite play of signification’.

As Eagleton noticed, however, despite the post-modernists’ keen eye for irony they seemed oddly oblivious to the contradictions of their own posture:

In pulling the rug out from under the certainties of its political opponents, this post-modern culture has often enough pulled it out from under itself too, leaving itself with no more reason why we should resist fascism than the feebly pragmatic plea that fascism is not the way we do things in Sussex or Sacramento. It has brought low the intimidating austerity of high culture with its playful, parodic spirit, and in thus imitating the commodity form has succeeded in reinforcing the crippling austerities of the marketplace.

Even the most striking irony of all somehow escaped their notice: that at the end of the 1980s, when post-modernists had contemptuously written off the possibility and indeed the desirability of collective political action, the citizens of Czechoslovakia, East Germany and other Stalinist bureaucracies took to the streets and overthrew their masters by sheer force of ‘people power’.

Terry Eagleton’s bracing left-wing critique of post-modernism, published by the Monthly Review in July 1995, noted yet another irony almost parenthetically: ‘It believes in style and pleasure, and commonly churns out texts that might have been composed by, as well as on, a computer.’ The truth of this quip was proved a year later when a mischievous Australian academic, Andrew Bulhak, designed a computer program ‘to generate random, meaningless and yet quite realistic text in genres defined using recursive transition networks’. For the purposes of his experiment he needed a genre which employed ‘context-free grammars’; and he found it. Anyone who visits the website of his ‘post-modernism generator’ will be rewarded with an apparently serious academic paper, complete with footnotes, on ‘pretextual discourse that includes reality as a totality’ or perhaps ‘the subtextual paradigm of context’. In its first two years online, the generator delivered more than half a million such essays – each wholly original, and all utterly meaningless.
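The trick behind such a generator is simple enough to sketch. The following toy program, in Python, uses a tiny invented grammar (not Bulhak’s, which employed recursive transition networks and a far larger rule set) to show the underlying principle: random recursive expansion of a context-free grammar yields endlessly varied, grammatical, meaningless prose.

```python
import random

# Toy context-free grammar in the spirit of Bulhak's generator.
# Upper-case symbols are non-terminals; everything else is a literal word.
# The rules and vocabulary here are invented purely for illustration.
GRAMMAR = {
    "SENTENCE": [["The", "NOUN", "of", "NOUN", "is", "ADJ", "."]],
    "NOUN": [["paradigm"], ["discourse"], ["subtext"], ["NOUN", "of", "NOUN"]],
    "ADJ": [["pretextual"], ["hegemonic"], ["unattainable"]],
}

def expand(symbol, rng, depth=0):
    """Recursively rewrite a symbol until only literal words remain."""
    if symbol not in GRAMMAR:
        return [symbol]  # literal word: nothing further to expand
    # Past a depth limit, always take the first (non-recursive) alternative,
    # so rules like NOUN -> "NOUN of NOUN" are guaranteed to terminate.
    options = GRAMMAR[symbol] if depth < 4 else GRAMMAR[symbol][:1]
    words = []
    for part in rng.choice(options):
        words.extend(expand(part, rng, depth + 1))
    return words

def generate(seed):
    """Produce one grammatical but meaningless sentence."""
    rng = random.Random(seed)
    return " ".join(expand("SENTENCE", rng)).replace(" .", ".")

print(generate(0))
```

Each run with a fresh seed produces a different sentence; none of them means anything, which is precisely the point.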

To the outsider, the babbling impenetrability of most post-modern texts arouses the suspicion that they are no more than atonal noise, signifying nothing – a fitting style, perhaps, for a theory that seeks to cast doubt on the very existence of ‘meaning’. As long ago as 1968, in the early days of structuralism, the great scientist Peter Medawar protested that clarity had become a dirty word:

A writer on structuralism in the Times Literary Supplement has suggested that thoughts which are confused and tortuous by reason of their profundity are most appropriately expressed in prose that is deliberately unclear. What a preposterously silly idea! I am reminded of an air-raid warden in wartime Oxford who, when bright moonlight seemed to be defeating the spirit of the blackout, exhorted us to wear dark glasses. He, however, was being funny on purpose.

The wilful opaqueness that distressed Medawar now seems almost pellucid beside what succeeded it. Here, for example, is a passage from the French theorist Gilles Deleuze:

In the first place, singularities-events correspond to heterogeneous series which are organised into a system which is neither stable nor unstable, but rather ‘metastable’, endowed with a potential energy wherein the differences between series are distributed … In the second place, singularities possess a process of auto-unification, always mobile and displaced to the extent that a paradoxical element traverses the series and makes them resonate, enveloping the corresponding singular points in a single aleatory point and all the emissions, all dice throws, in a single cast.

One can gaze at this paragraph for hours and be none the wiser. Read it back to front, break it up into constituent clauses, ingest a few hallucinogenic drugs to aid comprehension: it remains gibberish. Yet no less a figure than Michel Foucault praised Deleuze as ‘among the greatest of the great’, adding that ‘some day, perhaps, the century will be Deleuzian’.

Although much post-modernism may be nonsense, it is nonsense with a purpose: by using quasi-scientific terminology the po-mo theologians intended to explode the ‘objectivity’ of science itself. The fact that they knew nothing about mathematics, physics or chemistry was no obstacle. Luce Irigaray, a high priestess of the movement, denounced Einstein’s E=mc² as a ‘sexed equation’, since ‘it privileges the speed of light over other [less masculine] speeds that are vitally necessary to us’. In similar vein, she protested at ‘the privileging of solid over fluid mechanics, and indeed the inability of science to deal with turbulent flow at all’, attributing this bias to the association of fluidity with femininity: ‘Whereas men have sex organs that protrude and become rigid, women have openings that leak menstrual blood and vaginal fluids … From this perspective it is no wonder that science has not been able to arrive at a successful model for turbulence. The problem of turbulent flow cannot be solved because the conceptions of fluids (and of women) have been formulated so as necessarily to leave unarticulated remainders.’ Jacques Lacan, whose oracular pronouncements were received with awe on many British and American campuses, sought to give his deconstructionism a semblance of methodical rigour by transforming it into the following equation:

[Lacan’s algebraic formula, reproduced as an image in the original]

Any mathematically competent schoolchild can recognise that this is unmitigated poppycock. For Lacan, however, there is nothing that can’t be expressed algebraically: ‘Thus the erectile organ comes to symbolize the place of jouissance [ecstasy], not in itself, or even in the form of an image, but as a part lacking in the desired image: that is why it is equivalent to the √-1 of the signification produced above, of the jouissance that it restores by the coefficient of its statement to the function of lack of signifier (-1).’ What does it matter, Barbara Ehrenreich once asked, if some French guy wants to think of his penis as the square root of minus one? ‘Not much, except that on American campuses, especially the more elite ones, such utterances were routinely passed off as examples of boldly “transgressive” left-wing thought.’ Few progressives dared to challenge this tyranny of twaddle for fear of being reviled as cultural and political reactionaries – or, no less shamingly, ignorant philistines. ‘For some years I’ve been troubled by an apparent decline in the standards of intellectual rigour in certain precincts of the American academic humanities,’ Alan Sokal, a physics professor at New York University, wrote in 1996. ‘But I’m a mere physicist: if I find myself unable to make head or tail of jouissance and différance, perhaps that just reflects my own inadequacy.’

Sokal’s reluctance to make a fool of himself was, however, outweighed by his fury at the betrayal of the Enlightenment. As a socialist who had taught in Nicaragua after the Sandinista revolution, he felt doubly indignant that much of the new mystificatory folly emanated from the self-proclaimed left. For two centuries, progressives had championed science against obscurantism. The sudden lurch of academic humanists and social scientists towards epistemic relativism not only betrayed this heritage but jeopardised ‘the already fragile prospects for a progressive social critique’, since it was impossible to combat bogus ideas if all notions of truth and falsity ceased to have any validity. To test the prevailing intellectual standards, Sokal decided to perform a modest experiment: would the leading American journal of ‘cultural studies’ accept an article that made no sense whatsoever if it flattered the editors’ ideological preconceptions? The answer was provided in the spring of 1996 when Social Text – whose editorial board included many of the starriest post-modern professors in the US – produced a special issue on ‘Science Wars’, whose purpose was to ‘uncover the gender-laden and racist assumptions built into the Euro-American scientific method … to talk about different ways of doing science, ways that downgrade methodology [and] experiment’. It included an unsolicited paper by Alan Sokal titled ‘Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity’.

Anyone outside the self-referential, self-satisfied cult could have rumbled the essay as a spoof straight away. In his opening paragraph Sokal derided the ‘dogma imposed by the long post-Enlightenment hegemony’, which he summarised thus:

that there exists an external world, whose properties are independent of any individual human being and indeed of humanity as a whole; that these properties are encoded in ‘eternal’ physical laws; and that human beings can obtain reliable, albeit imperfect and tentative, knowledge of these laws by hewing to the ‘objective’ procedures and epistemological strictures prescribed by the (so-called) scientific method.

Even the editors of Social Text must have noticed the supposedly imaginary ‘external world’ from time to time, not least when the sun rises every morning. Yet their bullshit detectors failed to sound the alarm – possibly, as Sokal had guessed, because they were flattered to find a bona-fide physicist paying homage to their superior wisdom. In the very next paragraph he praised post-structuralist critics for demystifying Western scientific practice and ‘revealing the ideology of domination concealed behind the façade of “objectivity”. It has thus become increasingly apparent that physical “reality”, no less than social “reality”, is at bottom a social and linguistic construct.’ Not theories of physical reality, nota bene, but the reality itself. As Sokal commented when the hoax was revealed, recalling a joke made by David Hume more than two centuries earlier, ‘Fair enough: anyone who believes that the laws of physics are mere social conventions is invited to try transgressing those conventions from the windows of my apartment. I live on the twenty-first floor.’ (There’s also an echo here of a famous passage from James Boswell’s Life of Johnson: ‘After we came out of church we stood talking for some time together of Bishop Berkeley’s ingenious sophistry to prove the non-existence of matter and that everything in the universe is merely ideal. I observed, that though we are satisfied his doctrine is not true, it is impossible to refute it. I shall never forget the alacrity with which Johnson answered, striking his foot with mighty force against a large stone, till he rebounded from it, “I refute it thus.”’)

Sokal’s article was littered with scientific howlers and absurdities. He claimed that Jacques Lacan’s Freudian speculations had now been proved by quantum theory, and that Jacques Derrida’s thoughts on variability confirmed Einstein’s general theory of relativity:

In mathematical terms, Derrida’s observation relates to the invariance of the Einstein field equation Gμν = 8πGTμν under nonlinear space-time diffeomorphisms (self-mappings of the space-time manifold which are infinitely differentiable but not necessarily analytic) … The π of Euclid and the G of Newton, formerly thought to be constant and universal, are now perceived in their ineluctable historicity.

In truth, of course, π is constant and universal, a precisely defined number with the same value on the surface of the moon as in Outer Mongolia. It is beyond belief that a professor of physics would ever argue otherwise except for satirical purposes, but still the editors saw nothing amiss. Nor, incidentally, did they follow the usual procedure of scholarly journals by sending the article to an outside referee before publication. An assessor with some knowledge of mathematics would certainly have spared them the embarrassment of falling for tosh such as this:

Just as liberal feminists are frequently content with a minimal agenda of legal and social equality for women and ‘pro-choice’, so liberal (and even some socialist) mathematicians are often content to work within the hegemonic Zermelo-Fraenkel framework (which, reflecting its nineteenth-century liberal origins, already incorporates the axioms of equality) supplemented only by the axiom of choice. But this framework is grossly insufficient for a liberatory mathematics.

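For readers who want the mathematics pinned down, the axiom Sokal is parodying can be stated precisely. In one standard modern form (a restatement, not Zermelo’s exact 1904 wording):

```latex
\textbf{Axiom of Choice.} For every family $\{A_i\}_{i \in I}$ of
non-empty, pairwise disjoint sets, there exists a set $C$ such that
$C \cap A_i$ contains exactly one element for each $i \in I$.
% Equivalently: there is a choice function $f$ with $f(i) \in A_i$
% for every index $i \in I$.
```

Nothing in the statement carries any ideological freight: it simply asserts that simultaneous selections from infinitely many sets can be made.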
The axiom of choice, as formulated by Ernst Zermelo in 1904, states that for any collection of mutually exclusive, non-empty sets there exists at least one set containing exactly one element from each of them. Some mathematicians find it useful, others do not; all, however, would be amazed to hear that it had some relevance to the debate on abortion law. Sokal’s equally ludicrous assertion that it was a product of nineteenth-century liberalism led on to this hilarious political conclusion:

A liberatory science cannot be complete without a profound revision of the canon of mathematics. As yet no such emancipatory mathematics exists, and we can only speculate upon its eventual content. We can see hints of it in the multidimensional and nonlinear logic of fuzzy systems theory; but this approach is still heavily marked by its origins in the crisis of late-capitalist production relations. Catastrophe theory, with its dialectical emphasis on smoothness/discontinuity and metamorphosis/unfolding, will indubitably play a major role in future mathematics; but much theoretical work remains to be done before this approach can become a concrete tool of progressive political praxis.

In May 1996, a week after publication of the ‘Science Wars’ edition of Social Text, Alan Sokal wrote a short essay for Lingua Franca magazine revealing the hoax and explaining his motives. Not since the MacCabe affair had a theoretical academic dispute provoked such journalistic sensation: the story was reported on the front page of the New York Times, and in dozens of newspapers from Bombay to Buenos Aires. In France, Le Monde ran at least twenty articles on ‘l’affaire Sokal’, including a contribution from Jacques Derrida huffily dismissing the prankster-professor as ‘pas sérieux’. But the post-modernists’ attempts to discredit Sokal were hampered by the fact that his article, like his subsequent book Intellectual Impostures, included dozens of genuine quotations from their own work. Julia Kristeva, who had often referred to differential calculus, algebraic geometry, predicate logic and ‘infinite functional Hilbert spaces’ in her writings about poetry, psychoanalysis and politics, was invited by Le Nouvel Observateur to answer Sokal’s charge that she sought to dazzle readers with technical concepts which she manifestly didn’t understand. ‘Obviously,’ she conceded, ‘I’m not a real mathematician.’

The egg-spattered editors of Social Text could do little but splutter and grumble. One, displaying a fine deconstructionist contempt for authorial intention, insisted that ‘Sokal’s parody was nothing of the sort, and that his admission represented a change of heart, or a folding of his intellectual resolve’. Another, the sociologist Professor Stanley Aronowitz, accused Sokal of caricaturing post-modernism: ‘He got it wrong. One of the reasons he got it wrong is he’s ill-read and half-educated.’ Which was pretty rich coming from someone whose journal had unblinkingly accepted an article strewn with obvious errors. Besides, if Aronowitz and his colleagues were successfully conned by an ignoramus, what did that imply about their own critical intelligence?

Predictably enough, Sokal was also accused of political betrayal. According to an editorial statement from Social Text, ‘what Sokal’s confession most altered was our perception of his own good faith as a self-declared leftist’, because his stunt had exposed the journal to ‘derision from conservatives’. No doubt they were thinking of gleeful comments from right-wing pundits such as George F. Will of the Washington Post and Roger Kimball of the Wall Street Journal. Yet it was feminists and leftists – Katha Pollitt in the Nation, Barbara Ehrenreich in The Progressive, Barbara Epstein in New Politics – who led the applause for Alan Sokal. As Ehrenreich pointed out, the post-modernists were right about one thing: that many people subscribe to socially constructed ‘realities’ which flout common sense and experience, such as the belief that there is a Supreme Being who takes a personal interest in our careers, romances and efforts to lose weight. However, since the theorists also insisted that the whole world was nothing but a ghostly swarm of human-generated imaginings they had effectively opted out of the struggle against this fantasy.

To describe them as non-combatants or conscientious objectors would be too generous by half: while vilifying their left-wing critics as accomplices of conservatism, they themselves abetted some thoroughly rancid characters and ideologies. As a young man in Belgium during the early 1940s, the deconstructionist guru Paul de Man wrote columns for a Nazi-controlled newspaper, Le Soir, praising collaborationists and claiming that Jews had contributed nothing to European civilisation. (‘A solution to the Jewish problem that would lead to the creation of a Jewish colony isolated from Europe would not have, for the literary life of the West, regrettable consequences. It would lose, in all, some personalities of mediocre worth.’) After emigrating to the US in 1947 he never mentioned his wartime record, except occasionally to hint that he had fought for the Resistance, and when the articles eventually came to light in 1987, three years after his death, colleagues and disciples were flabbergasted. Unable (as ever) to face reality, some of them promptly set about deconstructing his wartime journalism to show that it didn’t mean what it said: the selfsame theorists who usually rejoiced in the infinite variety of possible textual interpretation now maintained that, with these particular texts, there was only one ‘correct’ reading – their own. Derrida, having persuaded himself that de Man’s Jew-baiting was somehow an implicit repudiation of anti-semitism, accused the professor’s critics of using the same ‘exterminating gesture’ as the Nazis, since they wished –‘at least figuratively’ – to censor or destroy his work.

In truth, of course, it was de Man himself who had sought to conceal the articles for Le Soir: if they really had been daring satires on Nazi attitudes to the Jews, why did he remain silent about them for the rest of his life? Yet he did leave a few clues to his guilty secret, if only someone had known how to decode them. He admired Julia Kristeva’s Powers of Horror (1982), which included a defence of the anti-semitic writings of the novelist Céline. (The chapter, titled ‘Ours to Jew and Die’, quoted with apparent approval Céline’s depiction of the Jew as ‘a fecalised, feminised, passivated rot’. In a laudatory blurb for the book, de Man singled out these ‘illuminating’ and ‘indispensable’ passages for special praise.) In a famous essay on Nietzsche, he wrote of ‘a past that … is so threatening that it has to be forgotten’, adding that we ‘try to give ourselves a new past from which we should have liked to descend’. There had also been this suggestive observation, from a critique of Rousseau in de Man’s Allegories of Reading:

It is always possible to face up to any experience (to excuse any guilt) because the experience always exists simultaneously as fictional discourse and as empirical event, and it is never possible to decide which of the two possibilities is the right one. The indecision makes it possible to excuse the bleakest of crimes because, as a fiction, it escapes from the constraints of guilt and innocence.

This recalls a comment by Professor Stanley Fish (the model for the flamboyant deconstructionist Professor Zapp in David Lodge’s comic novel Changing Places), who said that critical theory ‘relieves me of the obligation to be right … and demands only that I be interesting’. Harmless enough in a literature department, perhaps, where fictional discourse is an inescapable element of the syllabus anyway. By the 1990s, however, the post-modernists had also colonised many history faculties, teaching their students that ‘facts’ were a chimera: history consisted solely of competing narratives, none of which should be ‘privileged’ over another.

There is ample justification for questioning scientific pretensions to absolute and disinterested objectivity: one needn’t be a deconstructionist to notice the political and economic imperatives that fuelled President Kennedy’s determination to put an American on the moon, or President Reagan’s spending on ‘Star Wars’ research. But the post-modernists’ rampant subjectivism nurtured its own illusions, and aroused the most primitive monsters from their slumber. It took them some time to notice that their ‘anything goes’ motto chimed harmoniously with the arguments of Hitler apologists such as the British historian David Irving, who maintained that no Jews were gassed in Auschwitz. As Professor Richard Evans pointed out in his spirited retort to hyper-relativism, In Defence of History:

There is in fact a massive, carefully empirical literature on the Nazi extermination of the Jews. Clearly, to regard it as fictional, unreal or no nearer to historical reality than, say, the work of the ‘revisionists’ who deny that Auschwitz ever happened at all, is simply wrong. Here is an issue where evidence really counts, and can be used to establish the essential facts. Auschwitz was not a discourse. It trivialises mass murder to see it as a text. The gas chambers were not a piece of rhetoric. Auschwitz was indeed inherently a tragedy and cannot be seen either as a comedy or a farce. And if this is true of Auschwitz, then it must be true at least to some degree of other past happenings, events, institutions, people, as well.

Fearful of being associated with Nazis, especially after the de Man scandal, some relativists staged a tactical retreat by accepting that ‘the facts of the Holocaust closed off the possibility of using certain types of emplotment to describe it’ – while still reserving the right to treat almost everything else as a fictional narrative. Others turned the tables on the traditionalists, arguing that because historians such as David Irving adopted ‘the most conservative possible protocols of discovery, revelation and truth-telling’ – footnotes, bibliographical references – their work actually demonstrated the bankruptcy of classical scholarly approaches to objectivity and truth. ‘Although historians often frame their criticisms of colleagues’ work in terms of evidence – sources overlooked, misplaced emphasis, inappropriate categorisation – such criticisms cannot demonstrate the superiority of one interpretation or story-type over another,’ two post-modernists declared in the Journal of Social History. ‘These debates over evidence are largely diversionary.’

Not so. As Richard Evans wrote, the purpose of source citations is to allow historical interpretations to be tested by appeal to the evidence, ‘and some of the time at least, it really is possible to prove that one side is right and the other is wrong’. This was borne out in the year 2000, when David Irving sued the American historian Deborah Lipstadt for describing him as a pro-Nazi who distorted facts to suit his political purposes. Summoned as an expert witness, Evans spent many months investigating the footnotes in Irving’s books and produced a 700-page report listing all the omissions, misquotations and outright fabrications. Until then, even some reputable academics had maintained that despite Irving’s unsavoury politics his diligence as a researcher couldn’t be faulted. What finally destroyed this reputation was old-fashioned source-checking – the sort of exercise customarily sneered at by post-modernists as ‘fetishising the documents’.

The gas-chambers at Auschwitz were not a fiction. Nor was Stalin’s Gulag, though official historians in the Soviet Union affected to believe that it could be airbrushed from the record. Yet even verifiable facts can acquire the dangerous potency of myth in an intellectual climate where equal validity is granted to any interpretation, however perverse, tendentious or ‘transgressive’. ‘The most usual ideological abuse of history is based on anachronism rather than lies,’ the Marxist historian Eric Hobsbawm said in 1993, in a lecture delivered to an audience of students from the former Communist countries of Eastern and Central Europe who knew a thing or two about these abuses. Hobsbawm pointed out that there was indeed a battle of Kosovo in 1389 at which the Turks defeated the Serb warriors and their allies, and it did leave deep scars on the Serbian psyche – but ‘it does not follow that this justifies the oppression of the Albanians, who now form 90 per cent of the region’s population, or the Serb claim that the land is essentially theirs. Denmark does not claim the large part of eastern England which was settled and ruled by Danes before the eleventh century, which continued to be known as the Danelaw and whose village names are still philologically Danish.’

Nevertheless, Hobsbawm’s distinction is not as straightforward as he thinks. When facts are transmuted into myth for political and nationalistic purposes, as with the Battle of Kosovo, they can become both anachronisms and lies. In the spring of 1999, when the British parliament debated Nato’s military campaign against Serbia, the veteran MP Tony Benn complained that ‘the House suffers from its lack of knowledge of history’ and then proved his own knowledge by declaring that ‘Kosovo has been in Yugoslavia for centuries’ – no mean feat, given that the state of Yugoslavia didn’t exist until the twentieth century. The example given by Hobsbawm in his lecture was a study of the ancient civilisation of the Indus valley entitled Five Thousand Years of Pakistan – even though Pakistan was not even thought of until 1932–3, when the name was invented by student agitators, and has existed as a state only since 1947. There is, he said, ‘no evidence of any more connection between the civilisation of Mohenjo Daro and the current rulers of Islamabad than there is of a connection between the Trojan War and the [present] government in Ankara’.

To say that Elvis Presley is alive would be anachronistic and false; but in 1963 it was demonstrably true. To say that the world is flat has always been untrue, no matter how many people might once have believed it: a glance at a satellite photograph should settle any doubts. Yet the fractured logic of post-modernism misses this distinction, and leads to the conclusion that any perception of ‘reality’ is as valid (or illusory) as any other. In their book Intellectual Impostures, Alan Sokal and the Belgian physicist Jean Bricmont quote from an epistemological primer written for high-school teachers in the mid-1990s:

What one generally calls a fact is an interpretation of a situation that no one, at least for the moment, wants to call into question … But a fact can be put into question. Example: for many centuries, it was considered to be a fact that the Sun revolves each day around the Earth. The appearance of another theory, such as that of the diurnal rotation of the Earth, entailed the replacement of the fact just cited by another: ‘The Earth rotates on its axis each day.’

In saying that the sun’s revolution around the earth was ‘considered to be a fact’, the author appeared to accept that it wasn’t really a fact at all. In the next sentence, however, it reverted to the status of ‘fact’ again – albeit a fact that had been ‘replaced’ by another. As Sokal and Bricmont comment, ‘Taken literally, in the usual sense of the word “fact”, this would mean that the Earth has rotated on its axis only since Copernicus.’

All human knowledge is provisional, but it is also incremental: the sum of what we know is far greater today than thirty years ago, let alone three hundred years ago. ‘I have no doubt that, although progressive changes are to be expected in physics, the present doctrines are likely to be nearer to the truth than any rival doctrines now before the world,’ Bertrand Russell wrote in 1959. ‘Science is at no moment quite right, but it is seldom quite wrong, and has, as a rule, a better chance of being right than the theories of the unscientific. It is, therefore, rational to accept it hypothetically.’ For those who regard rationality itself as a form of oppression, however, there is no reason why scientific theories and hypotheses should be ‘privileged’ over alternative interpretations of reality such as religion or astrology. The philosopher Paul Feyerabend – author of the suggestively titled Farewell to Reason, and one of the founding fathers of post-modern anti-scientific relativism – maintained that since all methodologies have their limitations the only rule should be ‘anything goes’. In his influential book Against Method he saw the teaching of science in schools as nothing less than state tyranny:

While the parents of a six-year-old child can decide to have him instructed in the rudiments of Protestantism, or in the rudiments of the Jewish faith, or to omit religious instruction altogether, they do not have a similar freedom in the case of the sciences. Physics, astronomy, history must be learned. They cannot be replaced by magic, astrology, or by a study of legends.

Nor is one content with a merely historical presentation of physical (astronomical, historical, etc) facts and principles. One does not say: some people believe that the earth moves round the sun while others regard the earth as a hollow sphere that contains the sun, the planets, the fixed stars. One says: the earth moves round the sun – everything else is sheer idiocy.

Like his post-modern disciples, Feyerabend held that science was little different from myth: ‘It is one of the many forms of thought that have been developed by man, and not necessarily the best. It is conspicuous, noisy and impudent, but it is inherently superior only for those who have already decided in favour of a certain ideology.’ The sole purpose of rationality was ‘to lend class to the general drive towards monotony’, whereas relativism endorsed ‘the phenomenon of cultural variety’. In short, those who tried to apply boring old reason to human affairs were pedantic, narrow-minded and unromantic.

It’s a familiar complaint: if small children believe that Father Christmas and the Tooth Fairy really exist, why spoil their pleasure? Most parents would probably agree, but they might begin to worry if a child reached the age of sixteen and remained convinced that Santa comes down the chimney on 24 December. Even so, many friends would no doubt reassure them that the endurance of the illusion was harmless and rather charming, while others would counsel against ‘overreaction’. As Cleanthes said in David Hume’s Dialogues Concerning Natural Religion, ‘humorous sects’ needn’t be taken too seriously: ‘If they be thoroughly in earnest, they will not long trouble the world with their doubts, cavils, and disputes: If they be only in jest, they are, perhaps, bad raillers; but can never be very dangerous, either to the state, to philosophy, or to religion.’ Certainly that was the attitude of many amused outsiders to the academic turf wars over deconstructionism. Why worry if Luce Irigaray thinks E=mc² is a ‘sexed equation’?

But loopiness is not confined to senior common rooms. In their assault on reason, the post-modernists had far more allies than perhaps even they had realised. A Gallup poll in June 1993 found that only 11 per cent of Americans accepted the standard secular account of evolution, that ‘human beings have developed over millions of years from less advanced forms of life, but God had no part in this process’; 35 per cent thought that humans evolved over millions of years, but with divine guidance; and 47 per cent maintained that ‘God created human beings pretty much in their present form at one time within the last 10,000 years or so’ – the creation story as told in the Book of Genesis. Other polls at about the same time discovered that 49 per cent of Americans believed in demonic possession, 36 per cent in telepathy and 25 per cent in astrology; and that no fewer than 68 per cent approved of creationism being taught in biology classes. By then, however, few of creationism’s advocates actually used the word any more. ‘Religious America is awakening,’ President Reagan had announced jubilantly in 1980, shortly before the states of Arkansas and Louisiana passed Bills obliging public schools to teach creationism in science lessons. But the laws were struck down by the Supreme Court, which ruled that because creationism was indeed a religious belief it could not be added to the biology curriculum without infringing the constitutional ban on promoting religion, and thereafter the fundamentalists adopted a more scientific-sounding phraseology – ‘abrupt appearance theory’, ‘intelligent-design theory’ – to disguise the fact that their only textbook was the Old Testament.

During his presidential campaign of 2000, George W. Bush often attacked the relativism that ‘liberals’ had inflicted on America – the idea that nothing was right or wrong, true or false. Yet only a few months earlier, when Christian fanatics on the Kansas board of education voted to remove evolution from the state’s science curriculum, Bush paraded his own relativism by arguing that creationism should be taught alongside evolution since ‘the jury is still out’ and ‘children ought to be exposed to different theories about how the world started’. Some theories, as George Orwell might have said, are more different than others. ‘Science is about fact,’ a Kansas newspaper, the Topeka Capital Journal, editorialised. ‘But it’s also about hypotheses; and creationism is as good a hypothesis as any for how the universe began.’ To judge by the newspaper’s letters page, many readers agreed. ‘I am writing in response to the poor souls out there who believe that the state board of education has taken education back to the Dark Ages,’ one wrote. ‘I say it’s about time! … Take my children back to the Dark Ages, where truth was taught and they received the education they deserved.’ No wonder some wags wondered if the Kansas board had decided to solve the Y2K problem by turning the clock back to Y1K. ‘In one pan of the scales,’ Salman Rushdie wrote in the Toronto Globe and Mail, ‘we now have General Relativity, the Hubble Telescope and all the imperfect but painstakingly accumulated learning of the human race, and, in the other, the Book of Genesis. In Kansas, the scales balance.’ And not only in Kansas. Even Al Gore, who had acquired a reputation as the ‘Mr Science’ of the Clinton administration, seemed reluctant to disturb this bogus equilibrium. 
A few months earlier one of his chief policy advisers had told the Boston Globe that ‘the Democratic party is going to take God back this time’, and on hearing the news from Kansas the candidate said that although he personally favoured the teaching of evolution, ‘localities should be free to teach creationism as well’.

Gore thus maintained the ignoble tradition of politicians from Tennessee – the same state which made itself the laughing stock of the civilised world in 1925 by prosecuting a young high-school teacher, John Scopes, for teaching Darwinian theory in biology class. The great reporter H. L. Mencken, in one of his many lacerating despatches from the Scopes trial, suggested that Tennessee hillbillies ‘are not more stupid than the city proletariat; they are only less informed’. Why, then, were even the most intelligent Tennesseans so reluctant to assist the cause of enlightenment by repudiating the antediluvian nonsense taught in local schools and endorsed by local nabobs? ‘I suspect that politics is what keeps them silent and makes their state ridiculous. Most of them seem to be candidates for office, and a candidate for office, if he would get the votes of fundamentalists, must bawl for Genesis before he begins to bawl for anything else.’ The ‘typical Tennessee politician’ was a man such as the then governor, Austin Peay, who sought to exploit the Scopes trial for his own political advantage before it had even begun. ‘The local papers print a telegram that he has sent to Attorney-General A. T. Stewart whooping for prayer,’ Mencken reported. ‘In the North a governor who indulged in such monkey shines would be rebuked for trying to influence the conduct of a case in court. And he would be derided as a cheap mountebank. But not here.’

Al Gore, who might best be characterised as an expensive mountebank, was another great whooper for prayer. As vice-president, he had on his desk a placard with the toe-curling motto ‘WWJD’ – What Would Jesus Do? Apparently he never pondered a more pertinent question: what would the founding fathers think? The American presidential election of 1800, in which John Adams stood against his old friend Thomas Jefferson, also happened to be a contest between two men who were, at the time, the president of the American Academy of Arts and Sciences and the president of the American Philosophical Society. The historian Henry May described this as ‘a coincidence very unlikely ever to be repeated in American politics’, and his prediction looks increasingly solid. Exactly two centuries later, the main contenders for the presidency were George W. Bush, a genial chump, and Al Gore, a moderately intelligent liar and influence-pedlar – a choice summarised by one British newspaper as ‘Dumbo vs. Pinocchio’.

The contrast between 1800 and 2000 went further than mere intellectual power and integrity. Adams and Jefferson, though flawed and complex characters, were both major figures of the American Enlightenment who believed that what the Europeans had merely imagined was being realised and fulfilled in the New World. Many of the European philosophers of the late eighteenth century thought so too: to the Marquis de Condorcet, America was of all nations ‘the most enlightened, the freest and the least burdened by prejudices’; Diderot saw it as ‘offering all the inhabitants of Europe an asylum against fanaticism and tyranny’. Tom Paine described the cause of America as ‘the cause of all mankind’, since political or clerical aristocracies would hold no sway in a state founded on secular reason and equal opportunity. Two hundred years later, the candidates Gore and Bush were respectively the son of a president and the son of a senator. (The one serious and substantial contender, Ralph Nader, was excluded from the televised debates and largely ignored by the mainstream media, perhaps for fear that he might show up his rivals as a couple of bozos. Liberal Democrats warned potential Nader supporters that unless they voted for Gore as ‘the lesser of two evils’ they would be responsible for letting Bush into the Oval Office, a counsel of despair and desperation likened by the columnist Alexander Cockburn to ‘a man on a raft facing the decision of whether to drink seawater or his own urine’.)

In this light, Henry May might seem to have been right in arguing that the 1800 election ‘marked the real end of the Enlightenment in America’: thereafter, the idealistic rhetoric and practice of the 1770s and 1780s adjusted to the realities of popular democracy, and what came into existence was a nation radically democratic in its suffrage but moderately conservative in its institutions. ‘The Secular Millennium gradually turned into Manifest Destiny … There was less and less disposition to dwell on political doctrines, including the political doctrines of the Enlightenment, closely associated with the increasingly different European world.’

One could appear to prove the point by marking further distinctions between the presidential candidates of 1800 and 2000. Jefferson commissioned for his library a composite portrait of Francis Bacon, John Locke and Isaac Newton, the English prophets of Enlightenment, hailing them as ‘the three greatest men who ever lived, without any exception’. In his book Earth in the Balance, Al Gore described the same Francis Bacon as the greatest villain who ever lived: ‘Bacon’s moral confusion – the confusion at the heart of much modern science – came from his assumption, echoing Plato, that human intellect could safely analyse and understand the natural world without reference to any moral principles defining our relationship and duties to both God and God’s creation.’ Jefferson advised his nephew to ‘question with boldness even the existence of a god; because, if there be one, he must approve the homage of reason rather than of blindfolded fear’; both Al Gore and George W. Bush, however, proudly proclaimed their blindfolded allegiance as born-again evangelical protestants. (At a hustings in December 1999, Republican hopefuls were asked ‘what political philosopher or thinker do you most identify with and why?’ Whereas Steve Forbes spoke of the enduring significance of John Locke, Bush replied simply: ‘Christ, because he changed my heart.’)

The influence of European Enlightenment ideals on American political institutions may have dwindled in the nineteenth and twentieth centuries; but the Enlightenment had never been a purely or even predominantly political movement in the first place. A more general respect for the secular, liberal humanism of the founding fathers – and for the spirit of scientific inquiry embodied by Benjamin Franklin, extravagantly depicted by the French Enlightenment philosopher Turgot as a liberating hero who ‘seized fire from the heavens and the sceptre from the tyrant’s hand’ – endured far beyond the lifetime of Thomas Jefferson. ‘Thank heaven I sat at the feet of Darwin and Huxley,’ Theodore Roosevelt wrote in 1918, explaining how he became a naturalist. Woodrow Wilson, asked in 1922 for his thoughts on evolution, replied that ‘of course like every other man of intelligence and education I do believe in organic evolution. It surprises me that at this late date such questions should be raised.’ Only three years later, they were propelled on to every front page by the Scopes trial.

The small courtroom in Dayton, Tennessee, became the arena for an extraordinary joust between two national figures – William Jennings Bryan, a former presidential candidate who installed himself as ‘associated prosecuting counsel’, and Clarence Darrow, the country’s most successful and famous attorney, who volunteered to represent Scopes. Each man saw the trial as nothing less than a battle between light and darkness, and it culminated in a direct showdown when Darrow called Bryan himself to the witness stand ‘to show the people what fundamentalism is’. Although the judge protectively reminded Bryan that he was not obliged to endure this cross-examination, the elderly statesman seemed willing – indeed honoured – to appear as an expert witness on behalf of God. ‘These gentlemen’, he snarled, gesturing at the defence team, ‘did not come here to try this case. They came here to try revealed religion. I am here to defend it and they can ask me any questions they please.’ Darrow duly did so, eliciting a torrent of absurdities. ‘If God had wanted a sponge to think,’ Bryan declared, ‘a sponge could think.’ He also insisted, to incredulous hilarity from the defence benches, that humans were not mammals. Everything in the Old Testament – from Jonah and the whale to Noah and the ark – was literally true. God really did make the world 5,000 years ago.

‘Do you say’, Darrow asked, ‘that you do not believe that there were any civilisations on this earth that reach back beyond five thousand years?’

‘I am not satisfied by any evidence that I have seen.’

‘Don’t you know that the ancient civilisations of China are six or seven thousand years old, at the very least?’

‘No, but they would not run back beyond the creation, according to the Bible.’

‘Have you any idea how old the Egyptian civilisation is?’

‘No.’

Metropolitan sophisticates all over the US, and far beyond, sniggered over the reports of Bryan’s buffoonery. But metropolitan opinion counted for nothing in Dayton. Since the judge had ruled that Darwinism was inconsistent with the tale of Eve being created from Adam’s rib (thus violating the state’s law banning the teaching of anything that denied Genesis), and since the jury had to decide only whether Scopes had used a biology textbook explaining evolutionary theory – which he admitted – there could be only one verdict. He was convicted, and fined $100. Over the next few years, as other states copied Tennessee’s anti-evolutionism statute, Darwin was removed from most school textbooks, not to return until the early 1960s.

The growing appeal of evangelical fundamentalism in America during the 1920s can most plausibly be interpreted as a quest for simple certainty by people who found the pace of change in society both bewildering and alarming. Mencken made the point with typical pugnacity in his article ‘Homo Neanderthalensis’, published a week or so before the Dayton hearings began in the summer of 1925:

The inferior man’s reasons for hating knowledge are not hard to discern. He hates it because it is complex – because it puts an unbearable burden upon his meagre capacity for taking in ideas. Thus his search is always for short cuts. All superstitions are such short cuts. Their aim is to make the unintelligible simple, and even obvious. So on what seem to be higher levels. No man who has not had a long and arduous education can understand even the most elementary concepts of modern pathology. But even a hind at the plough can grasp the theory of chiropractic in two lessons. Hence the vast popularity of chiropractic among the submerged – and of osteopathy, Christian Science and other such quackeries with it. They are idiotic, but they are simple – and every man prefers what he can understand to what puzzles and dismays him.

The popularity of fundamentalism among the inferior orders of men is explicable in exactly the same way. The cosmogonies that educated men toy with are all inordinately complex. To comprehend their veriest outlines requires an immense stock of knowledge, and a habit of thought. It would be as vain to try to teach to peasants or to the city proletariat as it would be to try to teach them to streptococci. But the cosmogony of Genesis is so simple that even a yokel can grasp it. It is set forth in a few phrases. It offers, to an ignorant man, the irresistible reasonableness of the nonsensical. So he accepts it with loud hosannas, and has one more excuse for hating his betters.

Mencken was an unashamed snob, and his assumption that truth is beyond the comprehension of all but a small elite overlooks the indisputable fact that a partiality to bunkum is not confined to ‘the lower orders’ – unless one extends the definition of that phrase to include eminent grandees. Mencken himself admitted elsewhere that superstition is often ‘cherished by persons who should know better’. Recalling the surprise expressed by Woodrow Wilson in 1922 that anyone should still question organic evolution, one can imagine his astonishment had he known that Messrs Gore and Bush, the two men vying to become the first American president of the next century, would both flaunt their sympathy for the militant simpletons who were still fighting the good fight on behalf of the Book of Genesis.

Fortunately, as history confirms time and again, America is not dependent on presidents to protect its intellectual standards and values. It may be infested with flat-earthers and TV evangelists, but it also has more Nobel prizewinners than anywhere else – and plenty of citizens who will strenuously defend the legacy of Thomas Jefferson and Benjamin Franklin. For every populist moron such as William Jennings Bryan there is always at least one Clarence Darrow. After the Kansas vote in 1999, while the presidential wannabes were havering, the Washington Post struck a Menckenesque note by publishing this spoof memo from God to the Kansas board of education: ‘Thank you for your support. Much obliged. Now, go forth and multiply. Beget many children. And yea, your children shall beget children. And their children shall beget children, and their children’s children after them. And in time the genes that made you such pinheads will be eliminated through natural selection. Because that is how it works.’ Even in Kansas itself, the state’s Republican governor, Bill Graves, proved willing to take on the blockheads. ‘This is a terrible, tragic, embarrassing solution to a problem that didn’t exist,’ he declared. ‘I think this decision is so out of sync with reality that in some ways it minimises the credibility and the oversight that the state board is supposed to have on schools. What are they going to do, hire the evolution police?’

Why was the Kansas governor able to issue a more forthright condemnation than either of the men hoping to occupy the office held by Thomas Jefferson? ‘It’s really not surprising, if you think about it,’ the governor’s press secretary told a reporter. ‘If you’re running for president, you have to be all things to all people. You don’t want to alienate anybody.’ In mitigation one could argue that George W. Bush, despite being an alumnus of both Yale and Harvard, was a bit of a goof who sincerely believed what he said about the jury still being out. The same indulgence cannot be permitted to Al Gore (Harvard, class of ’69), who proudly paraded his scientific knowledge at every opportunity; nor to Tony Blair, the British prime minister, whose favourite word was ‘modernisation’. These two clearly come into the category of ‘persons who should know better’. Yet Blair, like Gore, took refuge in post-modern relativism to justify appeasing pre-modern zealots.

In March 2002 the Guardian revealed that Christian fundamentalists had taken control of a state-funded secondary school in north-east England and were striving to ‘show the superiority’ of creationist beliefs in their classes. ‘As Christian teachers it is essential that we are able to counter the anti-creationist position,’ the vice-principal of Emmanuel College, Gateshead, had advised colleagues. Another senior member of staff argued that Darwinians have ‘a faith which is blind and vain by comparison with the faith of the Christian … A Christian teacher of biology will not or should not regard the theory of evolution as axiomatic, but will oppose it while teaching it alongside creation.’

In Britain, as elsewhere in Europe, creationism has little appeal. Both the Anglican and Roman Catholic hierarchies have long since accepted Darwin’s theory: even Pope John Paul II said that it was ‘more than just a hypothesis’. But Tony Blair had already announced his intention of building more ‘faith-based schools’, and the news from Gateshead strengthened the suspicion that some of these academies would proselytise rather than educate. Jenny Tonge MP asked if the prime minister was ‘happy to allow the teaching of creationism alongside Darwin’s theory of evolution in state schools’. A simple ‘no’ was surely the only possible answer, especially as he was due to deliver a speech to the Royal Society a few days later in which he would extol ‘proper science’ and warn against ‘a retreat into a culture of unreason’. But it was not the answer he gave. Blair told Jenny Tonge that the creationists of Gateshead were doing a splendid job: ‘In the end, a more diverse school system will deliver better results for our children.’

A few Labour backbenchers gawped in amazement as the significance of Blair’s reply sank in. Here was the leader of a supposedly secular, progressive government who, on being invited to assert that probable truth is preferable to palpable falsehood, pointedly refused to seize the opportunity – and indeed justified the teaching of bad science in the name of ‘diversity’. He might just as well have trotted out the pernicious old maxim that ignorance is bliss, the last refuge of tyrants ever since God banished Adam and Eve from Eden for sampling the fruit of knowledge, or the classical deities unleashed misery on the world through Pandora’s box in revenge for Prometheus’ heroic disobedience.

Had Tony Blair meant what he said when he told his party conference in 1996 that New Labour took its inspiration from ‘the ancient prophets of the Old Testament’? A more likely explanation is that he had been infected (however unwittingly) by the cultural, moral and intellectual relativism of the post-modernists, and by the fashionable disease of ‘non-judgmentalism’. As if to confirm the modishness of this affliction, his statement went largely unchallenged, even though it marked a new low in contemporary British political discourse. What if some schools informed their pupils that the moon was made of Swiss cheese, or that the stars were God’s daisy-chain? Would that be officially welcomed as another healthy consequence of Blair’s ‘more diverse school system’?

This is the enfeebling legacy of post-modernism – a paralysis of reason, a refusal to observe any qualitative difference between reasonable hypotheses and swirling hogwash. At a time when countless loopy creeds were winning new converts it gave aid and comfort to the pedlars of nonsense. Even extra-terrestrial conspiracy theories were granted some academic respectability, notably through the publication in 1998 – by the reputable Cornell University Press – of Aliens in America: Conspiracy Cultures from Outerspace to Cyberspace, a post-structuralist study of UFO sightings and alien abductions. The author, Professor Jodi Dean, was not an astronomer but a political scientist who specialised in ‘identity politics’, and throughout the book’s 242 pages she strenuously avoided any kind of judgment on the likelihood of what she described – unsurprisingly, given her insistence that reality no longer exists anyway. Or, to quote her own professorial prose, ‘the fact that abduction accesses the stresses and excesses of millennial technoculture doesn’t get to the truth of abduction (as if getting to truth were still a possibility)’. Alien narratives, she argued, ‘challenge us to face head-on … the dissolution of notions of truth, rationality and credibility’. If notions of truth were disintegrating, shouldn’t a professor feel some duty to rescue them from the acid-bath? Not at all: Jodi Dean rejoiced at their destruction, since rational dispute was an instrument of oppression rather than a method of seeking some kind of verity. ‘Argument, thought by some to be an important part of the process of democracy, is futile, perhaps because democracy can bring about Holocaust.’

While indicting anyone who clung to the discredited methods of reason and critical engagement as an accomplice in genocide, she had no such strictures for the ufologists, who were presented as heroic dissidents opposing the ‘governmental-juridical discourse’ and the ‘elite, official “arbiters of reality”’. And, as she noted with pleasure, they were rapidly becoming a mass movement. According to one opinion poll in the 1990s, some 2 per cent of Americans said that they had been kidnapped by extra-terrestrials, which would translate into 3.7 million victims. (‘Since many claim multiple kidnappings,’ John Leonard wrote in the Nation, ‘we are talking about an air-traffic control nightmare.’) A 1996 poll for Newsweek found that 48 per cent of all Americans believed in UFOs and 27 per cent thought that aliens had visited the earth. If, as Dean argued, a belief in UFOs and alien abductions was a ‘political act’, since it ‘contests the status quo’, there must have been many more radicals in the United States than one might suppose; and although she categorised them as ‘the oppressed’, they included some of the most powerful people in the land.