Credulous thinking is spreading through society as fast and silently as a virus, and no one has a clue how long the epidemic will last. Counterknowledge is not like smallpox, which has been completely eradicated through vaccination. A better analogy would be HIV/AIDS, which has a frightening ability to mutate. No sooner do we think that a strain of counterknowledge is under control than we are confronted by an unexpected variant. Scientific Creationism morphs into Intelligent Design; neo-Nazi Holocaust denial becomes Muslim Holocaust denial.
As a society, we can take prophylactic measures against counterknowledge. Western civilization has developed the intellectual tools to dismantle pseudohistory and pseudoscience. Whether we choose to use them is another matter; the work is hard, usually technical and often unrewarding. There may still be time to turn counterknowledge into a chronic but containable disease. But we are a long way from reaching that stage.
Before we can even begin to control the effects of counterknowledge, we need to understand how the phenomenon developed. It is usually not difficult to follow a strand of information back to the cultic milieu. The doctrines of homeopathy derive from eighteenth-century quackery; Graham Hancock’s lost civilizations revive the fantasies of Victorian eccentrics; modern Islamic anti-Semitism draws on the medieval Christian blood-libel and Nazi racial ‘science’. But these historical precedents do not explain why intelligent people propagate and consume these ideas.
In his book Why People Believe Weird Things, Michael Shermer, the founder of Skeptic magazine, addresses this question. ‘As a culture, we seem to have trouble distinguishing science from pseudo-science, history from pseudohistory, and sense from nonsense,’ he writes. ‘But I think the problem lies deeper than this. To get to it we must dig through the layers of culture and society into the individual human mind and heart.’1 The reasons he comes up with are mostly psychological: it is comforting to believe that a psychic can put you in touch with your loved ones, or that eating broccoli will prevent you getting cancer; it is oddly reassuring to know that apparently random acts of evil are being coordinated by a satanic conspiracy. The practitioners of counterknowledge teach us that the universe is not arbitrary, that things happen for a reason.
The problem is that it is difficult to provide solid evidence for Shermer’s inferences. A few years ago, I carried out sociological research into Pentecostal Christian ideas about the end of the world. The more I talked to people, the more sceptical I became about broad-brush psychological explanations for unorthodox beliefs. Theories based on mass psychology are not necessarily wrong, but they deal in concepts such as ‘disorientation’ and ‘insecurity’ which are hard to measure. What we can do, however, is observe the social processes that create space for counterknowledge.
Consider the following statistics. Between 1980 and 2005, British church attendance fell from 4.7 million to 3.3 million.2 Membership of political parties has fallen from 3.5 million in the 1950s to around 0.5 million. The number of weddings in the UK dropped from 480,000 in 1972 to 284,000 in 2005.3
Each of these trends reflects the fragmentation of traditional authority structures–churches, political parties and the two-parent family–that previous generations rarely questioned. In the words of Boston sociologist Peter Berger, society is moving inexorably ‘from fate to choice’. Modernity and the marketplace dismantle all sorts of institutions, including many whose authority is implicit rather than explicit, such as publicly funded broadcasters and family-run businesses. And every change brings with it new possibilities that are both liberating and a burden. The subjective side of human experience takes over from the objective.
According to the sociologist Anthony Giddens, we are all being drawn into ‘the reflexive project of the self’.4 The disintegration of communities, the inescapable presence of electronic media, the growing influence of distant happenings on our everyday lives, the bewildering array of lifestyle options–all these factors force us to choose who we are, in a way that our grandparents never had to. More possibilities are presenting themselves every day; no sooner have we made these choices than we have to revise them. We are works in progress. And, like artists who present their work in progress to the public, we just love to talk about it.
Indeed, we are bound to talk about it: there are so many things to discuss. It is all very well to be given greater freedom to choose our jobs, sexual preferences, political identity, philosophy and religious beliefs, but in making those choices we first have to decide what we believe. ‘The modern individual must stop and pause, where pre-modern man could act in unreflective spontaneity,’ writes Berger. ‘The answers to the perennial human question “What can I know?” become uncertain, hesitating, anxious.’5
Our task is not made any easier by the fact that the public institutions that previously acted as the gatekeepers to intellectual orthodoxy are now telling us that we can believe more or less what we like. Universities, government departments and churches were all hugely affected by the upheavals of the 1960s. As the author of The Clash of Civilizations, Samuel Huntington, puts it: ‘People no longer felt the same compulsion to obey those whom they had previously considered superior to themselves in age, rank, status, expertise, character or talents. Within most organizations, discipline eased and differences in status became blurred.’6 This ‘democratic egalitarianism’ greatly increased the self-consciousness of minority groups: students, ethnic minorities, homosexuals, feminists and political activists. Each of these groups wanted the authority to decide what they believed, including the authority to decide what constituted a fact.
Far from opposing this trend, many intellectual gatekeepers took voluntary redundancy; all points of view (except right-wing ones) were regarded as valid. Institutions founded upon the ideals of the Enlightenment abandoned the very principles that made them. University lecturers across the humanities threw themselves into the task of constructing specialist disciplines, such as black history and feminist literary criticism. From there it was just a small jump to shifting the boundaries of fact in order to avoid offending delicate sensibilities. In 1998, for example, a university press published Aliens in America, a study of the alien abduction phenomenon by Professor Jodi Dean, a leading feminist scholar. In it, Dean refused to acknowledge that alien abductions have never happened; instead, she praised the ‘UFO community’ for challenging oppressive and exclusionary ‘norms of public reason’.7
There is something appropriate about left-wing academics sticking up for the UFO community, since by this stage many of them were fantasizing as vividly as alien abductees. Scholars created ‘narratives’ that, in addition to challenging the perspective of a white male elite, shamelessly rewrote history, replacing data with fiction and facts with theory. The transformation of ancient Egypt into ‘Kemet’ is an example; so are the countless books in the field of queer studies ‘outing’ historical figures on the basis of zero evidence. And the cognitive dissonance created by this embracing of non-facts was neatly disposed of by the emergence of postmodernism, which sought to delegitimize the very notion of gathering and measuring data.
In the 1980s and 1990s, some French and American postmodernists infected academia with a fantastically pretentious form of scientific counterknowledge. Having decided that science was just another textual game, they started playing it themselves, with ludicrous results. The French feminist critic Luce Irigaray solemnly described E = mc2 as a ‘sexed equation’ because it ‘privileges the speed of light over other speeds that are vitally necessary to us’. She suggested that pure mathematics was biased by its ‘sexist’ concern with closed spaces, rather than the ‘partially open’ structures visible to the subtler female mind.8
In 1996, the American physics professor Alan Sokal produced a paper, ‘Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity’, which attacked his colleagues for ‘hewing to the “objective” procedures and epistemological structures prescribed by the (so-called) scientific method’. This was just the sort of message that avant-garde literary theorists wanted to hear. Finally, a real scientist had acknowledged that (as he put it) ‘the discourse of the scientific community, for all its undeniable value, cannot assert a privileged epistemological status with respect to counter-hegemonic narratives emanating from dissident or marginalized communities’.9 Sokal’s paper was snapped up by the influential American cultural studies journal Social Text. On the day of its publication, however, the author revealed that ‘Transgressing the Boundaries’ was a hoax–‘a pastiche of left-wing cant, fawning references, grandiose quotations, and outright nonsense…structured around the silliest quotations I could find’.
The editors of Social Text were not amused. Postmodernists may regard scientific discourse as a set of linguistic tricks, but they don’t like it when someone plays a trick on them. The consensus among lecturers in cultural studies was that Sokal’s hoax was bad form and that his message could safely be ignored. In a 2006 Open University textbook called Science, Technology and Culture, David Bell, who teaches cultural studies at Manchester Metropolitan University, accused Alan Sokal of ‘boundary-policing’ and ‘trampling all over cultural theory’.10 He wrote: ‘Sokal’s point was an old one: leave science to the scientists.’ Actually, Sokal was saying something subtly different: that if humanities academics are going to invoke calculus and quantum physics, they should know what they are talking about. In most cases, they understand these subjects, if at all, only at the level of popularizations. In some universities, students are being offered two quite separate varieties of pseudoscience: the quasi-scientific theories of literary scholars, and the bogus claims of alternative medicine.
How can universities and other public institutions get away with promoting counterknowledge so brazenly? One answer is that in some respects they are still as powerful as ever. The revolution of the 1960s and 1970s may have persuaded them to surrender many of their traditional responsibilities, but it left much of their internal bureaucratic authority intact. Universities, government departments, local education authorities and publicly funded broadcasters are still run along autocratic lines, though that autocracy is disguised. Their administrators still want to challenge ‘elitist’ attitudes by promoting ‘counter-hegemonic narratives’ based on wish-fulfilment rather than data. But, paradoxically, they do so by exercising their own hegemony. And so it becomes official policy to promote versions of history that bolster self-esteem rather than convey useful facts. Passion passes for rigour; counterculture has turned into counterknowledge.
But it would be wrong to blame everything on ageing hippies. Another reason why institutions are able to promote bogus scholarship is that both counterculture and counterknowledge are surprisingly at home in the capitalist free market. The counterculture arose out of modernity, whose main driving force is capitalism. As long ago as 1942, the economist Joseph Schumpeter described ‘creative destruction’ as the essential fact about capitalism. The market is a tremendously radical agent of change; it does not just produce price competition, but also new commodities, new technologies and new organizations that strike at the very foundations of the old system.11 The middle-class campus radicals of the 1960s and 1970s imagined that they were dismantling capitalism; actually, they were themselves products of a consumerist society, and their narcissistic ‘alternative’ culture further stimulated consumer demand.
Once people are encouraged to redefine themselves, they need goods and services to help them construct their new identities. Afrocentric pseudoscholarship is tied in to a multi-million-pound race-relations industry that receives massive sponsorship from big corporations. Likewise, as we saw in the previous chapter, would-be practitioners of the new ‘science’ of nutritionism are batted back and forth between the University of Bedfordshire and Patrick Holford’s business empire, paying fees to both.
The free market likes counterknowledge. The troubled newspaper industry–all of it, not just the tabloids–increasingly relies on fascinating but untrue stories to sell papers. Specialist reporters are becoming an expensive luxury; and it is a brave young reporter who refuses to ‘follow up’ a report that has appeared in a rival publication simply because it is based on sloppy research. In particular, there is nothing like a health panic to boost circulation. ‘Dangerous’ foods and medicines pose an irresistibly scary threat to the modern project of the self; and, conversely, ‘superfoods’ offer a secular sacrament of redemption, an outward sign of inward grace. Counterknowledge, unconstrained by inconvenient facts, enables the media to repackage real life into ‘real-life dramas’ and history into ‘mysteries’. Fact is presented to us as entertainment–and, increasingly, though we may not be aware that it is happening, entertainment is presented to us as fact.
It is worth pointing out, however, that many of the most ridiculous non-stories in the press are not invented by journalists but derived from newly published books. The readiness of the world’s most distinguished publishing houses to make money from pseudohistory and quackery is particularly sad, since not so long ago these imprints performed the crucial role of closing the gap between inaccessible scholarship and the reading public. Of course, many titles still do this. But these days the presence of a prestigious logo on the spine of a book is no longer a guarantee of quality or even of truthfulness. If people are prepared to pay £10.99 to discover that the northern border of the state of Israel, 33° north of the equator, was decided by Thirty-third Degree Freemasons–one of the claims in Graham Hancock and Robert Bauval’s Talisman12–then the publisher will happily take their money.
The standards of academic publishers have proved similarly flexible. As we noted earlier, the distinguished British publisher Routledge was happy to publish Molefi Kete Asante’s History of Africa (2007), which does not provide references for its pseudohistorical claims about ancient Egypt. This was not Routledge’s first brush with pseudoscholarship. In 2002 it published Attachment, Trauma and Multiplicity, a volume of essays edited by Valerie Sinason on the subject of ‘dissociative identity disorder’ (DID), the new name for multiple personality disorder.13 Only a quarter of American psychiatrists (and even fewer outside America) believe that DID even exists.14 Sinason, a psychoanalyst, dismisses their criticisms, just as she dismisses non-believers in the ‘peacetime Auschwitz’–her phrase–of satanic ritual abuse. Sinason claims to have ‘clinical evidence’ of satanists practising infanticide and cannibalism; on closer inspection, however, the evidence consists of nothing more than her patients’ ‘memories’.15 Attachment, Trauma and Multiplicity contains an essay by Dr Joan Coleman, coordinator of RAINS (Ritual Abuse Information Network and Support), who refers to the ‘many’ DID sufferers who were ‘brought up in families that had practised satanism through several generations’. No evidence is offered for this claim.16 Coleman also believes that many abusers have ‘Masonic connections’.17
The reason this trahison des clercs matters so much is that, despite the privatization of knowledge caused by the explosion of intellectual choice, Western society still has such a thing as the public domain. Broadly defined, this is a place where ideas no longer carry the copyright of their inventors but are part of our shared culture. The consequences of counterknowledge finding its way into this arena can be very serious. The reason millions of parents have had to worry about the MMR jab is that Dr Andrew Wakefield managed to catapult his batty private theory about autism into the public domain. And there it stayed, thanks to medical professionals, science writers and journalists who failed to expose Wakefield’s terrible methodology or publicize the large-scale studies contradicting his claim.
This intellectual sloppiness is more scandalous than the lies and half-truths that politicians tell the public. Indeed, it may even have contributed to a culture of political mendacity. In his book The Rise of Political Lying, Peter Oborne argues that the presence of ‘shameless, habitual liars at the centre of power’ during the Blair years was without precedent in modern British politics.18 Francis Wheen, attacking Tony Blair in his book How Mumbo Jumbo Conquered the World for allowing Creationism to be taught in a state school, suggested that the prime minister ‘had been infected (however unwittingly) by the cultural, moral and intellectual relativism of the postmodernists, and by the fashionable disease of non-judgmentalism’.19
In the final analysis, however, we elect politicians to improve our lot, not to educate us. If we catch them lying about something important, then the punishment is often merciless; but as a general rule we do not take their representations of reality at face value, any more than we believe in the literal truth of advertisements. Most Western politicians do not systematically disseminate counterknowledge; indeed, one of the signs that a politician has left the bounds of democratic discourse is that he deals in pseudohistory or pseudoscience (by, for example, proclaiming the racial inferiority of his enemies). In contrast, we do expect university lecturers, schoolteachers, doctors and other professionals to help us distinguish between a fact and an unproven hypothesis.
The methodology of the Enlightenment was under assault from capitalism and the counterculture long before most of us had internet access. But, inevitably, the new medium has speeded up the privatization of knowledge, and increased our ability to incorporate elements of fantasy into our ‘work in progress’. Thanks to the internet, millions of people have unconsciously absorbed postmodern relativism. To adapt the old Scientology slogan: if it’s a fact for you, it’s a fact. And your computer will hook you up with people who share your views, however ludicrous.
Sane people do not normally choose to believe things that contradict the direct evidence of their own senses. You do not decide that you are the president of the United States (unless you are). You may, however, choose to believe that the president ordered fake terrorists to fly two airliners into the World Trade Center. That choice involves, in effect, sticking your fingers in your ears and singing ‘la, la, la, I can’t hear you!’ when anyone points out the falsehoods upon which Loose Change is based. Unfortunately, it is also an easy option to take. Just bookmark your favourite ‘9/11 Truth’ websites and join one of Facebook’s Loose Change groups to link up with like-minded conspiracy theorists. Once again, Professor Jodi Dean is on hand to offer moral support. As she wrote on her blog on 28 January 2007: ‘Ultimately, the 9/11 truth phenomena indicate what happens to credibility under the conditions of the lack of symbolic efficiency: there isn’t a signifier strong enough to hold together a discourse within which credibility might emerge.’20
Andrew Keen, a former digital entrepreneur who has turned into one of the most ferocious critics of the internet, deplores what he calls the ‘flattening of truth’ in cyberspace. ‘Today’s media is shattering the world into a billion personalized truths, each seemingly equally valid and worthwhile,’ he writes. ‘This undermining of truth is threatening the quality of civil public discourse, encouraging plagiarism and intellectual property theft, and stifling creativity…Instead of more community, knowledge or culture, all that Web 2.0 really delivers is more dubious content from anonymous sources, hijacking our time and playing to our gullibility.’21
The flattening of truth is destroying the critical faculties of young people, says Keen. ‘These days, kids can’t tell the difference between credible news by objective professional journalists and what they read on joeshmoe.blogspot.com. For these generation Y utopians, every posting is just another person’s version of the truth; every fiction is just another person’s version of the facts.’22
In his book The Cult of the Amateur, Keen proposes all sorts of regulatory measures to address the ‘democratized chaos’ of cyberspace, ranging from legislation aimed at protecting traditional media to moving children’s computers out of their bedrooms and into the living room. In reality, it is unlikely that any of these measures would work. Attempts to control the flow of information on the internet are doomed to failure; its infrastructure is simply too complex, and generation Y would enjoy nothing more than running rings around cyber-police. In any case, although middle-class students may be the target audience of manufacturers of internet conspiracy theories, they are not necessarily the people most vulnerable to them.
In the long term, the real menace of the internet is its ability to carry the virus of counterknowledge to societies that are not protected by evidence-based methodology. Take the example of pseudoscientific ‘cures’ for AIDS in South Africa. For years, President Thabo Mbeki has done his best to hinder the distribution of antiretroviral drugs to his country’s 5.5 million HIV-positive citizens; he has described these medicines as ‘damaging’ and ‘toxic’ and hinted that the CIA is implicated in the epidemic.23 In doing so, he has received wholehearted support from Western purveyors of counterknowledge, known as ‘AIDS denialists’, who dismiss the conclusive evidence that HIV causes AIDS.
The world’s leading denialist is Peter Duesberg, a molecular biologist who argues that to prevent AIDS, and even cure the disease, it is necessary only to eat properly and abstain from toxic drugs. The American government’s top AIDS adviser, Anthony Fauci, takes a different view, as the New Yorker reported in March 2007. After listening to Duesberg speak at an AIDS research conference, the normally mild-mannered Fauci erupted. ‘This is murder,’ he said. ‘It’s really that simple.’24 In the late 1990s, Mbeki discovered Duesberg’s work on the internet and subsequently appointed him to a presidential advisory panel.
Mbeki’s support for Duesberg opened the door to a whole range of counterknowledge entrepreneurs selling their own treatments for AIDS. One of the most controversial is Matthias Rath, a German physician and vitamin salesman who urges people to take high doses of multivitamins instead of antiretroviral drugs. Patrick Holford, you may recall, is on record as saying that vitamin C is as effective as AZT, though he has never elaborated on this brief statement, even during his promotional tour of South Africa. The appalling Rath, on the other hand, held a press conference in Cape Town in 2005 at which he stated that ‘the course of AIDS can be reversed naturally’.25
The South African government, under pressure from the United Nations, is now retreating from its public support for AIDS denialists. But it is doing little or nothing to stop quacks spreading their deadly message; as recently as February 2007, the Rath Foundation helped run a workshop on fighting AIDS at the state-funded University of KwaZulu-Natal.26 Meanwhile, South Africans in the countryside are slowly gaining access to the internet, where ever more extreme theories are flourishing. One strain of denialism insists that there is no AIDS epidemic in Africa at all. A website called Virusmyth.net links to hundreds of articles written by pseudoscientists who believe that ‘the virus called HIV is harmless and not sexually transmitted; it probably has toxic causes’. At the top of the home page a banner proclaims: ‘Support President Mbeki to find the truth about “AIDS”.’27
The conjunction of modern counterknowledge and traditional superstition in sub-Saharan Africa threatens the lives of millions of people. But it is easy to put to the back of our minds because it has few immediate consequences for the West. The conjunction of counterknowledge and Islam, on the other hand, creates intractable problems both for Muslim countries and for Europe, where the number of Muslims is expected to double from 20 million to 40 million by 2025.28 As we have seen, the penetration of the Islamic world and diaspora by conspiracy theories and Creationism is terrifyingly high: most Muslims worldwide believe that Arabs were not involved in 9/11; only 29 per cent of British Muslims believe historical accounts of the Holocaust; more than 90 per cent of Muslims worldwide reject the theory of evolution. And these figures are almost certainly being pushed up, not down, by modernity in the shape of digital technology.
Many commentators talk in terms of disaffected Muslim youth being further radicalized by their encounter with the internet. But if we focus too narrowly on ‘cyber-jihad’, as it is sometimes called, we miss the bigger picture. Islamic culture is unable to defend itself against counterknowledge because it has failed to keep up with the intellectual advances of Western civilization. Pervez Hoodbhoy, professor of physics at Quaid-e-Azam University in Islamabad, has stated bluntly, and bravely, that no Muslim country–not one–has a viable educational system or a university of international stature. ‘Although genuine scientific achievement is rare in the contemporary Muslim world, pseudoscience is in generous supply,’ he wrote in the Washington Post in 2002.
A former chairman of my physics department in Islamabad has calculated the speed of heaven. He maintains it is receding from Earth at one centimetre per second less than the speed of light. His ingenious method relies upon a verse in the Islamic holy book, which says that worship on the night on which the book was revealed is worth a thousand nights of ordinary worship. He states that this amounts to a time-dilation factor of 1,000, which he puts into a formula of Einstein’s theory of special relativity…
One of the two Pakistani nuclear engineers who were recently arrested on suspicion of passing nuclear secrets to the Taliban had earlier proposed to solve Pakistan’s energy problems by harnessing the power of genies. He relied on the Islamic belief that God created man from clay, and angels and genies from fire; so this highly placed engineer proposed to capture the genies and extract their energy.29
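It is worth pausing over the arithmetic, if only because the calculation fails on its own terms. As a back-of-the-envelope check (assuming, purely as a hypothetical reconstruction, that the ‘formula of Einstein’s theory of special relativity’ in question is the standard Lorentz time-dilation factor), the professor’s figures imply:

\[
\gamma \;=\; \frac{1}{\sqrt{1 - v^{2}/c^{2}}} \;=\; 1000
\quad\Longrightarrow\quad
c - v \;\approx\; \frac{c}{2\gamma^{2}} \;\approx\; 150\ \text{m/s}.
\]

A time-dilation factor of 1,000 corresponds to a recession speed roughly 150 metres per second short of the speed of light, not one centimetre per second; to yield the latter figure, the factor would have to be about 120,000. Even judged by its own pseudoscientific lights, the sum does not add up.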
Ziauddin Sardar, an influential left-wing Muslim journalist based in London, has observed despairingly that, since the 1990s, Islamic countries have moved away from scientific methodology and towards dangerous obscurantism. ‘The Islamic science discourse now follows the way of the Taliban,’ he writes.
According to Sardar, Saudi Arabia has been pouring money into attempts to prove that all modern scientific discoveries were foreshadowed in scripture:
This tendency has spawned a whole genre of apologetic literature (books, papers, journals) looking at the scientific content of the Qur’an. Relativity, quantum mechanics, big bang theory, embryology and much of modern geology have been ‘discovered’ in the Qur’an. Conversely, ‘scientific’ experiments have been devised to discover what is mentioned in the Qur’an but not known to science–for example, the programme to harness the energy of the jinn that enjoyed much support in the mid-nineties in Pakistan. This highly toxic combination of religious fundamentalism and ‘science’, akin to the Creationists…attacks anyone who shows a critical or sceptical attitude towards science and defends its own faith as scientific, objective and ‘rational’. Unfortunately, it is now the most popular version of ‘Islamic science’.30
Hoodbhoy and Sardar have taken considerable risks by drawing attention to this assault on intellectual progress by counterknowledge. Yet many non-Muslim commentators consider it ‘inappropriate’ to make a fuss about what is happening, preferring to concentrate on safer targets such as American fundamentalists. In 2007, Scientists Confront Intelligent Design and Creationism, a volume of essays edited by Andrew Petto and Laurie Godfrey, attacked one of the most insidious scientific fallacies of the twenty-first century. Nowhere in its 450 pages is there a discussion of Islamic Creationism. Nor is there a single reference to Turkey, the country that is fast overtaking the United States as the major source of Creationist propaganda.31
History as well as science is under assault from Muslim counterknowledge. In December 2006, the Islamic Republic of Iran organized a conference entitled ‘A Review of the Holocaust’ in Tehran, where participants included the notorious French Holocaust revisionist Robert Faurisson and the far-right American demagogue David Duke.32 One of the attendees was Shiraz Dossa, a tenured professor of political science at St Francis Xavier University, Nova Scotia. In the June 2007 issue of the Literary Review of Canada, Dossa attacked the ‘illiterate Islamophobes’ who had criticized him for going to the conference. Their arguments were based on two fallacies, he said. The first was the belief that President Ahmadinejad had ever called the Holocaust a myth; according to Dossa, the president had merely questioned its ‘mythologizing and sacralizing’ by supporters of Israel. Presumably, Dossa had missed Ahmadinejad’s speech in the city of Zahedan in December 2005 in which he denounced ‘the myth that Jews were massacred’.33
Dossa continued: ‘The second western fallacy is that the event was a Holocaust-denial conference because of the presence of a few notorious western Christian deniers/skeptics, a couple of a neo-Nazi stripe. It was nothing of the sort. It was a Global South conference convened to devise an intellectual/political response to western-Israeli intervention in Muslim affairs. Holocaust deniers/skeptics were a fringe, a marginal few at the conference…Out of the 33 conference paper givers, 27 were not Holocaust deniers, but were university professors and social science researchers from Iran, Jordan, Algeria, India, Morocco, Bahrain, Tunisia, Malaysia, Indonesia and Syria.’34 Dossa did not say whether he had time to examine one of the exhibits at the conference, brought along by the ‘historian’ Frederick Toeben, who has served a prison sentence in his native Germany for denying the Holocaust. According to the official Iranian news agency, this was ‘a huge model of the Treblinka extermination camp, complete with model trains and human figures, which he said he would use to argue that the gas chambers did not exist’.35
Even if we forget for a moment the vile nature of the counterknowledge on display, it is profoundly depressing that academics from so many countries felt that their ‘Global South’ identity freed them from Western standards of evidence. There is an analogy with conferences on AIDS held by denialist governments in sub-Saharan Africa in recent years; at these gatherings, it was considered unnecessary and even patronizing to subject traditional remedies to rigorous empirical tests. The truth of a claim depended on its provenance and associations, not the evidence of the senses.
The future of global counterknowledge is hard to predict; we could be moving towards a world in which ideological and even religious struggles are replaced, or at least shaped, by battles between real and confected evidence. The endless adaptability of digital technology, coupled with the large-scale migration of populations from Africa and the Middle East to Western Europe, will make such conflicts very hard to contain. Fortunately, however, we do not yet live in a society in which cyberspace has abolished the distinction between public and private knowledge, or in which globalization has destroyed the relationship between local institutions and the circulation of knowledge. As I suggested at the beginning of this chapter, the West can, and must, take preventive measures against counterknowledge.
One measure in particular springs to mind. In the last couple of years, counterknowledge has proved surprisingly vulnerable to guerrilla attacks from the blogosphere. Freelance defenders of empirical truth, armed to the teeth with hard data, have mounted devastating ambushes on quacks and frauds who have ventured too far into the public domain. The tactic is an antiretroviral rather than a vaccine, and too modest in scope to effect dramatic change in society, but it does seem to work. The lives of celebrity pseudoscientists have been made an absolute misery by Bad Science, HolfordWatch, the Quackometer blog and David Colquhoun’s Improbable Science website. Reputations are easily damaged in a furiously competitive market, and people rather enjoy the spectacle of smug, rich lifestyle gurus being humiliated.
To go back to an earlier example, if I mention the name Gillian McKeith to my friends, they reply: ‘Oh, you mean “Doctor” Gillian McKeith,’ wiggling their fingers to signify inverted commas. For that, we once again have to thank Ben Goldacre, whose sustained mockery of McKeith’s ‘PhD’ on his website led one of his readers–‘angry nerds’, as he calls them–to report her to the Advertising Standards Authority. As a result, she can no longer call herself Dr McKeith when advertising her products. ‘I can barely contain my pride,’ wrote Goldacre. ‘Is it petty to take pleasure in this? No. McKeith is a menace to the public understanding of science. She seems to misunderstand not nuances, but the most basic aspects of biology–things a 14-year-old could put her straight on.’36
Guerrilla bloggers have also managed to focus attention on the incredibly poor standard of science and health reporting across British journalism. By general consent, the worst offender is the Daily Mail, which reports implausible ‘cures’ for conditions such as dyslexia and ADHD alongside scary stories about the electromagnetic dangers of household appliances. But the most irresponsible piece of science reporting in 2007 came from the upmarket, left-leaning Observer. On 8 July it led on a story headed ‘New Health Fears over Big Surge in Autism’. This claimed that new research had found an increase in the prevalence of autism from one in a hundred to one in fifty-eight; that the lead academic on this study was so concerned that he suggested raising the finding with public health officials; and that the two leading researchers on the team believed that the rise was due to the MMR vaccination.
As Goldacre pointed out on his blog and in the British Medical Journal, all three claims were wrong. The one in fifty-eight figure was speculative; the academic in question, Professor Simon Baron-Cohen, had made no such suggestion; and the ‘leading researchers’ were research assistants, one of whom, Dr Fiona Scott, was so convinced there was no link between the MMR vaccine and autism that she arranged for her own child to have the triple jab.37 The Observer was hugely embarrassed by these revelations and removed the story from its website. The Guardian, meanwhile, gleefully ran a piece reporting the controversy. (Although the two newspapers are part of the same media group, there is little love lost between them.) On 22 July 2007, the Observer finally published a ‘clarification’. In it, Dr Scott was quoted as saying that ‘it is outrageous that the article states that I link rising prevalence figures to use of the MMR. I have never held this opinion. I wholeheartedly agree with Professor Baron-Cohen that the article was irresponsible and misleading.’38
The Observer’s reputation for accurate science reporting will take a long time to recover from its flirtation with counterknowledge; for years to come, internet searches for MMR will point readers in the direction of its blunders. That is encouraging. We need a whole generation of digital warriors who know how to damage the professional reputations of institutions and people who propagate counterknowledge.
A good place to start might be the University of Westminster, which received some bad publicity as a result of the Nature report into its degrees in homeopathy and other quack therapies, but really deserves to become a national laughing stock. Indeed, so long as it continues to inform students that sugar pills have magical properties, the word ‘University’ in its title deserves the same inverted commas as Gillian McKeith’s doctorate. The new vice-chancellor of Westminster is Professor Geoffrey Petts, a former professor of physical geography at Birmingham University, who was awarded a medal by the Royal Geographical Society for his work on river conservation. How does he reconcile his own rigorous methodology with running an institution that teaches voodoo science? What does he think a vice-chancellor is for, if not to ensure that his institution teaches facts as opposed to untruths? Let us ask him.
Likewise, respectable publishers who commission works of bogus scholarship should be persecuted mercilessly. What the field needs is its own Ben Goldacre to make the acquiring editors and houses that publish titles like 1421 synonymous with bogus history. Then they and other senior publishing executives might think twice before bringing out another ludicrous fabrication, however profitable. They might be surprised by how little support they receive from junior colleagues. I have spoken to several young editors, including one former copy editor for Graham Hancock, who are privately disgusted by the material they are expected to process. Many bookshop staff feel the same way; I still regret not taking up the offer of a Books etc. sales assistant who told me he would look the other way if I chose to shoplift a copy of ‘that disgusting trash’, 1421.
We can almost certainly do nothing about the circulation of counterknowledge on the internet. The fragmentation of shared knowledge into personalized truths began centuries ago; digital technology has merely speeded it up. But this inevitable shift from fate to choice does not relieve us of the responsibility to base judgements on the evidence of our senses. On the contrary, it makes it all the more important to preserve the notion of a public domain in which, to quote the Oxford English Dictionary, a fact is ‘a thing that is known to have occurred, to exist, or to be true’. We must hold to account the greedy, lazy and politically correct guardians of intellectual orthodoxy who have turned their backs on the methodology that enables us to distinguish fact from fantasy. It will be their fault if the sleep of reason brings forth monsters.