“I believe because it is absurd”; or, Pseudoscience
The Stars Down to Earth
In a letter to Louis Bourguet of 1714, Leibniz famously writes, “I despise almost nothing—except judiciary astrology.”1 For him, the advancement of any science or discipline is directly connected not just with discovery and theory, but also with the creation of a proper institutional structure for the facilitation of discovery and the production of theories. Leibniz understood, in his medical and epidemiological writings beginning from the 1690s, that processing data about past epidemics would be a far better tool for anticipating future ones than would be more traditional varieties of fortune-telling: thus, effectively, that retrodiction can be, when coupled with evaluation of statistical data, a powerful tool for prediction. And he saw, moreover, that this could be done with the help of machines, along with the collective labor of employees of state-sponsored institutions.
In spite of his disdainful remark, what Leibniz envisions might in fact be understood as something closer to an improvement, by a change in its basic techniques, of the art of fortune-telling, including judicial astrology, rather than an abandonment of it for more mature intellectual endeavors altogether. Divination of all varieties, whether astrology, tasseomancy (from tea leaves), or astragalomancy (from dice or knucklebones), may appear from the point of view of science to be the very height of irrationality. In fact, however, divination bears an important genealogical and conceptual relationship to scientific experiment as it develops over the course of the early modern period, and also has an important connection to the history of computing or reckoning.
We may think of mantic practices, such as those imposed upon the feeding schedule of Paul the Octopus, or of divination in the most general sense, as the use of experimental techniques under controlled conditions in order to either predict the future or decide on a particular course of action. Today there is a great variety of machines that purport to tell us, either truthfully or not, about the course our future will take. All of these machines are built, more or less, on the same mechanical principles as Blaise Pascal’s eponymous “Pascaline” device or Leibniz’s stepped reckoner. Some of them, such as the “love meter” or the online personality quiz, are patently fraudulent, while others, such as the online credit-rating service, somewhat more plausibly purport to be able to determine our future fates, based on the fact of who we are at present, through the accrual of our past actions. We may ask, however, whether an anthropologist external to our culture would, in studying us, be able to make sharp distinctions among the horoscope, the personality quiz, and the credit rating, or indeed whether we ourselves clearly understand how they differ. In some parks in the cities of Eastern Europe you can still find standing scales for weighing yourself, and thus for getting a report on a certain factor of your physical health, positioned right next to automated fortune-telling machines. Here the side-by-side positions of the scientific instrument and the mantic apparatus cannot but reveal to us their shared pedigree.
We may not ever, in fact, have been perfectly clear on the boundary between computation and divination. When Leibniz implored his contemporaries to “Calculate!” or to “Compute!” and suggested by this that he had, or was in the course of getting, some sort of engine that might reveal to them their proper future course, it would not have been out of line to interpret this as at least somewhat akin to a call to look into a crystal ball or to consult a chiromancer. We turn to machines to tell us what to do, and how things are going to be. We want the indications they deliver to us to be well founded, but we also want them to reveal fate to us, to mediate between us and the open future.
Divination, in short, is an ancestor of computation. Both are projections of how the future might be. The latter sort of projections are based on rigorous data crunching that takes into ample consideration how the world has been up until now. The former sort also look at the world in its present state, how things have settled into the present moment—how tea leaves have arranged themselves, how the heavens have turned, whether the birds are taking sudden flight or staying put in the fields. They do so, generally, in a piecemeal and impressionistic way, and read past and present signs from one domain of nature into another, or from nature into human affairs, in a way that strikes us today as unjustified. But the shared ancestry is unmistakable.
Yet just as by the late nineteenth century the unity of science and faith in the programs of such comprehensive thinkers as Boyle or Goethe would be largely forgotten, the common ancestry of divination and computation would also, by around the same time, be more or less occluded from memory. By the twentieth century, science was for serious people, astrology for dupes. Or worse, astrology was for the useful idiots of fascism. While still in exile in Los Angeles in the early postwar years, Theodor Adorno took an interest in the peculiar tradition among American newspapers of including horoscopes for their readers, who thereby ostensibly learn of their near-term fates on the basis of the star sign that governs their date of birth. The result of this interest was The Stars Down to Earth, Adorno’s study of the horoscope section of the Los Angeles Times over the course of several years in the 1950s.2 He rightly saw these horoscopes as a drastically etiolated version of what would have been available to a practitioner of the art of horoscopy during the historical era in which astrology remained a meaningful, rich, and all-encompassing field of inquiry and explanation. To criticize the horoscopes in the Los Angeles Times is one thing; to criticize those of John Dee or any other Renaissance magus, or indeed of Galileo himself, who made a respectable income casting horoscopes alongside his more properly astronomical work, is another very different thing. To gloss over the differences, to take this sort of exercise as timelessly and context-independently irrational, is to overlook the ways in which different valences can come to attach to the same practices in different places and times.
For Adorno, midcentury American horoscopy, as well as the broader incipient New Age culture this heralded, was a subtle expression of a fascist tendency, to the extent that it involved submission to an abstract authority in the search for answers to life’s deep questions, rather than any effort to critically reason through one’s own life and options. A horoscope is not, for Adorno, what its enthusiasts today so often claim: “harmless fun.” Horoscope readers who provide this defense will often claim that they do not necessarily believe what the horoscope says, and, moreover, that one does not have to believe it in order for it to retain its power to amuse and distract. This defense is typically proffered as a way of assuring skeptical friends that it is not really so irrational to read one’s horoscope after all, that one can do so while still retaining one’s sharp critical sense. But it is even worse, Adorno thinks, to submit to abstract authority one knows to be empty. After all, if we sincerely believed that astrology offers the best, most state-of-the-art explanation of the causal links between celestial bodies and the biological and human world of the terrestrial surface, then the appropriate thing would be not just to read it “for fun,” but to read it and then to structure one’s life around it. To do so would at least have the virtue of conviction.
One of the remarkable features of the horoscopes in the Los Angeles Times, Adorno noted, is that they did virtually nothing to account for these purported causal links. They simply stated, without context, without detail, without any insight into the cosmology of the people who came up with horoscopy in the first place, that if you were born on such and such a date, such and such suitably vague things will probably befall you—Astra inclinant sed non necessitant, as the old saying went, the stars incline but do not necessitate, and therefore any horoscopic prediction that fails to arrive cannot be subject to empirical disconfirmation. It is thus not enough, as a plea for understanding, for a reader of the Times horoscopes to say, “I just enjoy astrology!” For a reader of these horoscopes cannot really enjoy astrology, as he lacks the necessary historical curiosity and imaginative resources to do so; he cannot work himself into a position in which the correlation of individual fates with the configuration of the stars and planets might actually mean something, might contribute to a sort of self-actualization, the cultivation of a life praxis, rather than simply signifying submission to the voice of an anonymous authority in an establishment newspaper.
Now, again, quite a bit has changed in the United States since the 1950s. For one thing, no American media consumer has the option of submitting to the abstract authority of the voices emanating from establishment news sources, as there are no such sources, but only media that fit or do not fit with our own preferred media profile, with the much-discussed bubble we each create for ourselves with the help of social media and of the glut of choices offered by cable or satellite television. The Los Angeles Times is rapidly downsizing, laying off its core staff, who are for their part taking to Twitter in a desperate struggle to stay relevant. Meanwhile there are now horoscopists who write for a self-styled thoughtful, independent-minded, and skeptical audience (e.g., the syndicated author Rob Brezsny), and others who write for specific, finely focused demographics. And most recently—as if at long last explicitly reuniting the lineages of divination and computation, which we traced back at this chapter’s beginning to their original unity—internet users are now able to consult “algorithmic horoscopes.” As Amanda Hess has noted, “A.I. and machine learning can churn out predictions at speeds unmatched by flesh-and-blood astrologers.”3
Interestingly, while in general Republicans are less science-literate than the broader American population, they are somewhat less likely than any other group, liberal Democrats included, to believe that astrology is “very or sort of scientific,” according to a 2012 survey.4 The most prominent conservative media outlets in the United States, such as Fox News and Breitbart, do not feature astrology. This divide along political lines probably has to do with the perception of astrology as a pagan tradition (though of course it was practiced and promoted by members of the church for many centuries). Yet there are also astrologers out there ready to cater to consumers with a “family values” sensibility, or with a love of free markets. And again, these distinctions are extremely fluid. In recent years we have seen Tea Party demonstrators advocating holistic medicine, including traditional Chinese medicine and other cross-civilizational borrowings, as an inexpensive alternative to modern medical care for the uninsured.5 In the future there is no reason why self-styled conservatives should not also turn, or turn back, to astrology.
Whatever may be the accuracy of Adorno’s analysis of American horoscopy in the 1950s, there does not seem today to be any simple submission to abstract authority in the current “harmless fun” of astrology. There is, rather, conscious and elaborate identity construction, in which the sort of horoscope one reads is just one part of a suite of choices that also includes the clothing one wears and the music one listens to, all of which, together, signal what kind of person one is. In the United States today, such signaling is generally inseparable from the matter of which side of the tribalist culture wars one identifies with. Pace Adorno, it seems likely that this fragmentation itself, rather than the role that horoscopes play within it, is the more disconcerting sign of incipient fascism.
We are here one step further removed than the Los Angeles Times of the 1950s from the lifeworld of John Dee or Galileo, in which astrology presented itself as something to believe, something that genuinely helped to make sense of the world and of our place in it, rather than making it more difficult to do so. And yet even here there remains a faint but unmistakable link to the deeply human, and even extrahuman, effort to orient in the world by reference to the fixed points of the celestial spheres (dung beetles, too, it turns out, navigate by the Milky Way).6 We admire the stability and regularity of the heavens, and are prone to imagining that whatever share we have of stability and regularity in our chaotic, terrestrial, mortal lives is somehow borrowed from them. For this same reason, we still take exceptional astronomical events as significant, as momentous in ways that cannot be fully explained by their observable effects.
In 1997, thirty-nine members of the Heaven’s Gate cult committed suicide together on the occasion of the approach to earth of Comet Hale-Bopp, which, their leaders claimed, was in fact an extraterrestrial spacecraft. Alan Hale, one of the comet’s two discoverers, declared the following year: “The sad part is that I was really not surprised. Comets are lovely objects, but they don’t have apocalyptic significance. We have to use our minds, our reason.”7 Twenty years later, in August 2017, a total eclipse of the sun passed across the United States, from west to east. The path it followed matched the arc we might easily have imagined to be traced by an intercontinental missile fired from North Korea: entering American airspace in the Pacific Northwest, and moving from there to the south and east across the heartland. The eclipse coincided with extreme tension over a recent war of words between Donald Trump and North Korean leader Kim Jong-un, resulting from the latter’s recent successful test of long-range missiles, and from the accumulating proof that his regime would now be able to deliver a warhead to American soil. Many said it was the closest the world had come to a nuclear confrontation since the Cuban Missile Crisis. Meanwhile, domestically, a neo-Nazi rally occurred in Virginia, and the president utterly failed to distance himself from the ideology of the demonstrators. As a result he was abandoned by many business leaders who had previously attempted to abide and deal with his various flaws. He was again reprimanded by many within his own party, and speculation that his reign was bottoming out, though it had arisen many times before, seemed to be reaching a new, fevered intensity. (It did not, in fact, bottom out.)
It was inevitable that some would make a connection between the celestial and the terrestrial scales of events. It was jokingly said on social media, in countless variations, that the eclipse must somehow be a harbinger of the fall of the Trump regime. Less jocularly, rumors flew that it was a conspiracy, or that it would trigger events on earth leading to the collapse of power grids, or other apocalyptic scenarios. Experts who knew better than to stoke such fears nonetheless warned that human behavior during the eclipse, with millions of people displacing themselves in order to observe it, might have significant consequences for the environment and for civic stability. Whether joking, cautious, or ridiculous, American anticipation of the 2017 eclipse differed very little from what happened during the great eclipse of 1654, when the materialist philosopher Pierre Gassendi bemoaned the ignorance of all the doomsayers, and of their learned enablers such as Robert Fludd (who had died some years earlier, in 1637).
It is not that there is no progress, or that we are not getting closer to a correct account of how the world works. But we still get vertigo on glass-bottom bridges, we still fear strangers more than friends, and surely we are still unsettled when the sky turns black at noon. All of these reactions, moreover, are irrational in the narrow sense of a failure to make the right inferences from what we in fact know. Nor is it necessarily the case that the chatter and jokes and misinformed speculations surrounding the things that frighten us, the impoverished borrowings from the venerable astrological tradition, are all just so much noise. These are all expressions of irrationality, but they do not seem to be, as Adorno had thought of astrology, straightforward expressions of a desire to submit to abstract authority. They are the products of active searching, not passive acquiescence.
Let a Hundred Flowers Bloom
One consequence of the partition between art and science, considered in the previous chapter, has been the persistent proneness of science to infection and mutation, to meddling in its affairs by people who really do not know what they are talking about—people who are propelled forward by a moral conviction that this domain of human life, too, is theirs to play in, that the green lawn of science must not be roped off, transformed into a space that only the haughty college dons are permitted to cross.
The geneticist Kathy Niakan, who was the first researcher ever to gain ethics-board approval to conduct research on human embryos using CRISPR gene-editing technology, has explicitly compared the innovations made as a result of this research to those that came with fire, and with the internet.8 While we cannot possibly know all of the future applications of today’s innovations and discoveries, we have effectively no choice but to continue. For the moment, the mainstream research community is unanimous in the view that research for medical applications, such as improvements in assisted-fertility treatment, is salutary and should continue, while any research involving the creation of new immortalized germ lines—that is, cells that give rise to offspring that may then become part of the human species’ shared genetic profile—would amount to a Promethean ambition to be decisively rejected by any ethics board.9 Niakan asserts that the public frequently confuses these two sorts of research, and notes that it is largely as a result of this confusion that opposition to human embryo research is so widespread in public opinion. In fact, if it were left up to the public—that is, if it were an issue deemed to be worthy of democratic resolution—then Niakan would not have gained approval to conduct research on human embryos. She relies in her work on the approval of boards of experts, but not fellow citizens, and is grateful that this is the current arrangement where she works (in the United Kingdom).
Of course, the possibilities are not exhausted in the simple dichotomy between “expertocracy,” on the one hand, and putting the vote before an ignorant public on the other. Another possibility is informing the public to a point at which it is no longer ignorant, and then turning the decision over to it. But the deepening of the crisis of public ignorance that has come with the rise of the internet, and the simultaneous sharpening of opposed opinions among different camps of the public, make this alternative unlikely, and scientists such as Niakan are no doubt rational in their presumption that they must protect their work from public oversight. Niakan is hacking through nature’s thorns and, like Oppenheimer before her, seems to be aware that her work is kissing awake new powers. The moral stance she adopts seems to take for granted that human beings will do whatever they find they are able to do, and thus that new technologies are unstoppable. The best one can then do as an individual at the vanguard of these technologies is to use them responsibly, to satisfy well-composed boards of ethics, whose members qualify for membership not principally as ethicists, but rather as knowers of the relevant scientific facts.
There are certainly many issues that should not be put to the vote, often because it is unreasonable to expect that the public could acquire the relevant expertise. But as long as scientific progress depends on antidemocratic institutions, the halls of science will continue to be invaded by gate-crashers: amateurs who, despised by the experts, make up in passion what they lack in knowledge, and who are the closest thing in our era to the Goethean dream of a science that can still make room for sentiment. But if they fail to fully realize this dream, it is in part because our era has made little room for a cultivation of sentiment that is not at the same time a descent into unreason.
Since the nineteenth century, as we saw in the previous chapter, there has been an expectation that science must now keep to itself, as the domain of reason, while unreason is free to romp within the limited spheres of art, poetry, and the expression of personal faith. Now that the violence of their separation has been endured, it has generally been supposed, they may be seen as a sort of divided homeland, which, even if naturally and historically unified, must nevertheless be protected against any invasion of the one side by the other.
Of course, low-level incursions have been a near-constant reality since the original partitioning of the two magisteria. Consider the case of that great oxymoron that has served as a wedge issue in American politics for over a century: creationism, or, as it sometimes styles itself, “creation science.” There is no fixed, context-free reason why commitment to the recent extinction of the dinosaurs, within human history, should be a component of a politically conservative activist agenda. The particular political significance of a given belief of this sort is subject to perpetual change. In the early seventeenth century the “conservatives” reacted harshly to Galileo’s discovery of sunspots. The sun is a superlunar body, and thus is composed not of diverse elements, but of one element only, for otherwise it would be subject to decomposition, mortal and corruptible, as only sublunar bodies are. But if it has spots, then these could only be a sign of composition from at least two elements. Therefore the idea of sunspots is a heresy and must be condemned. Somehow this issue was resolved fairly quickly, and today no Republican politician in the United States has to pander to an antisunspot constituency, even as some lawmakers continue to pretend, even perhaps to pretend to themselves, that the best evidence does not speak in favor of our descent from a common ancestor with the chimpanzees. Things could have been otherwise. Things will be otherwise, soon enough. Soon enough, public figures will be pretending to believe some completely implausible thing they could not, deep down, really believe, and that we cannot, now, anticipate.
According to the anarchist philosopher of science Paul Feyerabend, the fluidity of the social role played by ideas extends to scientific rationality itself. Scientific rationality is an ideology, for him, and as such it had been a particularly powerful and life-improving one in the seventeenth and eighteenth centuries. But its greatest breakthroughs were made, even then, by drawing on traditions that lay far beyond the field of scientific respectability, not only by our own standards today, but by theirs. Thus, to cite Feyerabend’s preferred example, in shifting to a heliocentric model Copernicus did indeed have some historical precedents to draw on, but these came from unhinged numerologists and astrologers such as the fourth-century BCE Pythagorean philosopher Philolaus, and not from defenders of views one would have seen as safe or respectable by the late sixteenth century.10 Writing in the late twentieth century, Feyerabend concludes that scientific rationality has largely outlived its purpose, and it does better when it exists alongside competing ideologies. He declares that he would like to see more Lysenkos—that is, more people like Trofim Lysenko, the Soviet agronomist whom Stalin favored, for a while, in view of his empirically ungrounded claim that a new “proletarian science” could induce grains to grow in cold environments in ways that a strict Darwinian account of adaptation would not allow.11 Let Lysenkoism live, Feyerabend thought. And let astrology, holistic medicine, and creationism live too!
As already mentioned, in more recent years holistic medicine has been defended, as an expedient alternative to a national health-care system, by American Republicans intent on repealing Obama’s Affordable Care Act. In principle there is no reason why these same people should not also take up the cause of Lysenkoism or astrology, and to do so, moreover, not as side interests, but as a central part of their political program. Stranger things have happened. One is in fact strongly tempted to conclude that there is never any way of deriving or predicting the political uses to which a given scientific doctrine will be put, or the political opposition it will face, by simply studying the content of the doctrine itself.
Consider specifically the Creation Museum, opened in Petersburg, Kentucky, in 2007.12 It features displays inspired by classical natural history museums, but with a twist: its mission is to bolster, or to bring to life, an alternative account of the origins of the diverse species of the world, including dinosaurs, in terms that are compatible with a more or less literal understanding of the book of Genesis. It is in effect a simulacrum of a museum, an institution that reproduces the look and feel of a museum, but that has no real authority to explain the objects it puts on display.
The museum’s founder, Ken Ham, defends “young-earth creationism,” a strict version of creationist doctrine on which the scriptural account of creation is literally, rather than allegorically, true, and everything that paleontology, cosmology, and related sciences would explain on a scale of millions or billions of years must somehow be compressed into a history of not more than roughly six thousand years.13 For example, creation scientists have latched onto the phenomenon of rapid or “flash” fossilization, which does happen on occasion, leaving us the remains of prehistoric life forms that fossilized so quickly as to preserve skin and internal organs along with the bones or shells that are more commonly preserved.14 This possibility, along with such facts as the occasional discovery of a fossil in a stratum claimed by evolutionists to date from long before that species’ presumed existence, has enabled savvy creationists to develop an alternative account of the history of life on earth, according to which all events that mainstream science explains in terms of a geological timescale can in fact be explained on the much smaller scale of human history.
The key point to note here is that this approach implicitly accepts that the sort of reasoning and provision of evidence that have come to reign in scientific inquiry over the past centuries should not be abandoned, that the scientific method is worthy of respect. It accepts, in effect, that if you want your claims to be taken as true, you must prove that they are true by a combination of empirical data and valid inferences. The creationists have accepted the rules of the game as defined by the evolutionists. They have agreed to play their game on the home turf of science, and it is not at all surprising to find them here at their weakest.
Creationism has been gaining ground not only in the United States, but in many other countries throughout the world with a similarly strong streak of illiberalism and irrationalism in civic life. An interesting exception is East Asia, where the overall number of people who are uncomfortable with the idea of sharing a common ancestor with chimpanzees is lower than anywhere else in the world, quite apart from the nature of the political system or the freedom of the press that reigns in a given country. Turkey, by contrast, is one of the countries in which skepticism about evolution is even higher than in the United States. Some years ago, a charismatic cult leader, Adnan Oktar, also known as Harun Yahya, decided to take up the battle against Darwin, and found himself adapting many American Christian evangelical arguments and texts for a Muslim audience. This task was easier than one might expect, and one is struck by how closely his pamphlets—with their kitsch and childish illustrations of Noah’s Ark and other signal elements—resemble what we might just as well expect to find in Petersburg, Kentucky. Harun Yahya’s masterwork was his Atlas of Creation, the first volume of which was published in 2006 (that year I myself was mysteriously sent a complimentary copy of the original Turkish edition to my office in Montreal).15 In an amusing review of the work, Richard Dawkins noted that one of the supposed photographs of a caddis fly, meant to prove something about how currently existing species existed in what evolutionists wrongly take to be the distant past, was in fact an image of a fishing lure, copy-pasted from some online catalog for outdoor-sports equipment. One could distinctly see the metal hook coming out of it.16 This image may be thought of as the very emblem of the creationist movement: shabby, hasty, reliant on the assumption that its followers have no real interest in looking too far into the matter.
And yet the question naturally arises as to why they should go to the trouble at all of producing their simulacra of scholarly texts and august institutions, their “atlases” and “museums.” It is not as if no other model for religious faith has been defended since the beginning of the era of modern science. Already in the seventeenth century, Pascal articulated an account of religious faith on which it was its indefensibility in terms borrowed from reason that made it worth one’s total commitment. Much earlier, in the third century CE, the Christian apologist Tertullian had justified his commitment to the faith precisely in view of what he took to be its absurdity, leaving us with the stunning motto Credo quia absurdum: “I believe because it is absurd.”17 In the nineteenth century, again, the Danish philosopher Søren Kierkegaard articulated a vision of his own Christian faith on which this faith is strictly groundless, and on which its distinctive feature is that we come to our faith not through the persuasion of the intellect by reasons, but by an act of the will. For these thinkers, one does not defend religious faith against scientific reason by making the case that it is not absurd, or that its facts are better founded than the facts defended by science, but rather by embracing its absurdity as proof of its vastly greater importance than what may be comprehended by human reason. To make the case that faith is rational is for them self-defeating, quite apart from whether the case is convincing.
One might reasonably conclude that Tertullian and Kierkegaard have reflected somewhat more deeply on the nature of religious belief than Ken Ham has. The latter appears to take it for granted that assent to the truth of Christianity hangs on such matters as whether dinosaurs can be shown to have lived contemporaneously with human beings. This is somewhat as if one were to conceive of the problem of providing a proof for the existence of God in the way that someone might set out to prove the existence of Bigfoot. God will not leave clumps of hair or footprints; it is simply an inadequate understanding of the issue at hand—as it has developed over the course of the history of theology and philosophy—to take God and Bigfoot as relevantly similar, so as to warrant the same sort of proofs and reasoning regarding their similarly disputed existences.
Now, assent to the truth of Christianity in particular involves more complications than does assent to the existence of God, as critics of Descartes’s version of the ontological argument for the existence of God, for example, have noted: we might be able to prove the reality of some generic Supreme Being, but how this might compel us to accept, say, the Trinity or the truth of the Nicene Creed is not at all clear. Descartes pursued the matter through a priori reasoning, while Ken Ham wants to establish the truth of Christianity by empirical facts about fossils and so on, an approach that appears even more inadequate to the task at hand. Descartes can at least, perhaps, give us a generic Supreme Being by his a priori method. Ken Ham can only give us easily refutable empirical claims about the natural world, claims that cannot possibly be expected to ground transcendental commitments.
Skeptics and atheists, such as Richard Dawkins and other members of the “new atheist movement” (largely fractured and weakened in the era of Trump, when the great divide in our society no longer seems to be between the pious hypocrites and the up-front, morally balanced humanists), often suppose that the faithful are particularly credulous in their assent to belief systems that harbor blatant contradictions or absurdities: that God is both one person and three, for example. What they are missing is that it may well be not in spite of these absurdities, but rather because of them, that the doctrine is seen as warranting faithful assent. As ventured already in chapter 1, if there were no mystery at the heart of a religious doctrine, then the perfectly comprehensible facts that it lays out would likely grow less compelling over time. It is the mystery, the impossibility that is claimed as true, that keeps believers coming back, believing, not in the way that we believe that 2 + 2 = 4, or that humans and chimpanzees have a common ancestor, or that a clump of hair must have belonged to a Sasquatch, but in a way that is indifferent to the standards of assent involved in these latter sorts of claim.
Alternative Facts, and Alternatives to Facts
The way in which mystery—or, to speak with Tertullian, absurdity—generates a hold on followers of a religion is of course explicable in strictly sociological terms, and does not occur exclusively in social movements that are religious in the narrow sense, that is, in movements that make claims as to the nature of the transcendental realm. I have identified the Museum of Creation as a simulacrum of a museum. Another way to put this might be to say that it is an “alternative museum,” or, to deploy the most recent convention, an “alt-museum.” To describe it in this way is of course to highlight its illegitimacy. After Kellyanne Conway, Donald Trump’s then spokesperson, proposed in early 2017 that there may be “alternative facts,” this phrase was widely repeated, but more or less only by people who wished to denounce and to ridicule it.
To be fair to Conway, there are alternative facts, at least in one respect. As writers of histories know, the past contains infinitely many events. Every slice of time, in fact, in every sliver of the world, contains infinitely many. When we write our histories, then, when we periodize and narrate, we select some facts rather than others as being most pertinent to the account we wish to offer. The facts that we leave out—the infinitely many facts—are in some sense “alternatives”: we could have included them if we had chosen to do so, and others might do so in their own history of the same topic. Perhaps one should say that these other facts are “facts in reserve.” In any case, Conway was not wrong here, though it was easy to interpret her claim uncharitably, given that she was working for a regime that does habitually promote alternative facts in the stronger and more deplorable sense: facts that are not facts at all, but lies (to which we will turn in chapter 8).18
Ken Ham’s five-thousand-year-old dinosaur fossils are not, more properly speaking, alternative facts, but rather alternatives to facts. What are people doing, exactly, when they offer up these alternatives? It is difficult to be satisfied here by Harry Frankfurt’s famous analysis of “bullshit,”19 in its technical philosophical sense, as being distinct from a lie, in that the liar is concerned about the truth and hides it, while the bullshitter has lost all concern about the truth as an anchor for his claims, and wishes only to persuade. Alternative scientific claims such as those of Ken Ham are indeed made out of concern for the truth, and they are made with implicit knowledge of the fact that establishment science really does have something close to a monopoly here, really is getting things right in a way that the alternative scenarios do not.
The message of the Museum of Creation, on this reading, is not, then, that dinosaurs and human beings really did roam the earth together, but simply that we, creationists, reject your scientific account of things regardless of whether it gets the facts right, and the reason is that it does not speak to us as a community united by shared values. And yet, unable to fully understand that this is a question of values and not facts—unaware of the legacy in the history of theology, from Tertullian to Kierkegaard, of authors who have dealt profoundly with this distinction and come up with accounts of faith that are boldly independent of any countervailing factual claims—characters such as Ham do their feeble best to operate at the level of facts that they, likely, deep down, do not really believe. This is a species in the genus of irrationality, while bullshitting, however similar it may appear, is simply a moral transgression but not an intellectual failure. The successful bullshitter has not behaved irrationally; he has used what he knows to attain desired ends. The young-earth creationist is by contrast irrational to the extent that he does not fully understand what he is trying to do, what he is trying to defend, and he therefore sets himself up to lose in the long run. There is no plausible scenario on which he will be successful, on which he will achieve his desired ends.
If the attribution of disingenuousness to defenders of creation science seems unwarranted, perhaps it will be helpful to go a bit further afield and to consider an even more extreme strain of rejection of the modern scientific consensus: flat-earth theory. It is likely significant that the social movement made up of adherents of this view, while it has been around for several decades (in a 1968 book, the classicist G.E.R. Lloyd had occasion to say of Aristotle that he “was no flat-earther”),20 has enjoyed a spike in recruitment since Trump’s election. One suspects in fact that in multiple areas of social life, and not only in the political arena narrowly conceived, there has been an upping of the ante, or perhaps a widening of the so-called Overton window—a model of how the range of acceptable ideas in society shifts over time, developed in the mid-1990s by Joseph P. Overton, founder of the Mackinac Center for Public Policy—with the result that ideas once at the fringes of the public sphere have moved toward its center.21
Flat-earth theory is far more radical than even young-earth creationism (not to mention old-earth creationism or intelligent-design theory), in part because it makes claims about the present state of the world that one would think could be refuted by straightforward observation, while creationism simply offers an alternative account of how the present state of the world came about, and disputes the claims of evolutionists about past processes that none of us are able to observe directly. Standard flat-earth theory holds, for example, that the outer boundary of the disk of the earth is a great ice wall, and that nobody knows what lies beyond it. This claim alone is enough to signal that the theory is likely most attractive to people who, let us say, are not exactly in control of their own destinies, who might be called “low-will” on analogy to the description that political scientists have deployed of certain voters as “low-information.” By contrast a high-will individual who sincerely suspected that the disk of the earth is bounded by an ice wall would surely be able to pull together the resources to make an expedition and to observe the thing. Surely a conspiracy of this size, and a basic cosmological truth of this importance, would warrant staking it all, going into deep debt, mortgaging your home, in order to get to the bottom of things. Someone who could rest content with the ice-wall theory is someone who does not ordinarily think of him- or herself as in a position to solve matters of great importance once and for all. Someone else, somewhere, can do that, the flat-earther must think, just as forces somewhere else have passed off their sinister conspiracy on us.
The theory of the ice wall is one that makes a claim about how the world is at present, though of course flat-eartherism also reaches back, like creationism, into the past. It holds for one thing that NASA images of the earth from outer space are a hoax, and that those who run NASA and similar agencies are part of a global conspiracy to keep the masses in perpetual ignorance. In order to make sense of NASA’s dastardly scheme, whereby the commonsense obviousness of a flat earth is denied in favor of the counterintuitive theory of a round earth, one must also suppose that Kepler, Galileo, and even Aristotle were in on it too, since all of them claimed that the earth is round long before NASA came onto the scene. This must be an elaborate scheme indeed, to have been sustained for so long, in contrast with the scheme to convince us that human beings are descended from other animal species, which really came together only in the nineteenth century.
But the primary focus of the flat-earthers is an alternative interpretation of present sensory evidence. Unlike creationists, who tend to suppose that evolutionists are sincerely wrong, rather than being liars, flat-earthers take round-earth theory (as it were) to be a theory that is not really believed by its most active promoters, namely, the perpetrators of the NASA hoax. Moreover, to the extent that it is believed by the masses, this is only because of the manipulations of its elite promoters. Flat-earth theorists tend, in debate, to pass rather quickly from the details of the theory itself—the ice wall, for example, not to mention the epicycles in the orbits of the planets (for flat-earthers there are in fact round planets, but the earth is simply not one of them; it is not in fact a planet at all)—to discussion of the social and political dimensions of the conspiracy. One senses, in fact, that the commitment to the actual content of the theory—that the world is flat—is rather minimal, and that the true nature of the movement is that it is a protest, against elite authorities telling us what we must believe.
Feyerabend’s point about Copernicus drawing inspiration from the unscientific Philolaus might also be extended to Newton, whose intellectual character drew him to biblical numerology, among other fields. It may well be that if Newton had not been able to satisfy his curiosity in biblical numerology he would also never have succeeded in making the discoveries that the world would come to value. And likewise it is at least possible that today a young scientist on the cusp of some great breakthrough will be triggered into making it while watching a flat-earther’s video on YouTube, infuriated, perhaps, at how deeply wrong it is, and driven to epiphany as a result of this anger. But it also does not seem reasonable to place much hope in such an eventuality; on the contrary it seems very reasonable to seek to limit the proliferation of such videos, not by prohibition, of course, but by education, the cultivation of a level of scientific literacy in schoolchildren that would leave such videos without an audience.
One might reasonably expect that the popularity of flat-earth theory would sooner prevent breakthroughs than inspire them. These could well be breakthroughs that are still far from the cusp of being made, breakthroughs that would have been made, somewhat further off in the future, had some potential young scientist not been dissuaded from beginning to pursue a career in science after watching a video that convinced her that establishment science is an elite and sinister conspiracy. The greatest danger of flat-earth theory is not that it will convince a young and easily influenced mind that the earth is flat, but rather that it will initiate the young mind into a picture of the world as one that is controlled by dark forces, by powerful actors behind the scenes, rather than by political factions that we as citizens are in a position to understand and, one hopes, to influence. Flat-earth theory is a threat not primarily because it gets the physical world wrong, but rather because it misrepresents the human, social world.
To be indoctrinated into such a theory is to be cut off from an understanding of politics as the working out of differences, through agreed-upon procedures, in a neutral public space, and to accept instead a vision of politics that is modeled on guerrilla warfare, on asymmetrical combat between total enemies. This sort of indoctrination, which characterizes flat-earth theory, does not appear to be nearly as present a risk in other, comparable alternative or antiestablishment domains, such as traditional holistic therapies, or indeed creationism. One might well be initiated into an interest in botany from an initial interest in indigenous herbal medicines, for example. Or one might be initiated into learning about other cultures and their knowledge of the living world, and from there begin to read about anthropology and history. No harm here, certainly, even if one risks being cut off from the prideful confidence in the superiority of one’s own culture’s attainments that today infects so many aspects of science education.
It is less plausible, but not out of the question, that one might discover an innate interest in the life sciences during a visit to the Museum of Creation. Many naturalist thinkers have resisted what they see as Darwinian “orthodoxy.” Their resistance may appear stubborn and wrongheaded, but not necessarily spurious or completely without value. Interestingly, Vladimir Nabokov, who was on the staff at the Harvard Museum of Comparative Zoology for a time, and who discovered and gave his name to a species of butterfly, was as vehemently contemptuous of Darwinism as he was of psychoanalysis. Thus he writes in his memoir, Speak, Memory, that natural selection “could not explain the miraculous coincidence of imitative aspect and imitative behavior, nor could one appeal to the theory of ‘the struggle for life’ when a protective device was carried to a point of mimetic subtlety, exuberance, and luxury far in excess of a predator’s power of appreciation.”22
It is safe to say that Nabokov’s concerns here are not the same as Ken Ham’s, and, in turn, to assume that there is not, and never will be, a Nabokov of flat-eartherism: someone who plays a comparable role for that extreme pseudoscience to the one the Russian émigré author played for anti-Darwinism. A typical creationist, such as Ham, wants to say that nothing is nature, but all is art, or, more precisely, that nature is the artifice of a certain highly esteemed Artificer. Nabokov by contrast wants to say that art is natural, that our own mimetic activity is not an exception to what nature is doing all the time, but an instance of it. I will not help to lend legitimacy to creationism by agreeing with Nabokov here. Or, at least, I will not affirm his claim as a scientific claim. But if we view it as an opening to a general theory of art, he is perhaps onto something. Romanticism, as we saw in the previous chapter, left us with the dead-end idea that art is the product of an artist’s struggle to get something out, something unique—something that belongs to him, uniquely, as a member of that rare class of creatures, the artists. What comes out, it has been thought, is something unlike anything else in the known universe: an artwork! There is no thought here that the work might be a species of secretion whose genus is not exclusive to a small group of human beings, or even to humanity as a whole. A work of art might be the exuberance of nature, channeled through a human being. The natural mimicry Nabokov observes in lepidoptera is not the production of paintings and sculptures, but the very making of the butterfly’s body. Of course we know that insects do not literally make their own bodies, but even the most rigid Darwinists will speak as if the butterfly has taken to donning that pseudo-eye on its wing in order to scare off predators. What a fine job it has done! we think, congratulating the insect as if it were showing not itself, but its work.23
This discussion of Nabokov may seem like a digression, yet it is important in that it helps us to gain a view of the variety of motivations and philosophical commitments that might lie behind a rejection of the consensus scientific account of the origins of species and the nature of their diversity. By contrast, again, it seems almost out of the question that flat-earth theory might ever serve as a gateway to serious cosmological reflection, or that it might be underlain by any philosophical commitments worth hearing about.
We are in the course here of developing a sort of provisional classification of different varieties of pseudoscience, with the aim of understanding their political uses and the context of their adoption. This classificatory scheme may be further fleshed out by a consideration of the antivaccination movement, which for its part seems to occupy a social niche somewhat closer to flat-earth theory than to interest in holistic medicine or in questioning the Darwinian orthodoxy. It is considerably more plausible to claim that vaccines cause autism than to claim that the earth is flat, but both positions appear to be motivated not so much by the content of the relevant claims, and the evidence on which these claims are based, as by wariness of elite authority. Opposition to vaccination might emerge out of an interest in alternative medicines in general, and traditional or indigenous medicines, for complicated and problematic reasons, are in our culture conceived as “alternative.” But this opposition has a different political significance, and it is important to pay attention to this significance in assessing the theory itself, rather than simply contrasting establishment science with every species of fringe or antiestablishment science that crops up to challenge it, as Feyerabend sometimes seems to wish to do.
Is there anything that may be said in defense of the antivaccination movement? Is there any approach by which we may gain a sensitive anthropological appreciation of what is at stake for its adherents? We may begin, certainly, by noting that people in general do not appreciate having foreign biological fluids injected into their bloodstreams, and this with good reason: ordinarily, to invite such admixture is to risk disease and death, and our revulsion and avoidance are no doubt evolved survival mechanisms, rational in their own way, as all such adaptations are. Fear of vaccines is in this respect comparable to fear of insectivorous bats or of strangers walking toward us in the night.
Many members of the English working class reacted fiercely to the Compulsory Vaccination Act of 1853, resisting it, according to Nadja Durbach, as a form of political opposition to state control of individual bodies.24 At the same time, we know that the Chinese were practicing smallpox inoculation (intentional low-level infection) at least eight centuries before the significant innovations of Edward Jenner at the end of the eighteenth century, and there is some significant evidence from medical anthropology that similar practices have existed in folk-medical traditions around the world since antiquity. In the modern period, then, going back at least to Victorian England, resistance to the injection of disease agents has not been, or not only been, resistance to something new and unknown and apparently “unnatural,” but rather, also, to the top-down imposition of state power. It is, at bottom, the expression of distrust of authority, which is accentuated in periods in which government has failed to convince the masses that the ends it pursues are, as is said, “for their own good.” If government agents are in general perceived as crooks, it is not surprising that physicians working on behalf of the government are perceived as quacks.
These considerations are as relevant to the present moment in the United States as they were to nineteenth-century London. In March 2014, when Donald Trump was busy building up his profile as a political troll (having launched this phase of his career in 2011 with his contributions to the “birther” conspiracy theory, denying Barack Obama’s birth on US soil), the soon-to-be president of the United States launched the following volley on Twitter: “Healthy young child goes to doctor, gets pumped with massive shot of many vaccines, doesn’t feel good and changes—AUTISM. Many such cases!”25 The tweet is in the style of a folktale, and that is how Trump’s audience best absorbs its messages from him. We do not know who this child was; it is a generic child, a moral exemplum who need not have existed in fact in order to serve as a vehicle of some alternative truth.
But why did Trump choose at this point, even as his star was rising with birtherism and other more straightforwardly political conspiracy claims, to reach out to the anti-vaxx constituency and to express common cause with frustrated parents of toddlers showing autism symptoms—with Jenny McCarthy and other spokespeople from trash-celebrity culture who, beyond this rather narrow issue, do not seem to be particularly interested in politics? Part of the answer to this complex question is that vaccination, along with opposition to it, is far more political than it may appear on the surface. It is, to speak with Michel Foucault, a paradigm instance of biopolitics, where policy and power collide with the real, living bodies of political subjects.
According to Alain Fischer, focusing on the antivaccination movement in France over the past thirty years, there are both proximate and distal causes for the rapid decline of faith in medical authority over this time period.26 There have been too many failures of the medical system to prevent public-health crises, including, in 1991, the bombshell discovery that the Centre National de Transfusion Sanguine (National Center for Blood Transfusion) had knowingly allowed HIV-infected blood into its supply. The same year a child fell ill with Creutzfeldt-Jakob disease after following a course of growth-hormone treatment. The medical system fails sometimes, and if it fails too much, it loses public confidence. But what counts as “too much” is significantly determined by the way the mass media depict risk, and here, according to Fischer, even establishment French media, such as Le Monde, have failed miserably. Over the past decade, moreover, the new social media have helped to significantly weaken trust in the medical system by inviting everyone with an internet connection to fuel whatever doubts might already exist with reckless speculation.
Some features of the modern antivaccination movement are common across borders and languages; others are more culturally specific. As Fischer notes, there has long been fear in France that it is the aluminum used in some vaccination procedures that has been most harmful. The same element has been used in many countries, but mistrust of it, and claims as to its deleterious effects, have been limited almost entirely to France. Unlike the United States, France, notwithstanding occasional crises of contaminated blood, has a dependable national health-care system, and there is virtually no danger for a French citizen or resident of being shut out of that system because of lack of money. By contrast, in the context in which Trump was tweeting in 2014, popular confidence in the health-care system could not but be affected by the perception, and the reality, of its inaccessibility. It is difficult to have confidence in a system that erects barriers against those who would access it. Nor is it reasonable to expect that citizens who are largely shut out of the health-care system should docilely submit when informed that there is one branch of this system, the one that sees to vaccinations, which by marked contrast they have no choice but to accept. The bond of trust is so eroded by the general rule of exclusion that there is little hope of finding any trust for this single exception to the rule, where the expectation is mandatory inclusion.
The epidemiological rationale of vaccination is crowd immunity. Individuals are protected from infectious diseases not because they themselves are vaccinated, but because the majority of people around them are vaccinated. As long as the majority of the population is vaccinated, contagious diseases will be contained, and will be less likely to strike even those few individuals who are not vaccinated. Thus one’s own vaccination status is not the key element in determining whether one falls ill. One’s own health is not up to one’s own free choices, but rather depends upon the general pattern of choices, or of coercions, within the population. Such a predicament is hard to accept if the reigning political ideology is one of individualism, or at least of a sort of microcommunitarianism that refuses to recognize any common cause with neighbors within the same geographical region who look different, speak a different language, or have different values. But diseases cut across community boundaries, whether we like it or not, and in this way epidemiology reveals the limits of a political arrangement based on every individual, or family, or ethnic group, looking out only for itself. But it is precisely this sort of arrangement that was required in order for the Trump campaign to convince enough voters that he would look out for their interests as against the interests of other kinds of people. Even if Trump had not briefly wandered into anti-vaxx conspiracy-mongering in 2014, his political vision would have continued to follow the same logic as this conspiracy theory, the logic that refuses to acknowledge crowd immunity, or its political equivalent: shared responsibility among all citizens for the well-being of the polis.
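The arithmetic behind crowd (or “herd”) immunity can be made concrete with a toy calculation. The sketch below is a standard textbook approximation, not anything drawn from the text itself; the basic reproduction number `R0` and the function names are illustrative assumptions.

```python
def effective_reproduction_number(r0: float, immune_fraction: float) -> float:
    """Average number of new infections caused by one case in a
    well-mixed population where `immune_fraction` is already immune."""
    return r0 * (1.0 - immune_fraction)

def herd_immunity_threshold(r0: float) -> float:
    """Immune fraction above which each case causes fewer than one
    new case, so outbreaks shrink instead of growing."""
    return 1.0 - 1.0 / r0

# Illustrative basic reproduction number (an assumed value, not from the text).
R0 = 3.0

print(herd_immunity_threshold(R0))             # about 0.667: two-thirds must be immune
print(effective_reproduction_number(R0, 0.8))  # about 0.6, below 1: outbreaks fizzle,
                                               # protecting even the unvaccinated
print(effective_reproduction_number(R0, 0.5))  # 1.5, above 1: outbreaks grow
```

On this simple model, whether an unvaccinated individual is protected depends entirely on everyone else's coverage, which is precisely the interdependence, and the affront to individualism, that the paragraph above describes.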
Fischer identifies a rapid decline of public trust in expert authority as one of the key causes of the rise of the antivaccination movement over the past few decades. He argues that sectors of the public have retreated into “magical thinking,” as against the rational thinking of the scientific establishment. As Tom Nichols similarly observes, the most recent era seems to be characterized by “the death of the ideal of expertise,”27 and accordingly the rise of opinions on all manner of subjects, forged and valued not in spite of but because of their ignorance of and contempt for well-informed analyses of these subjects. It is, Nichols writes, “a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, students and teachers, knowers and wonderers—in other words, between those of any achievement in an area and those with none at all.”28 But even this does not sound the full depth of the problem. For one thing it is certain that Leibniz, Voltaire, and other paragons of rationalism and Enlightenment would have been delighted by Google and Wikipedia.
While the concern about the decline of expertise is in part warranted, it is complicated by certain important lessons of history. Sometimes a decline in public trust in expert authority can be salutary; it can even help to replace magical with rational thinking. This, in shorthand, is the story of what we call the “scientific revolution.” The expert authorities who occupied positions of power in institutions, and who defended the official view that, say, action at a distance may occur as a result of “sympathies” between bodies, were opposed by those who wanted to explain these actions as only apparently taking place at a distance, but in fact as being mediated by subvisible particles. There were many more details to fill out, of course, and within a few decades the theory of gravity would return, in Newton’s Principia mathematica of 1687, to restore a sort of action at a distance (it is on these grounds that even by the time of his death in 1716 Leibniz still refused to accept gravitation, considering it a mysterious and occult power). But still, those who around 1640 were rejecting the expert authority of the Aristotelians still clinging to power in universities—and who were conspiring to go and establish their own new institutions, which would become the great scientific societies and academies of the era—are considered within most historiographical frameworks to have been history’s heroes.
So clearly it is not rejection of authority that is the problem, but only rejection of authority at the wrong times and for the wrong reasons. But how can we be sure of our ability to make such distinctions? It is not enough to say that the science itself is clear and dictates to us in its own clear voice, rather than in the voice of its human representatives, what is true and what is false. For most of us do not have a handle on the science at all. We have not read even a fraction of the relevant scientific literature, nor could we read it if we tried; far less have we carried out the relevant experiments ourselves.
Like it or not, our acceptance of the official account of how infection works, and of how vaccination helps to prevent it while also not causing other problems such as autism or aluminum poisoning, is in the end a matter of trust, in people who appear to us trustworthy because we accept their claim that they have themselves performed the relevant experiments and understood the relevant literature. And this trust in turn is a commitment that is more likely to be threatened or rendered fragile by changes in the social fabric than by new empirical evidence about the scientific truth of the matter. In this respect, the emerging scientific societies of the seventeenth century might in fact reveal to us significant parallels to the websites of today that promote alternative theories of the causes of autism, or that link certain forms of cancer to the “chemtrails” (i.e., vapor trails) left behind by passing airplanes. Whether or not there are parallels—a question that might be of interest to historians and sociologists of science, and also, one hopes, to the public in general—is something that might be determined quite independently of the content of the respective theories, or of whether in the end they turned out to be true.
It is hardly a promising sign, for contemporary alternative-science movements such as the anti-vaxx constituency, that in spite of their alternative stance they consistently play up whatever modest academic credentials their proponents may have. They exaggerate their institutional clout, and they generally include “PhD” after the names of their authorities (and even the occasional “MD”), in contexts in which those working solidly within the establishment would find it undignified or unnecessary to do so. So the establishment continues to have some considerable attraction after all, and one detects already from this that the antiestablishment stance is underlain more by ressentiment than by any real expectation that the alternative movement might, by force of the truths it possesses, hope someday soon to replace the establishment. Whatever else we might say of Francis Bacon or of Descartes, in their desire to raze the old and to build up new systems of inquiry in new institutions, there is no trace of ressentiment in their work. They believed that they were going to take over the establishment, and they were right. Their difference, then, from the confused and alienated citizens who start up websites linking vaccination to autism, or hypothesizing an ice wall that holds our oceans in, may be established without any need for nonscientist opponents of pseudoscience to carry out, or even to fully understand, the science.
The Paranoid Style in the Twenty-First Century
If we think of flat-earth theory’s ascendance in the Trump era as more than a coincidence, as having blown in like an icy gust thanks to the widening of the Overton window, we will notice the way in which it echoes a broad turn to the conspiratorial in public life in America. During the Bush and Obama administrations Rush Limbaugh and Glenn Beck were the media personalities suited to provide the account of political reality that was appreciated as an alternative to the one given in the establishment liberal media preferred by coastal elites. It is the internet radio host Alex Jones (locked out of his media platforms on Facebook, Apple, and YouTube as of August 2018, in response to what the corporate governors of these services deemed to be hate speech in violation of their terms of service) who seems their most obvious descendant in the Trump era.
Unlike Limbaugh and Beck, Jones does not aim to give a coherent alternative account of reality, based on a set of presuppositions about how the world works that he and his followers may be presumed to share with followers of the mainstream media. Jones, rather, wishes to call into question many of our most basic presumptions about how social reality works, much as a flat-earther seeks to do for physical reality. Thus, for example, he has promoted an elaborate alternative account of the 2012 shooting at the Sandy Hook Elementary School in Newtown, Connecticut, according to which it was a “false-flag operation,” and the members of the victims’ families who make appearances in the media are in fact only paid “crisis actors.” This elaborate plot is interpreted as a pretext for coming to take away Americans’ guns. Jones maintains, like the flat-earthers in their view of NASA, that there are forces in the world that are not only diabolical enough, but also powerful and clever enough, to make ordinary people believe more or less anything. It is only by crossing over to the socially stigmatized, low-status, but nonetheless titillatingly “alternative” accounts being offered by self-styled outsiders such as Jones or the representatives of the flat-earth movement that one can see things as they are.
We are caught, in trying to make sense of what has been generically called online “trutherism”—which can include everything from September 11 conspiracy theories, to accounts of Sandy Hook such as that described above, to flat-earth theory—between a cautious historian’s concern not to overlook continuities with long-standing historical legacies, on the one hand, and, on the other, the need to face up honestly to the radical transformations that the internet has brought on. The Republican candidate in the 1964 presidential elections, Barry Goldwater, had an enduring interest in UFOs, and in the 1970s began pushing for the US government to release its purported secret files concerning them. It was in reference to Goldwater and his followers that the historian Richard Hofstadter wrote his groundbreaking 1964 essay, “The Paranoid Style in American Politics.”29 Americans did not need the internet in order for conspiracy theories to become a central element of national political debate. Hofstadter himself traces the genealogy of this “style” back to at least the early nineteenth century. The ground that Goldwater and others prepared was already particularly fertile for the thriving of personalities like Alex Jones, whose reach is now enhanced by the communicative superpower of the unrestricted internet.
For the creationists, the elite authorities are simply the members of the scientific establishment, promoting their own hegemonic vision of the world. For the flat-earthers, the elite authorities are a secret cabal, perhaps wealthy bankers, perhaps the same as are held to be spreading chemtrails with the aim of total global mind control. Though not in itself xenophobic or anti-Semitic, flat-earth theory does deploy tropes familiar from the conspiracy theories associated with these ideologies, and it is not at all surprising when on occasion we find them overlapping with flat-earth theory in the worldview of a single individual. In traditional creationism there was wariness of established institutions and their claims to know the truth, but there was no presumption of the power of these institutions to be able to hide the truth. The difference between these two species of alternative social movement may in the end be one of degree, but it plainly tracks the transformations that have taken place elsewhere in political life with the rise of Trump: the near-total disappearance of a shared space of common presuppositions from which we might argue through our differences, and the presumption that one’s opponents’ views are not so much wrong as diabolical.
If we were to agree with Feyerabend, then the proliferation of theories positioned as alternatives to science must count as an unqualified good, regardless of the content of these theories. Holistic medicine, numerology, proletarian genetics, flat-earth theory, creation science: all of these are more or less on a par with one another as alternatives to the hegemonic version of scientific rationality. Yet, in spite of the fact that Feyerabend himself wishes to abolish the myth of apolitical or nonideological science, he does not fully recognize that these various alternative theories may appear variously more or less propitious in different political contexts. It is not just a matter of letting a hundred flowers bloom; one must also pay attention to which sorts of flowers bloom in which soils. I have already suggested that flat-earth theory has surged in the most recent period as a sort of scientific correlate of a much broader global trend of political illiberalism, and of growing suspicion of traditional authority that now regularly crosses over into conspiracy theory. It would be hard to imagine a healthy liberal democracy in which flat-earth theory is a viable contender, among others, against the hegemony of scientific reason. We do not need to fall back on any simplistic conception, of the sort that Feyerabend abhors, of the superiority of one scientific theory over another as consisting in its superior correspondence to the way the world in fact is, in order to be confident not only that round-earth theory is better than flat-earth theory, but also that it would be better off without flat-earth theory as its competitor. Flat-earth theory is unworthy to join this contest, even as an underdog.
Is there anything at all that can be said in its favor? It is, certainly, a significant fact about the phenomenology of human life on earth that we experience it as if it were taking place on a flat surface under a dome-shaped sky. For the great majority of human history, this was not only the phenomenology of human experience, but also the standard folk-cosmological account of our place in the world. Martin Heidegger captured this primordial character of our orientation in the world in his critique of the Cartesian view of the spatiality of the world as something pregiven and obvious, and of objects and indeed our own bodies as simply placed or inserted in this pregiven spatial world. In his 1927 Being and Time, the philosopher observes that “there is never a three-dimensional multiplicity of possible positions initially given which is then filled out with objectively present things. This dimensionality of space is still veiled in the spatiality of what is at hand.”30 Thus, he explains by way of illustration, “the ‘above’ is what is ‘on the ceiling’, the ‘below’ is what is ‘on the floor’, the ‘behind’ is what is ‘at the door’. All these wheres are discovered and circumspectly interpreted on the paths and ways of everyday associations, they are not ascertained and catalogued by the observational measurement of space.”31 Heidegger’s language is obscure, but his point is profound: we do not start out with a conception of ourselves, and of our surroundings and ultimately of our planet, as inserted into some pregiven spatial expanse. Rather, we get our very concepts of spatial notions such as “above” and “below” from our deep preconceptual experiences. Above is the sky. Below is the earth. No wonder, then, that flat-earth theory is the default model of the cosmos in human history. It sufficed for the purposes of highly developed civilizations such as ancient China, which included an advanced practice of maritime navigation.
Even without knowing of the long and distinguished past of this cosmological model, we have our immediate experience, and it is humanly difficult to be told by experts that our immediate experience is not what we think it is.
We witness this difficulty again and again, across numerous examples of what Margaret Wertheim has called, in the course of her revelatory research on the subject, “outsider physics.”32 Outsider physicists do not want to be told that the basic constituents of reality are some new sort of entity that is not encountered by direct experience and can be detected only through the work of experts with their complicated, and expensive, equipment. And so they reject quarks and bosons in favor of something much more familiar, such as smoke rings. In the case of flat-earth theory, there are no alternative entities to ground the account, but only an insistence on phenomenology rather than empiricism, even if some semblance of empirical evidence in favor of the theory is scraped together ad hoc. In this, flat-earth theory ends up bearing a curious similarity to young-earth creationism, to the extent that it wishes to preserve something that is existentially dear—faith in the case of creationism, phenomenology in the case of flat-earth theory—but is not quite self-aware enough to grasp that it is this existential matter that is at issue, and not some mundane matter of fact. And so, again, it agrees to compete on the home field of science, where the rules are empiricism and valid inference, and therefore where it is fated at the outset to lose at a game for which it has signed up without having learned the rules.
Why would any outsider accept such a contest? To do so is irrational, in a much more profound sense than simply holding the wrong theory to be true. To do so is to not fully understand the nature of the thing to which one is committing oneself, mistaking a question of existential devotion for a question of fact. Here, the judgment of irrationality comes not from a disagreement over facts, but rather from a turning away from facts that are already known, or, to anticipate a notion that will be of central importance in chapter 9, facts that are known without being known.
There is, as we have been seeing in this chapter, a historically well-established tendency to reject the conception of truth as fact, in favor of a conception of truth as something internal, something felt, when it is clear that the facts are not in one’s favor. This move can have significant political implications. The George W. Bush administration’s manipulations are often said to have inaugurated a “post-truth” era. That certain claims may be morally true while empirically false is, however, an idea far older than Bush. It is in play in the lexical distinction in Russian between two different sorts of truth—pravda, which in principle must be grounded in fact, and istina, which is somehow higher than fact. This distinction was inverted by the Bolsheviks, who with no apparent irony gave the name of Pravda to the newspaper that didn’t so much report on what was the case as describe what they would have liked to be the case. A similar transcendence of the merely empirical helps to explain the reaction, in sixteenth-century Spain, to the fabrications of the Jesuit historian Jerónimo Román de la Higuera, author of the so-called Falsos cronicones, which purported to document the antiquity of the Christian faith in the Iberian Peninsula. When it was discovered that he had made it all up, that there had been no martyrs or miracles in Spain in the first few centuries after Christ, Higuera was not denounced as a fraud; instead the empirical falsity of his chronicles was taken as a sign of their power to convey a deeper truth. He had succeeded—by invention, by writing, by telling a story—in retrojecting Christianity into Spain’s distant past, which is surely a far greater accomplishment than simply relating facts.
Famously, Nietzsche called for a “transvaluation of all values.” What he had in mind was a coming era in which human beings would stop lying to themselves and one another, would be brave in the face of the truth. What less visionary and less brave followers, indeed myopic and craven followers, have preferred to do with Nietzsche’s call is instead to transform him into the prophet of a coming era of inegalitarianism, in which only the strongest survive or thrive, based explicitly on a rejection of liberty and equality, the core rational principles of Enlightenment philosophy. Much of the current disagreement about Trump among American voters has to do with which sort of character the president is: a lowly fraudster or a larger-than-life transvaluer of values. It does not have to do with whether or not he is telling the truth in a narrow empirical or factual sense. And so, frustratingly to many opponents, simply pointing out that he is speaking falsehoods can do nothing to set him back. The only principle he consistently follows is something like, as we saw in chapter 1, what the logicians call the “Principle of Explosion”: once you have allowed falsehood into your argument, you can say whatever you want.
One thing that historical perspective shows is that earlier eras have been much more subtle and profound than our own in articulating post-fact views, in particular, post-fact views that are at the same time very much committed to truth, even if it is truth grounded in unreason, such as that of Kierkegaardian faith. Instead, today’s post-fact irrationalists just make up the flimsiest lies, such as that dinosaurs and Jesus Christ walked the earth together, and pretend that they believe this, when we know they do not, and they know we know they do not. Trump says one thing, and then its opposite a few hours later, but otherwise acts as if he has the same theory of truth as everyone else. This is a ratcheting up of irrationalism to levels unprecedented in recent history.
When in 2004 a member of the Bush administration reportedly scoffed at those who continue to live in the “reality-based community,” many were alarmed.33 But this stance did have the virtue of grasping and playing on the real difference between deep commitment to bringing about a world that matches what one most values, and submitting to the world as it is because the facts require us to do so. The administration official who coined this phrase lined up with those many thinkers throughout history who have conceived truth as something that can be willed. This is debatable, of course, and we have been debating it for thousands of years. But it is a world away from the dirty conspiracy-mongering of the flat-earthers, of Alex Jones, and of those they have helped to propel into political power.