Conclusion—Final words: Towards a taxonomy of Stupid, and other wankery

We warned at the start that we’d be offering no solutions to Stupid. To make up for that, or, more correctly, deflect attention from that, we’re going to offer a biographical note. We’ve only been friends for about two years, which is a bit odd as we went to the same university at the same time and sat in the same philosophy lectures. In that long-ago world, we drank at the same bar and liked the same dreary no-wave music. Later, much later, after at least one other career each, we would work for the same media outlet and develop the same intolerance for what we saw as craven thinking and express the same unpopular absolutism where free speech is concerned. We even got pissed off at the same people and annoyed everyone with the same antisocial willingness to tell everyone, including each other, that they were very wrong. The genesis of this book was, accordingly, a shared loathing of the sloppy thinking, shameless bullshitting and ignorant, amnesiac drivel that passes for so many contributions to rational debate, a feeling we were up to our necks in a tide of Stupid that showed no sign of ebbing, and that it was time Someone Fucking Did Something.

But in truth, as much as we might like to paint ourselves as the curmudgeonly heroes of a War on Stupid, shaped by our intellectual upbringings to have no choice but to take up arms against a sea of cretins, we’re more correctly just two more, and rather minor, names in a long and—we think—honourable tradition stretching back two millennia and more. For the history of Stupid is a long one. And in tracing the tangled, matted strands of human idiocy that unite the greatest of philosophers and the shrillest of pop stars, that connect the mightiest of historical institutions with the most venal individuals and link our forebears to ourselves despite thousands of years of learning, we’ve seen how the fight against it has been a long one as well.

In many ways, it has been a successful fight: our explorations of the annals of Stupid suggest that things used to be a whole lot Stupider. Many of the most blatantly offensive forms of Stupid are now in retreat, at least in the lands of #firstworldproblems and their contiguous zones. Women are officially no longer second-class citizens; we don’t persecute and kill gay or transgender people as a matter of policy; we seek to acknowledge the impacts of imperialism on indigenous people and their prior relationship with the land if we now live on it. We live longer, healthier, wealthier lives than ever before; we place some basic restrictions on how much people can exploit one another (well, other than in the US); and many of the forms of discrimination and harassment that anyone other than an adult white male once endured as a matter of course in Western society are now illegal or considered entirely beyond the pale.

Compared to the 1960s, let alone Enlightenment Europe or the world of the Reformation, Stupid is mostly in retreat.

And yet, Stupid remains, always capable of surging back. It keeps on causing bad things to happen, it continues to cost lives, health, liberties, economic opportunity. People still die as a consequence of Stupid—their own, or someone else’s, and not just that of ordinary citizens, but of people who should know better, like their parents, or people who are paid, at least notionally, to be Not Stupid. You could take inspiration from one particular sub-branch of Stupid—dodgy economic modelling—and model the cost of Stupid to our economies, loading in everything from bad policy choices to people dying unnecessarily to lower economic growth, but it’s more than that. Stupid is pervasive—undermining our rights as citizens, infuriating us when we encounter it in the media or from some officious jobsworth, corrupting our capacity to sensibly debate public issues, alienating us from one another. It’s like the background radiation of society, always there, inescapable, the distant but permanent echo of some Big Bang of Idiocy.

You might have noticed a certain philosophical basis for this book: Stupid matters because it has consequences, bad consequences, and they flow even when people seek to do good. Few of the people in this book were or are genuinely and completely evil, but the damage inflicted by the well-intentioned or the ignorant can be just as profound as that caused by actual malice. We’re thus professedly consequentialists—although if you tried to pin us down on exactly which type of consequentialism we each adhered to we might have to sneak a quick glance at the Wikipedia entry to be sure. We think that, so long as we’re going to live together in societies, we should aim to maximise the positive consequences of the way we interact individually, socially, economically and politically, and minimise the negative consequences.

Of course, that sounds dead easy, but the trick is seeing those consequences clearly, a trick that proves beyond a surprising number of otherwise intelligent people—indeed, proves beyond all of us at some point or other. Who has such a cold, dead eye and such a forensic gaze that self-interest or ignorance or haste or emotion has never clouded their judgement? Not bloody us, that we can guarantee.

So, conscious that, as Jesus may well have said, the Stupid will always be with us, with us as individuals, as groups and as societies, we must always be on guard against it, must always be examining consequences, not merely intentions. And the first step in that process is to understand that Stupid is always driven by the same things, whether it’s in the medieval church or a Facebook group about chemtrails. The same core motivations for Stupid exist in human society now that have always existed in it. The first is obvious:

Commercial incentives

Upton Sinclair said it best: ‘It is difficult to get a man to understand something, when his salary depends on his not understanding it.’ From defence companies that benefit from hyping the threats they claim their products protect against to fossil fuel industries funding propaganda against climate science, from the medicalising for profit of innate human states to academics making a living peddling nanny-state solutions, the connection between Stupid and money is a strong one and has long been so.

Historical digression: one of the most successful rent-seekers in history was the London printing oligopoly, the Stationers’ Company, which long, and successfully, argued for strict censorship of printed material by the Tudor and Stuart regimes of England. That system of censorship also enabled the Stationers’ Company to enforce exclusive copyright and block competition for its members from the middle of the sixteenth to the end of the seventeenth centuries—in the same way that the copyright cartel of the movie and music industries now supports aggressive internet censorship and anti-privacy laws to protect its oligopoly. The Stationers argued that unfettered printing was a ‘dangerous innovation’, like a ‘field overpestered with too much stock’, and that the ‘public good of the state’ was linked to the ‘private prosperity of the Stationers’ Company’; what England needed was not printing but ‘well-ordered printing’.

The Stationers may not have been history’s first rent-seekers—religions have long understood the benefit of good relations with secular powers—but they are a splendid model for so many who have come after them. It is true that no industry lobbyist, academic or peak body would now dream of so bluntly associating the national interest with private interests; former General Motors executive Charles Wilson spent years living down his famous quote that ‘what was good for our country was good for General Motors, and vice versa’. Instead, there would be modelling produced to demonstrate the additional jobs, or higher economic growth, or lower social costs of measures that just so happened to benefit those urging the measures. But their matter-of-fact insistence that censorship was good for printing and good for England prefigures so much of the casual Stupid to which private interests and governments have subjected us for so long. When money talks, it does so in fluent Stupid.

Of similar long duration is the connection between Stupid and money’s close relative . . .

Preservation of power

As with money, power provides a strong incentive for Stupid. For millennia, it motivated institutional religions to insist they alone provided the path to salvation; individuals seeking alternatives were discouraged, then tortured, then killed, within an intellectual framework based on the need to support institutional authority rather than philosophical coherence.

The state—a relative newcomer in political philosophy, having been around less than 500 years (a length of time we don’t cavalierly dismiss, but which isn’t that long in the history of Stupid)—has long since replaced organised religion as the primary practitioner of Stupid-for-power, particularly but not only through national security laws: in many respects it replicates the intellectual framework of religions, insisting that it alone knows what is best for the safety of citizens, beyond even citizens themselves.

A possible distinction between Stupid produced for commercial purposes and that intended to support positions of power—usually state power—is that lobbyists and economists working for commercial interests often know perfectly well that their case is nonsense, but they are paid to argue it, and thus bring a certain professional commitment—‘Hello, Sam.’ ‘Hello, Ralph.’—to delivering Stupid. But members of state institutions are much more likely to believe the Stupid they utter, having convinced themselves that they are a critical bulwark against the threats they relentlessly hype. This explains why national security officials eventually come to see virtually everything outside themselves as a security threat—why the National Security Agency described anyone using internet cryptography (which is anyone who does online banking or shopping, for starters) as ‘adversaries’; why the head of the Australian Security Intelligence Organisation complained that the internet allowed ‘individuals to propagate and absorb unfettered ideas . . . literally, in their lounge rooms.’

Leaving aside the terror of unfettered ideas roaming the nation’s lounge rooms thanks to the internet (lounge rooms, by god, where you’d assume families were safe!), not all Stupid is self-interested. Stupid can also advance when we are . . .

Lacking the weapons to combat Stupid

It requires basic skills to combat Stupid, and sometimes new forms of Stupid, or new delivery mechanisms for it, demand new skills. The arrival of printing was not merely immensely disruptive to existing Stupid-based models of commerce and power in the middle of the last millennium, but disruptive to existing analytical techniques as well.

Here, for once, the frequent comparison of the impacts of printing and the internet is justified—just as the internet introduced us both to vast amounts of easily accessible information and to vast amounts of complete garbage (sometimes the same thing), so too did printing usher in access both to the great works of Western and Arab philosophy and science and to huge amounts of rubbish. However, as Richard Abel details in his excellent The Gutenberg Revolution: A History of Print Culture, Western scholars lacked the tools to tell the difference between fictions like alchemy, astrology, magic, much of Aristotelian ‘science’ or ancient mysticism and more rigorously empirical content. That is, they lacked the tools to identify Stupid.

Identifying Stupid took the work of new thinkers, like Paracelsus in medicine and Bruno in cosmology (burned to death by the Catholics for his troubles), to start the process of falsifying much of the material suddenly far more widely available in Europe. Intellectually, the world owes such figures an enormous debt. For those of us worried about Stupid, they are remarkable, unsung heroes, pioneers who, to borrow an imperialist metaphor, ventured into a New World of Stupid and began trying to tame it in a way that none of us can begin to imagine. The whole idea of falsification more or less had to be invented by them, just as a primitive kind of peer review was being invented. Before printing, there was intellectual debate, of course, but the scholastic tradition of the high Middle Ages was much more intensively an oral culture than that which followed, and not merely because of the dearth of books: the acts of both reading and writing were strongly oral in nature, particularly at those newfangled ‘universities’ that began spreading across Europe after the turn of the millennium.

Moreover, in the scholastic tradition, much analysis of new ideas was really about preservation of positions of power—focused, for example, on assessing their complementarity with the Church’s power (meaning, among other things, that scholars backed by a strong monastic order or secular ruler had more freedom than those who weren’t). But after printing, the size of the audience for new ideas massively increased, and, as we discussed earlier, ideas could actually be transmitted with relative accuracy rather than relying on monks acting as imperfect human photocopiers. Responses to new ideas could be circulated to a large number of readers within a (in historical terms) reasonably short time frame. The wisdom of crowds might be an overhyped phenomenon (for which evidence, Read The Comments), but it was borne out in humanism and the beginnings of science as Western minds began acquiring the basics of bullshit detection.

Since then, the West has had 600 years to develop a whole framework of critical analysis, aided by the professionalisation of science and universal education in the twentieth century. Unlike our fifteenth-century ancestors, we don’t have the excuse of lacking the tools to combat Stupid. The land has been tamed; the log cabins have given way to luxury apartments; we have a vast array of anti-Stupid tools, but we keep finding reasons not to use them.

Now, true, there are aspects of internet-derived Stupid that make combating it more difficult. The internet is a far more rapid delivery mechanism of Stupid than printing ever was. Online, a lie can circumnavigate the world several thousand times and flog you a wristband while the truth is still looking up Wikipedia. And true, things can ‘go viral’, in that term beloved of marketing types, spreading instantly in a raging storm of retweets and likes, although books often ‘went viral’ back in the day too—it’s not so long since The Secret shifted tens of millions of units to the intellectually feeble, emotionally crippled and financially desperate via Oprah.

Compared to print, the internet demands certain skills more than others: we no longer need to remember as much, as long as we can recall how to reach important information (in the same way, though to a lesser degree, that writing and later printing ended the centrality of memory in oral culture), but we need to be more sceptical, because ‘authority’ is easier to fake online, whether it’s a Photoshopped picture, an invented quote or the life that facts can take on once unchained from their context and source and left free to float about, unfettered, in our lounge rooms.

The internet also strengthens another key motivation for Stupid . . .

Tribality and groupthink

. . . because it strengthens connectedness and enables us to link up with the communities with which we most identify. A sense of tribality drives the ‘piling-on’ effect of social media, in which online groups come to resemble pitchfork-wielding hordes or lynch mobs pursuing someone deemed to have egregiously offended the group. Criticising online witch-hunts is de rigueur these days; it’s forgotten that they frequently befall those entirely deserving of such a dire fate. But they can also overtake those guilty of, at worst, clumsy expression, with careers and job prospects ruined over some type of –ism more perceived than real.

But that such behaviour is, in the West, now confined to the online world rather than the real one marks one identifiable area of Stupid where history demonstrates significant improvement. Whether it was Christians slaughtering Jews in medieval Europe, or burning witches in early modern England, or lynching African Americans in the US or murdering gays in Australia, the long tradition of tribal violence in the West has receded, and people are alive because of it who otherwise wouldn’t be.

Even so, a related component of tribality continues to feature strongly in public debate, something we spent some time on in our Introduction . . .

Ad hominem

In the immortal words of The Onion, stereotypes save time—a principle almost all of us, no matter how intellectually rigorous, have employed at some point, in particular in dismissing the argument of someone because of who they are or whom they represent, rather than properly engaging with it.

It’s worth going through this slowly because it takes us somewhere close to the final point we want to make. The process of rigorous engagement with the substance of what your interlocutor actually says—or ‘listening’ as scientists call it—can be difficult. Most of us link our arguments to our egos: to admit that someone else’s argument, one that we have been aggressively challenging, has merit is a wounding blow to our pride; to acknowledge our errors is akin to gouging out our own eyes—especially when it relates to what we do for a living. When our interlocutor is someone whom we dislike or whose motives we suspect, our dismissiveness is redoubled. Admitting they are right is a Gethsemanean agony and you’d rather—to hopelessly mix up the metaphor—lop off an ear than do it.

So, yes, unfortunately, rigour can be profoundly annoying. Worse, just because someone is biased doesn’t mean their arguments can be automatically dismissed. The industry lobbyist touting modelling, the fossil fuel-funded think tank getting press coverage for its new ‘research’, the be-costumed cleric decrying some social innovation, all indeed would say that, but merely pointing that out doesn’t necessarily negate the requirement to demonstrate the dearth of evidence, the failure of logic, in their case.

This is not an error that is confined to the intellectual hoi polloi—all of us can do it. In early 2014, eminent Princeton historian Sean Wilentz insisted that true liberals (of whom, as a close friend of the Clintons, Wilentz is a kind of éminence grease) shouldn’t support journalist Glenn Greenwald, publisher Julian Assange and whistleblower Edward Snowden in their efforts to bring transparency to the national security state, because each had, at some point, expressed libertarian views. That is, anything they say should be immediately dismissed because they fail to meet some Wilentz-designed test of How to Spot an East Coast Liberal.*

From this point of view, the quality of public debate, alas, has not been helped by the rise of by-lines in news reportage and commentary. Historically, journalists and commentators either used pseudonyms or, with the rise of newspapers, were anonymous—newspapers reported the news and analysed it with a kind of voice-of-God perspective that these days is reserved for editorials, which newspaper editors still like to think should be handed down carved in stone from the nearest mountain. But over the course of the twentieth century by-lines became ubiquitous, more quickly in some newspapers than others, and for different reasons in different countries (by-lined journalism, for example, is easier for governments and corporations to target for retribution). The result is more epistemologically sound journalism—it is clearly one person’s, or several people’s, view of events rather than an account claiming complete objectivity. By-lines allow a reader to understand, at least in part, where the information or analysis or comment is coming from.

But it has also driven the rise of celebrity journalism and means journalists and commentators are easier to pigeonhole, given readers can rapidly become familiar with their work even if they don’t regularly read them. In that environment, ad hominem analysis of media content becomes routine.

However, the problem of ad hominem thinking points us towards the nearest thing you’ll get to a lesson from this book.

Where to from here, or, Fuck off, you’re on your own

Having devoted considerable length to tracing the intellectual roots from which the great flowering weed of Stupid has grown over thousands of years, it’s natural to ask what can be done to fight Stupid: what measures will arrest its progress where it’s still on the march and expedite its reversal where it’s already retreating; where we can draw a line in the sand, take a final stand, and say, ‘This far, and no further’.

Alas, we’ve got nothing. There are no magic solutions to Stupid, other than the accounts we’ve provided in this book. It is a fight that has already taken millennia. For over 2500 years, people in Western societies have been wrestling with Stupid, locked in Mortal Kombat with it, trying to establish intellectually rigorous frameworks for knowledge, grappling with the most basic questions of epistemology and phenomenology, developing tools for thinking logically and assessing evidence. In the West, we’ve been able to access the best thinkers from other cultures: the rich tradition of Arab scholarship and philosophy offset the intellectual doldrums that persisted in the West until after the Carolingian Renaissance; we’ve absorbed Jewish philosophy and thought even as we launched pogroms against Jews; the explorations of a more confident early modern Europe brought contact with the remarkable cultures of India and the Far East and their rich intellectual histories; imperialism, for all its genocidal and exploitative heritage, eventually permitted the filtering back of unique indigenous perspectives and thought to the West.

All of that, and yet we still click on Nigerian scam emails, fail to vaccinate our kids and read celebrity news.

Combating Stupid has been the task of some of history’s greatest minds: the Greek philosophers who first gave serious thought to the gap between what we thought we knew, and the world itself—if, for that matter, there was a world itself. And then it took over twenty centuries before the European descendants of those first philosophers began wondering not just whether there was a world itself, but about the language we were using to describe the world, and the extent to which epistemology was also about language and, never mind the world, what did using a word like ‘world’ actually mean?

Others shared the burden: Greek and Roman philosophers who first gave thought to how exactly societies should be governed; the Catholic monks in western Europe who slowly lost their eyesight and endured haemorrhoid hell transcribing key patristic works; the Orthodox Church in the eastern Roman Empire that kept safe some of Western philosophy’s most important books all the way until the fifteenth century; the scholars of the first renaissances in the eighth and twelfth centuries (as Woody Allen might say, the early, funny renaissances) and their descendants in the fifteenth century in art, science and politics; the first humanists, who began treating the Bible as a work of literature to be investigated in its historical context; the first Reformers, who puzzled over why institutional religion differed so markedly from the prescriptions of the Gospel; the early advocates for natural rights like Spinoza and Locke; the people put to death for believing in the wrong religious nuance, or in no religion at all; the English parliamentarians who fought their own monarch and correctly charged him with levying war on his own people, and their children who resisted that monarch’s son and sent him packing as well.

The Enlightenments were a culmination of all of these traditions and more besides. It needs to be repeated that the Enlightenments were almost entirely elite phenomena of middle- and upper-class western Europeans, and mainly men. But what mattered was the emphasis on reason at the heart of the Enlightenment projects, and the inevitable consequences of its application, rather than the composition of their advocates.

Many of the more conservative Enlightenment figures—most notably Voltaire—wouldn’t accept those inevitable consequences. Voltaire’s target was, above all, the Catholic Church, rather than fundamental political reform. For more radical philosophes—most prominently Diderot—anti-clericalism eventually became just one of many facets of their work, because the consistent and full application of reason and the emphasis on individual choice led on to other targets—the repression of women; the exploitation of colonies; the power of aristocrats or any system of government that wasn’t democratic.

The radical philosophes weren’t always consistent—for example, they were often disgustingly anti-Semitic—but they went much further along the road down which Reason took them than did their more conservative opponents and colleagues. That’s the crucial lesson from the Enlightenments: accepting where Reason takes you, rather than allowing money or power or the source of an idea to derail the journey. And the long list of men and women who have embarked on that journey—we’re just two minor names on its most recent page—is as close as you’re going to get to a solution to Stupid. The nineteenth-century philosopher, scientist, semiotician, mathematician and more Charles Sanders Peirce wrote:

Different minds may set out with the most antagonistic views, but the progress of investigation carries them by a force outside of themselves to one and the same conclusion. This activity of thought by which we are carried, not where we wish, but to a foreordained goal, is like the operation of destiny. No modification of the point of view taken, no selection of other facts for study, no natural bent of mind even, can enable a man to escape the predestinate opinion. This great law is embodied in the conception of truth and reality. The opinion which is fated to be ultimately agreed to by all who investigate, is what we mean by the truth, and the object represented in this opinion is the real. That is the way I would explain reality.

Peirce is also known as the father of Pragmatism, a much-misunderstood approach to philosophy that rejected the radical scepticism of much of Western philosophy, especially the Cartesian tradition, in favour of a more practical approach to knowledge and truth. It emphasised reason as an instrument for dealing with perceived reality and solving its problems.

Pragmatists like Peirce, William James (brother of Henry) and John Dewey rejected the Cartesian idea that everything must be doubted and knowledge painstakingly built up using only perfectly verifiable blocks (the first of which, for Descartes, was his own existence). Pragmatists instead suggested that doubt is only relevant when it has a real-world significance, and that an understanding of reality can be developed using scientific method and utility (hence the misinterpretation that Pragmatism is about believing whatever serves your purposes). This approach is a fallibilist one, accepting that our understanding of what is true may change, but that what works for now will serve until then.

And in particular, in contrast to the individualist approach of radical sceptics, who won’t accept that anyone other than themselves even exists until it has been rigorously established, Pragmatists (especially Peirce and Dewey) emphasised the collaborative, community aspect of the search for truth. ‘The very origin of the conception of reality shows that this conception essentially involves the notion of a community,’ Peirce said. For Dewey, the concept of shared inquiry was fundamental, and informed his groundbreaking work on education reform in the US. For Pragmatists, the search for truth was an exchange of views within an intellectual community, perhaps using different techniques and operating from different perspectives, but sharing a respect for scientific method and for what could be demonstrated to be consistent with reality.

Pragmatism has had mixed fortunes since the late nineteenth century, but re-emerged via the likes of Richard Rorty nearly a century later. Even if Pragmatism doesn’t float your particular epistemological boat, there is much to like about its rejection of radical scepticism and, by implication, other dismissals of external truths: for the Pragmatist, there is truth, and it can exist both within and outside the text, and the search for it is a collaborative one. It’s not that sexy a philosophy, really: it lacks the relativism that so entrances first-year philosophy students delighted to discover that nothing is real or true, it has none of the Gallic panache of deconstruction, the Left Bank cool of existentialism or the high Teutonic rigour of classical German philosophy; it’s just the product of some rather doughty New England and New York part-time philosophers, focused on the practicality of establishing working solutions to problems.

But it carries within it both the reason why Stupid still plagues us and the means by which we can continue to fight it, like so many generations of grumps, curmudgeons and annoying sceptics before us. In societies in which old forms of Stupid live on while new ones are being constantly created, it can serve as both a methodological approach and maybe even a rallying cry: that we be carried not where we wish, but to a goal foreordained by the rigour of our thought.


*     Of course, applying that logic to Wilentz himself, no progressive should heed anything he has to say because his Rise of American Democracy is not far short of propaganda for the genocidal slave-owner Andrew Jackson (Daniel Walker Howe’s What Hath God Wrought is far superior, in any case).