4

‘Nudge them all—God will know his own’: Soft, hard and extreme paternalism

. . . it would be possible to create a national smart card system . . . Using data from the card system, a sliding scale of taxes could be introduced . . . The more alcohol you purchase, in any form, at any time within the statement period, the higher tax you pay . . . On a night out, drinks would become progressively more expensive. Loading up on alcohol before you go out wouldn’t help, as the system would take into account the takeaway purchases you’d made earlier.

—Dan O’Keeffe, The Conversation, March 2013

For a country with a reputation as a bunch of boozers, Australia now has a strange attitude to alcohol. Official data shows that our per capita consumption has fallen by a third since 1975, and is now below, and often well below, that of most European countries. In 2013, levels of daily drinking were at their lowest since at least 1991, including a big drop since 2010; the number of Australians who don’t drink at all rose by more than half between 1991 and 2013 and is now at the highest level ever recorded. Binge drinking by young people has fallen dramatically, and binge drinking by women is down too.

You’d think, on the strength of those outcomes, public health types would be well pleased. Not at all. Indeed, quite the opposite: Australia is in the midst of an anti-alcohol crusade that constantly warns of an ‘epidemic’ of alcohol abuse with massive ‘social costs’ that needs to be curbed by more regulation, more surveillance of consumers and price rises. Some of the highlights of this War on Alcohol include:

•   The Australian Medical Association proposed raising the legal drinking age to twenty-five because neuroscience suggested that was when brain development halted.

•   In 2008, the federal government pledged to end the ‘epidemic’ of binge drinking among young people, despite evidence that the incidence of binge drinking had been falling for a long time.

•   A public health body called for alcohol consumption to be banned on school grounds because drinking at fetes or barbecues ‘undermines the alcohol education programs for young people in schools’.

•   Public health bodies now regularly warn about ‘pre-loading’, a sinister term they have developed to describe drinking alcohol at home prior to going out, which ‘is causing alcohol-related crime, violence, hospitalisation, assault and death’ and must be curbed by alcohol price rises.

•   A government department proposed to force employers to discourage alcohol consumption on the basis that ‘in some work settings, workers who do not normally drink in their own leisure time may find it expected of them by their colleagues or workplace’.

However, the term ‘War on Alcohol’ may well be too narrow: other perceived sins are targeted as well. Various Australian academics, politicians and campaigners have also called for bans on and censorship of social media, bans on online apps, bans on supermarkets selling pain relief, bans on advertising of junk food, bans on soft drink and high-sugar products, bans on clothing that ‘sexualises children’, drug tests for everyone in the country using opioid pain relief, licensing of smokers and, as we saw at the start of this chapter, licensing and surveillance of drinkers to track their alcohol consumption.

This last idea has particular appeal to public health lobbyists because of its extendability: once in place, a universal monitoring system could be used to track and deter whatever is the subject of the most recent moral panic: the consumption of junk food and soft drinks, sugar, television, pharmaceuticals, video games, ringtones, pornography, hoodies and whatever music form or artist enjoyed by our feckless youth is currently considered unacceptably corrupting (we’ll return to that).
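To see how little machinery such a scheme would actually require, here is a minimal sketch in Python of the sliding-scale tax quoted at the start of this chapter. Every threshold, rate and figure in it is invented purely for illustration; none of it comes from O’Keeffe’s piece or any real proposal.

```python
# Hypothetical sketch of the sliding-scale alcohol tax quoted in the epigraph:
# the more standard drinks already logged against a smart card in the current
# statement period, the higher the tax rate applied to the next purchase.
# All thresholds, rates and figures below are invented for illustration.

BRACKETS = [      # (standard drinks already logged this period, tax rate)
    (0, 0.05),
    (20, 0.15),
    (50, 0.30),
    (100, 0.50),
]


def tax_rate(drinks_so_far: float) -> float:
    """Return the tax rate applying to the card holder's next purchase."""
    rate = BRACKETS[0][1]
    for threshold, bracket_rate in BRACKETS:
        if drinks_so_far >= threshold:
            rate = bracket_rate
    return rate


def price_with_tax(base_price: float, drinks_so_far: float) -> float:
    """Price of the next purchase, given the card's running total."""
    return base_price * (1 + tax_rate(drinks_so_far))


# A night out: the card already holds 18 standard drinks of take-away
# purchases from earlier in the statement period; once the running total
# crosses a bracket threshold, every subsequent round attracts the higher rate.
running_total = 18.0
for base_price, drinks in [(40.0, 6), (12.0, 2), (12.0, 2)]:
    print(round(price_with_tax(base_price, running_total), 2))
    running_total += drinks
```

The point is not the arithmetic but the brevity: once a card links identity to consumption, extending the same few lines to sugar, junk food or whatever the next panic targets is simply a matter of adding brackets.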

Based on the proposals routinely floated by public health lobbyists, you might think Australia faced a major health crisis requiring urgent action. In fact, according to the World Health Organization, non-indigenous Australians are the equal fourth longest-lived people in the world (indigenous health outcomes are a very different matter). That’s despite Australia spending far less of its GDP on health than many other developed countries, despite its apparently shocking alcohol consumption and despite its ‘obesogenic society’.

Public health groups look to bridge this reality gap between the rude good health of non-indigenous Australians and their own hysterical claims about alcohol consumption and diet by emphasising what Australians think about everyone else’s lifestyle choices. Polling from public health bodies now regularly shows Australians reporting they themselves are drinking less, but they are more and more worried about how much everyone else is drinking—unsurprising, given they are constantly bombarded with claims about ‘epidemics’ of alcohol and obesity.

Paternalism in theory and practice

What drives these health-motivated interventions in Australia is the same thing that has driven many other forms of state-sponsored intervention in people’s lives—paternalism: the conviction that you know what is best for others, and that that knowledge gives you the right to regulate and control others’ behaviour to make their lives better, whether they want you to or not. It is one of history’s most pervasive and damaging forms of Stupid.

There are numerous kinds of paternalism; the most useful for our purposes are those described as soft and hard paternalism. The soft–hard distinction can be drawn in two ways. One relates to how far a paternalist will go to interfere with someone else. If I decide to play Russian roulette with a semi-automatic pistol, not realising it works differently from a revolver, a soft paternalist would intervene to stop me and make sure I was fully informed about the basics of firearms before letting me proceed; a hard paternalist would seize the weapon, or have me committed to a psych ward, because I have no right to take my own life.

But the more common soft–hard paternalism distinction is between methods: soft paternalism seeks to influence or ‘nudge’ its target’s decision-making, but stops short of outright prohibition, which is reserved for hard paternalists, who simply prefer to ban things they don’t like. Soft paternalism is sometimes given the oxymoronic label ‘libertarian paternalism’, the sort of term likely to infuriate both nanny-state types and rugged individualists.

The case against paternalism has been mounted by a succession of philosophers, starting with Locke (Spinoza as well, although less directly), followed by the likes of Kant, who argued that paternalism is innately hostile to the concept of human equality, and particularly John Stuart Mill, who formulated the classic argument against paternalism:

. . . [T]he only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant. He cannot rightfully be compelled to do or forbear because it will be better for him to do so, because it will make him happier, because in the opinion of others, to do so would be wise, or even right . . . The only part of the conduct of anyone, for which he is amenable to society, is that which concerns others. In the part which merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.

Mill’s most acute point was that individuals are inevitably the best judges of their own interests; there is no other party with more or even the same knowledge about what is best for that individual. Moreover, no other party, and particularly no government, shares the exact value system and priorities of an individual, however closely affiliated to them they may be. In Mill’s famous examples, intervention in others’ decisions about themselves was only justified in rare cases: a person could be prevented from selling himself into slavery, as he would ‘defeat his own case for liberty’, or a pedestrian should be warned of an unsafe bridge—even if that necessitated force to ensure they were aware of it—but once they were aware of the danger, left to make their own decision about whether to cross or not.

Some arguments in favour of paternalism do stand up better than others. In particular, the argument that we are, in effect, multiple selves, and can make decisions that may be costly to our future selves, is a solid one, particularly if one’s current self is making decisions based on insufficient information—information that would be available to a future self—or one is temporarily cognitively impaired. And as we see repeatedly elsewhere in this book, humans are pretty poor at making rational, evidence-based decisions. This reinforces the ‘multiple selves’ argument: over time, as a group of our multiple selves—like those episodes of Doctor Who in which different Doctors join forces—we might reach sensible decisions, but individual decisions are likely to be affected by a host of the sort of problems we identify in this book.

Against the multiple-selves argument, however, is the response that it is exactly our current decisions, for better and for worse, that shape our future selves; that we need the freedom to make poor decisions and mistakes in order to become wiser, and that how we use our freedom is fundamental to how we develop as humans. A decision to undertake an activity that’s risky either to our health or, perhaps, to our future income, like humanitarian work abroad, political activism or climbing a mountain, may in fact significantly change and improve us as people even if these activities are, from a risk-averse point of view, ill-advised. Life, even the most anodyne life, must contain some risk, and how we manage and assess risk is one of the most important aspects of our characters and how we live.

That said, there are some forms of paternalism that all but the most hard-hearted libertarian would surely endorse. Prohibiting slavery, even voluntary slavery, is a straightforward case. So too depriving a drunk of their car keys. But the problem with those examples is that, once granted, the same logic can be extended to preventing other acts that might permanently reduce one’s personal autonomy, such as suicide or the use of dangerous drugs—restrictions with which a great many people would have a problem.

Such shades-of-grey individual scenarios, however, are more games for philosophy students than practical guides. In the real world, paternalism is considerably wider, and Stupider, than people looking to have themselves enslaved or pedestrians wandering towards dodgy bridges. Australians live in a society riddled with paternalism. Like other Western countries, we have drug laws designed to minimise the personal harm from consuming some chemicals and plants, a form of Stupid that inflicts far more damage on society than that caused by their consumption. Drug laws restrain personal liberty, inflict a massive economic cost from law enforcement and criminal justice, and create a violent and destructive criminal culture. And it certainly doesn’t stop at drugs: we have a range of limitations on gambling which are in effect competition restrictions maximising profits for approved operators. We retain a censorship system to stop adults from viewing materials deemed harmful to them. We have laws intended to prevent terminally ill people from receiving advice and assistance on euthanasia options. We have a costly consumption tax exemption for fresh food, designed to encourage low-income earners to eat more healthily.

Australia’s paternalism, like that of other countries, is also remarkably inconsistent—we ban certain drugs but allow others that produce greater harms, for instance. And we’re bizarrely unpaternalistic about interfering in people’s freedom posthumously: we allow people with certain illnesses to suffer and die because others are allowed to retain their healthy organs after death. Indeed, in Australia families can override the wishes of the dead who have indicated their desire to donate their organs—where’s a little paternalism when organ donors need it?

As with other forms of Stupid, there are also hierarchies of paternalism, which—not coincidentally—closely resemble power structures within society. It is only a matter of years since Australian states had ‘hard paternalism’ based on sexual preference, while a variety of financially discriminatory practices against gay couples were only removed more recently. Our welfare policies are structured so that middle-income earners are given generous transfer payments by governments without even having to fill out forms to claim them, whereas low-income earners are subjected to schemes such as Work for the Dole and stringent reporting requirements, and Aboriginal Australians are subjected to income management.

We also have a gender-based hierarchy of paternalism: men are subjected to less intervention than women, who are forced to endure both hard and soft forms of paternalism in relation to their bodies. In particular, women are constantly pressured to moderate their behaviour and consumption of drugs because they have uteruses. Such paternalism infantilises women: in 2013, an Australian public health body ran a campaign to encourage men not to drink while their partners are pregnant, as if women are easily influenced into unhealthy behaviours by partners. And this gender-based paternalism hierarchy is at odds with the high correlation between having a penis and poor outcomes, like dying younger, a greater likelihood of becoming a victim or perpetrator of violence, and inflicting costs on society via the criminal justice system.

Paternalism is also, as the word suggests, particularly directed at young people. Despite falling levels of alcohol consumption, less violent crime and less binge drinking, the young men of Australia are the object of perennial lamentation about their out-of-control alcohol consumption (and consumption of *insert current object of media drug panic here*) fuelling an epidemic of violence, while young women are portrayed as binge drinkers constantly placing themselves at risk of sexual exploitation, or worse. This, of course, is an eternal cycle that each generation is doomed to repeat as the reckless, drug-abusing youth of today become the concerned parents of tomorrow, poised to criticise their offspring for the sins they themselves committed to a worse degree.

Discussions of paternalism are most commonly complicated by the fact that many ostensibly paternalistic actions by the state prevent an individual not merely from damaging themselves, but also from inflicting economic or social costs on others or the rest of the community, something which a state is entirely justified in preventing. Laws against drunk-driving, and police enforcement of them (for example, via random breath-testing) are now widely accepted, not merely because they reduce the numbers of drink-drivers who kill and injure themselves, but because they reduce the number of other motorists and pedestrians killed by drunks.

We also have laws requiring the wearing of seatbelts by vehicle occupants and helmets by bicycle riders. These are ostensibly aimed only at individuals, but in fact they reduce the broader costs of individuals’ poor decision-making: if a driver decides not to wear a seatbelt or a cyclist prefers not to wear a helmet, the healthcare costs of a subsequent accident will be greater than if they had been better protected. Outside of a strict user-pays healthcare system, this means those additional costs will be borne by the rest of the community—so seatbelt and helmet laws are justified purely from an economic standpoint and on the basis that individuals don’t have a right to inflict costs on the rest of their communities.

In a RonPaulistan libertarian utopia, the sort of place where men are men and bureaucrats are nervous, there could conceivably be a system in which you may elect not to wear a seatbelt, but in doing so agree to pay all healthcare costs beyond those that would have accrued if you had been wearing one. But such a system would be an administrative nightmare, and in any event what civilised society (well, outside the United States) would leave, say, a brain-injured non-seatbelt wearer who could not afford rehabilitation to rot? It’s also hard to see how to apply such an arrangement when individual decisions contribute to a systemic cost, like higher crime and suicide levels across a whole society because of widespread gun ownership. Firearms have a form of network effect in which, no matter how safely and responsibly an individual weapon is used and stored, the greater the number of firearms in a community, the greater the number of firearm crimes and deaths.

Australia also has a compulsory superannuation system that forces all workers to save for their retirement. Unlike other forms of paternalism, there are significant medium- and long-term economic benefits from the large national savings pool generated by compulsory superannuation: Australia’s one and a half trillion dollar-plus superannuation pool, for example, was an important factor in mitigating the effects of the global financial crisis on Australian financial institutions. Most particularly, compulsion (as opposed to the generous tax incentives that are also intended to encourage retirement saving) will provide a significant saving for future budgets as the population ages and there is less reliance on aged pensions than would otherwise be the case.

Another example is tobacco excise. The mere use of tobacco causes health problems, and unlike alcohol, which can be consumed in safe and indeed healthful doses, it has no offsetting health benefits to weigh against the range of illnesses it inflicts. Tobacco users therefore inflict greater costs on the healthcare system than they otherwise would. It is thus not paternalism to charge smokers a tax on tobacco sufficient to cover the significant extra costs they impose on the health system (as happens in Australia); nor is it paternalism to make smokers puff away from anyone who may breathe in second-hand smoke, especially children. But any additional tax beyond that level, or restrictions such as curbing advertising, retailing and packaging, are mere revenue-raising and a form of paternalism, imposed purely because society believes it can make a better decision about tobacco consumption than individuals.

Working out how far this argument can be applied, however, can be difficult, because calculating the net social impact of behaviour is complex. Opponents of compulsory bicycle helmets, for example, argue that requiring helmets reduces the incidence of cycling, and thereby reduces the overall health of the population, a cost that may be sufficient to offset the benefits of reduced head trauma. Such calculations of the social costs of an activity targeted by paternalists form an increasing part of campaigns to regulate certain behaviours, because policymakers are more likely to accept the need to override individual decisions on the basis that economic welfare will thereby be improved than if paternalists simply argue that they don’t like particular activities.

What never features in estimates of costs is the harm from interfering in people’s rights—that is, the social costs of Stupid. Merely because the infringement of individual rights is a nebulous kind of wrong, one hard to pin down or adequately cost, doesn’t mean there aren’t real-world consequences. Soft paternalism in time can lead to hard paternalism, as has happened with tobacco, which is now heavily restricted and which could plausibly be banned once consumption rates drop into single digits; the demonisation of alcohol and junk food by public health lobbyists has the same goal of creating a climate for ever greater restrictions. And as the array of proposals put forward by public health lobbyists suggests, they view surveillance and infringement of privacy as a small price to pay for the perceived benefits of imposing their own priorities on people.

History’s perennial paternalists

This connection between the tools of governmental control—surveillance and curbing of basic rights like privacy—and paternalism is no accident. The longer history of paternalism shows that it is fundamentally a tool of social and political control as much as an expression of communal concern for the individuals it targets.

For example, religious persecution is mostly a form of paternalism. That’s not to dismiss the role that other motivations, such as old-fashioned bigotry, play. But religious persecution in Western cultures has been persistently justified by the conviction that a heretic or non-believer was in danger not of poor health outcomes but of disastrous spiritual outcomes, as they faced eternal damnation because of their views. If you actually believe those sorts of superstitions, religious persecution is entirely logical. Forget John Stuart Mill’s example in which a damaged bridge risks a pedestrian’s life—one’s physical existence is nothing compared to an eternity of hellfire, and anything in this world is justified if it saves your soul in the next.

As with smoking or gambling or other modern sins, the role of external agencies is also important in religious paternalism—Satan, like tobacco companies or advertising agencies, was said to possess remarkable powers of manipulation and persuasion that further justified taking action to prevent his misleading weak human minds (noting that paternalists naturally possess superhuman powers of resistance to the wiles of Satan and marketing companies).*

Moreover, such people risk leading others to damnation as well as themselves; that is, there was believed to be a spiritual form of social cost in allowing heretics to communicate with others. It’s hard to say what those social costs of heresy would be without the appropriate economic modelling, but they are probably $∞, given Hell is forever, which, even using Net Present Value, is an awfully long time.

By such logic, torturing heretics is a mere nudge in the right direction; outright killing, a kind of spiritual public health measure, a religious quarantine. Killing an unrepentant heretic was unfortunate, as it would dispatch them to Hell, but better that than their taking others to Hell with them. See the logic?

Now, it’s incorrect to suggest the Christians invented the killing-heretics-as-spiritual-sanitation form of Stupid; recorded history gives that honour to the Greeks, although one imagines the invention of religious persecution was contemporaneous with the invention of religion. But it was the Greeks from whose philosophical traditions the Christians took so much. Socrates was condemned to death by the Athenians for both impiety and ‘corrupting the minds of Athenian youth’—the first recorded use of what would become a favoured paternalist justification, protecting the kids. On the other hand, Roman persecution of Christians appears to have been motivated mostly by reasons of state: Christianity was, unlike Judaism, a new, non-traditional superstition, and adherents of other superstitions like Jews and pagans strongly resented them, threatening the pax Romana.

But institutionalised persecution didn’t receive a full-blown treatment until Christians were able to take over the Roman state apparatus and use it themselves. Even key Christian thinkers who still get good press centuries later, like St Augustine, were enthusiasts for the spiritual equivalent of hard paternalism. ‘It is wonderful how he who entered the service of the gospel in the first instance under the compulsion of bodily punishment, afterwards labored more in the gospel than all they who were called by word only,’ Augustine declared in the fifth century (and don’t you love his New York Times–like euphemism for torture?).

But Augustine’s ‘Lord, persecute me, but not yet’ approach was too timid for some, such as thirteenth-century cleric Arnaud Amalric. When faced with the vexing problem of sorting out Catholics from Cathars during a Crusade in 1209 (for those playing at home, the primary difference is believing Satan was an evil version of God), Amalric declared, ‘Kill them all, God will know his own,’ a mentality that suggests, had he been born several centuries later, he might have become one of those serial killers the media ends up inventing an exotic name for, or, at least, a senior Bush Administration official.

It was in response to the Cathars that the Inquisition was first established by the Catholic Church; it survived, in different countries and in various forms, into the nineteenth century as one of the premier organs of Stupid. The last victim of the Spanish Inquisition was a schoolteacher executed for teaching deism in 1826, by which time the United States had had four presidents who were deists. By the time of its abolition, the Spanish Inquisition could look back on a job well done—it had overseen the execution of somewhere between three and five thousand people and the condemnation of hundreds of thousands more, a huge number of them Jews or Jewish converts. (Forced conversions of Jews and Muslims were common as Christian rule was re-established on the Iberian peninsula up to the end of the fifteenth century.)

But the Catholic Church’s zero-tolerance approach to heterodoxy didn’t always work, and heresies of one kind and another routinely cropped up around Europe. Even as Martin Luther was indulging himself by vandalising the door of a Wittenberg church, there persisted in England a sect called the Lollards (chiefly celebrated today as the first internet meme) with a sort of Central European subsidiary called the Hussites.*

By that stage, however, the Church had a new problem, and it ushered in a new era of paternalism. In an earlier chapter, we looked in detail at the role of printing in the schism between reason and emotion in the European mind after 1500. But printing was also rocket fuel for paternalism. The medieval Church didn’t have a problem with books, beyond the dearth of them—it was difficult enough storing and distributing them and making correct copies of key sources for scholars. The arrival of printing fundamentally changed that, making the sheer number of books a problem, because books could spread ideas. While the Church and individual rulers had previously suppressed certain inconvenient sacred texts or ones judged inauthentic, the role of printing in the spread of Reform ideas prompted the beginnings of modern literary censorship. From the 1520s, national Catholic churches began issuing indices of banned books, and the Vatican itself issued its first Index Librorum Prohibitorum in the 1550s.

Such lists only had moral force; it was secular rulers who ultimately implemented censorship. In England, Henry VIII, whose outcomes-oriented take on religion meant the state church varied depending on whom he wanted to marry and how he was faring financially, decided to add a particular entry to the list of banned books in 1543: the Bible itself.

Many arrogant and ignorant persons had taken upon them not only to preach, teach, and set forth the same by words, sermons, and disputations, but also by printed books, ballads, plays, rhymes, songs and other fancies, subtly to instruct the people, and especially the youth of the kingdom,* otherwise than the Scripture ought to be taught.

Thus, in one of the great moments of Stupid, to protect Christianity, Henry banned reading and discussing the Bible, despite funding and distributing an English Bible himself just two years previously.

Individual English translations of the Good Book had been banned before, but now reading any Bible was outright banned by the Act for the Advancement of True Religion—banned, that is, if you weren’t part of the ruling elite, because this was another example of paternalism with a hierarchy. ‘Noblemen and gentlemen’ and ‘noble and gentle women’ were permitted to read the Bible. For the lower orders, the penalty was one month in prison for reading it, aloud or silently, in public or private. Similarly, anyone who publicly discussed the Bible could be locked up for a month.

How many people were prosecuted for having a quick squiz at Genesis or reading the Lord’s Prayer is unknown, though it was unlikely to have been very many; within four years Henry was dead, but his example of absurd, and hierarchical, paternalism would live on. Governments naturally banned more than religious books. Early modern governments understood the threat posed by the new technology of printing, and controlled it through printing licences and copyright regulation. The Stationers’ Company became the official monopolist for printing in England, and as part of that deal supported the Tudors’ censorship regime; in the 1750s, the Parlement of Paris condemned Denis Diderot’s Encyclopédie, demanded it be submitted to theologians for approval and revoked its copyright protection, which meant the work, even though it was still published, was instantly pirated, leading to losses for its publishers. (We’ll come back to the Stationers, who demonstrated one of the eternal truths of Stupid: that the copyright industry will always support censorship and suppression in media.)

We encountered Diderot earlier as, eventually, a radical philosophe. Diderot had been briefly jailed for his published views on religion as a young man. He was also one of the early—if not the earliest—observers to note what we now call the Streisand effect, pointing out that censorship ‘encourages the ideas it opposes through the very violence of its prohibition’. A more sensible approach, he suggested, would be to allow bad ideas to be publicly aired and ridiculed—another idea that is still with us centuries later. Diderot’s case was helped by the remarkable grabquote-laden blurb for the Encyclopédie issued by an angry Pope Clement XIII when it was added to the Index Librorum Prohibitorum:

The said book is impious, scandalous, bold, and full of blasphemies and calumnies against the Christian religion. These volumes are so much more dangerous and reprehensible as they are written in French and in the most seductive style. The author of this book, who has the boldness to sign his name to it, should be arrested as soon as possible.

Putting to one side the idea of an encyclopaedia being written in a seductive style, if anything, the Streisand effect was a greater danger in early modern Europe than in the twentieth century. Prominent politicians, writers and officials kept up a high level of correspondence with each other, affording an alternative means of circulating ideas beyond books alone. Enlightenment readerships were smaller and often confined to a well-connected elite—but that meant they knew which books were being censored and banned, and often circulated copies among themselves, including across national boundaries, to see what the fuss was about. In particular, scholarly texts of no interest beyond academics risked being widely circulated if they became the subject of government censorship: Spinoza was a particular target of Enlightenment censors, and considered so dangerous that even works criticising his ideas were banned in parts of Germany—which of course merely led to their dissemination well beyond the academic elites who would have otherwise read them. Perhaps the Streisand effect should be renamed for the Dutch Jewish lens grinder/philosopher.

Moreover, the scholars and bureaucrats who implemented censorship policies were members of the same elites, oftentimes forced to balance implementation of official policies with personal, well-informed views. The Encyclopédie, for example, was greatly helped by the indulgence of Malesherbes, Louis XV’s chief censor, who supported the project and many other officially banned books and often gave censored publishers advance notice of his own raids.

Nonetheless, the most effective way for controversial authors to evade the personal consequences of censorship in early modern Europe was either to await the arrival of a friendlier regime—John Locke published his major political works after returning to England in the wake of William of Orange’s invasion, which drove out the Catholic James II—or to die: much of Diderot’s non-Encyclopédie work and Spinoza’s most significant treatise, which influenced generations of scholars and alarmed governments across Europe, were published after their deaths.

While Diderot was in prison for his views on religion, London was undergoing the first drug panic in history. Mid-eighteenth-century British governments, spurred by an outraged upper class (which voraciously consumed alcohol itself), launched an assault on gin consumption among poorer English people, which had risen dramatically off the back of trade protectionism, other forms of paternalism (heavy beer taxes) and inept regulation. The poor, British elites felt, drank too much and didn’t work hard enough. Attempts to regulate and tax gin out of the reach of poorer people were, it was felt, justified not merely by moral righteousness but on economic grounds: consumption of gin caused poverty and idleness in an economy struggling to compete with its European rivals, as well as fuelling riots and crime and degrading Britain’s military capacity.

As with twenty-first-century public health moralising—and, for that matter, Stupid generally—the evidentiary basis for the campaign against gin was flawed: consumption of gin dramatically increased up to the 1750s, but despite a rapidly growing population, London’s crime rate per capita remained about the same. And just as even the smallest level of drinking by pregnant women is now seen as bordering on criminal behaviour, gin was said to damage the capacity of English women to produce the healthy children required by a growing imperial power competing against continental powers such as France. But even contemporaries questioned the demonising of gin, pointing out that social conditions in London and the rioting of a growing lower class had more to do with degrading poverty and wretched living conditions than alcohol.

Alcohol isn’t the only paternalistic obsession that keeps coming around again and again despite the passage of centuries. The wave of Stupid engendered by the arrival of the new medium of the printed book was replicated repeatedly as new media emerged in the twentieth and twenty-first centuries. Catholic groups led the charge against the morality of movies in the United States in the 1920s (partly because a number of studios were headed by Jews) and the result was over thirty years of self-censorship by Hollywood. Of particular concern for movie censors was—wait for it—the impact of movies on children, whose ‘sacred . . . clean, virgin . . . unmarked’ minds might be corrupted by films, although early efforts to find any evidence for this foundered. There was also evidence, suppressed at the time, of a Mae West effect: boycotts by the Catholic Legion of Decency increased ticket sales for controversial films. But even in the 1920s, complaints about films were already well-established—the Women’s Christian Temperance Union had lamented the effect of films on youth as early as 1906, and in 1914 blamed them for violence and delinquency.

The new technology of radio, too, was seen as damaging fragile young minds, discouraging healthy activities like reading, and driving children to delinquency through exposure to radio serials like The Shadow. Who knew what evil lurked in the hearts of men? Paternalists knew—despite, yet again, the dearth of evidence of any negative impacts. Then it was television’s turn to desensitise children, encourage violence and undermine morality: by the 1970s an entire academic industry existed dedicated to charting the impacts of television on children, while morals campaigners like Mary Whitehouse in the UK demanded censorship of sex and violence on the box. Alas, the evidence for the negative impacts of television, like that for radio and movies, was hard to track down.*

Meantime, music had become the preferred target for hand-wringers. Indeed, music had long been the target of Stupid: the waltz had caused a remarkable scandal in the early nineteenth century, when it was regarded as indecent and fit only for prostitutes and adulteresses. First African American music in the 1950s (insert racist stereotyping here), then white versions thereof, then drug and anti-war songs in the 1960s, all caused alarm among concerned paternalists, until rap and hip-hop generated full-blown moral panics in the 1980s and 1990s. The American Academy of Pediatrics mused in 1996 that there were few studies of the impact of explicit lyrics in ‘heavy metal’ and ‘gangsta rap’, and no link proven between sexually explicit or violent lyrics and adverse behavioural effects. This was partly because, the study found, many teenagers had no idea what the lyrics of their favourite songs were—although sadly the opportunity for a doctoral thesis on the link between mondegreens and youth crime doesn’t appear to have been taken up. Heavy metal band Judas Priest even found themselves in court in 1990 facing claims their alleged backmasked message of ‘do it, do it, do it’ in a song had driven two men to attempt suicide.

And by that stage, music was already losing its menace to video games, first in arcade form (remember them?) and later on home consoles, which became the new bogey, encouraging (yes) delinquency and, later, warping young minds with sex and violence.

Paternalism goes online

Printing, movies, radio, music and TV all prompted paternalist responses; in fact, about the only new communications technology that didn’t induce Stupid was the fax. But the internet was like all of these rolled into one, unleashing a new drive for censorship from people concerned about the dire impacts of new forms of content delivery. And while modern Henry VIIIs have sought to censor, block or otherwise disrupt the internet outright because of its potential to foster political disruption, any number of paternalists have blamed the internet for social problems as well. Handily, however, the internet only became widely used in the 1990s, so we have plenty of data to measure the alleged impacts of the nefarious series of tubes on society.

Let’s take suicide, for example, and especially youth suicide. The internet—presumably replacing the collected works of Judas Priest—is often held to be a key cause of suicide among young people. It used to be online ‘death pacts’ and ‘suicide websites’ that were driving people, and especially our vulnerable youth, to kill themselves. These days that’s been replaced by cyberbullying, and more latterly the threat of ‘trolls’, phenomena held to regularly drive young people to take their lives. Indeed, a small ‘cybersafety’ industry has grown up in Australia that makes money from purporting to advise schools, governments and professional associations about online bullying, child cybersafety and moral panics like teen sexting.

Now, it is true that many young people have, tragically, taken their lives in response to bullying, online and off. But is the problem getting worse? What does the data tell us?

In Australia, the overall suicide rate has fallen in the last twenty years.* In 1996, the overall death rate from suicide was 13 per 100,000 people. In 2011, it was 11.2 per 100,000. The rate has fallen particularly for men, from 21 to below 17 per 100,000. The death rate among males under thirty has also fallen significantly, by between a quarter and a third, since 2002—despite media claims that suicide is a ‘cultural epidemic’ among young men; in 2012, the teenage male suicide rate reached an eight-year low and was at the second-lowest level since the 1990s.

True, the decline isn’t consistent—there’s been no fall in suicide rates among indigenous people since 2001; rates in some states have fallen faster than in others, and there has been a rise in recent years in the suicide rate of teenage girls, traditionally the demographic least likely to take their own lives. But whatever the specific causes of the overall fall in the number of people taking their lives, it has coincided with the spread of the internet.

Perhaps Australia is unusual. How about the United States? The overall suicide rate in the US is about where it was in 1996, and below where it was in 1991, at around 14 deaths per 100,000 people. Suicide among American ten- to twenty-four-year-olds peaked in 1994 and is well below that level now. Youth suicide has also fallen significantly in the UK since the 1990s, as it has for people over sixty. Overall, the UK suicide rate was 12.4 per 100,000 in 1995 compared to 11.8 in 2011.

But if there’s little evidence for a connection between the internet and rising suicide rates, what about the supposedly degrading effects of pornography, which is suddenly far more available than in the analogue era, when it required a trip to the local newsagent or, if harder stuff was your fancy, actual sex shops? There are plenty ready to declare that today’s ‘epidemic’* of pornography, and men with a ‘porn addiction’, lead to rape, whether they’re speaking from a feminist or a religious perspective. Others claim it is warping the minds of young men and causing problems in their relationships. Attacks on internet pornography often combine these themes and other tropes of paternalism, although it is rare to find as many crammed into one article as in a March 2014 piece, ‘Campus rape culture linked to online porn’, from Canada’s Western Catholic Reporter, about ‘the widespread availability of increasingly violent and degrading pornography called Gonzo porn on the Internet’:

Catholic therapist Peter Kleponis, who specializes in men’s issues and porn-addiction recovery, said in an interview that he sees a ‘big relationship’ between pornography and the ‘violent, sexual aggression we see among young men today . . .

‘Now kids have access not only through computers, but through smart phones, tablets and various gaming systems such as Xboxes, PlayStations and Wiis . . .

‘. . . men are learning it’s “okay to get a woman drunk and get a bunch of guys together to rape her” . . .

Kleponis called porn ‘the new drug of choice’ . . . that ‘it can easily come and take your life without your even knowing it . . . It’s the new crack cocaine . . .’ Except unlike crack, with porn there are no ‘gateway drugs’ gradually leading to it. First time exposure is generally to hard-core deviant porn.

And some governments actually believe this: the UK government recently went so far as to ban online rape depictions as part of its (entirely useless) internet filtering scheme, which will at least have the fortunate consequence of preventing the online distribution of Fifty Shades of Grey.

But the data doesn’t support claims about the impact of online pornography. Putting aside the issue of reporting rates, the sexual assault rate in Australia in 2012 was 80 per 100,000 people, roughly the same as the rate of 85 in 2000 and the rate of 79 in 1996. In the United States, sexual assaults on women declined by more than half between 1994 and 2010.*

Has pornography led to unhappier relationships? It’s a hard claim to prove or disprove. For what it’s worth, the Australian divorce rate has been declining since 1996 (and at a much faster rate than the decline in the number of marriages); the US divorce rate has been declining since the 1990s and more broadly since the 1970s; the UK rate has been declining since the early noughties; and the Canadian rate has been relatively stable for the last decade.

Whatever the impacts of the internet on Western societies, there is little evidence that it has prompted a rise in suicide, or that the ‘porn epidemic’—and access to more graphic and exotic kinds of pornography—has had any impact on sexual assault levels or relationships; at the very least, those who would censor, filter, block or otherwise play nanny to the internet have to demonstrate how falls, or even bigger falls, in suicide and sexual assaults and divorces would have occurred but for the negative impact of the internet.

A key characteristic of this form of Stupid, whether it’s focused on reading the Bible, gin, waltzing, silent movies or rap music, is profound historical ignorance. Any attempt to link one particular phenomenon to crime inevitably founders on the fact that Western societies are dramatically less violent now than historically. The Australian Bureau of Statistics concluded that homicide in twentieth-century Australia was significantly lower than in the nineteenth century (and that is presumably before Aboriginal victims of white settlement are counted). Estimates of the US homicide rate show it has fallen by more than half since the mid-nineteenth century and is continuing to fall, notwithstanding government-engineered surges in homicide rates caused by paternalism (Prohibition and crack cocaine) in the twentieth century. Similarly, European data shows big falls in homicide rates compared to earlier centuries. But data, of course, is no match for anecdotes: you might be able to point to long-term declines in rates of violent crime, but I know some guy who got punched when he was out drinking one night, which demonstrates how the world is going to hell in a handcart and something must be done about gin/video games/heavy metal/waltzing/reading the Old Testament.

Comparing not merely drug laws but health-motivated ‘soft paternalism’ to religious persecution and censorship may seem a stretch to nanny-state types, but while differing in methods, all reflect the same logic: that the powerful have both the superior knowledge and the right to make decisions for the welfare of the less powerful, and to impose those decisions on them or use resources to seek to influence them in the desired direction.

This is why the same paternalist targets, themes and rhetoric repeat throughout history in a recurring cycle of Stupid. Paternalism is the vehicle for very old elite attitudes toward ‘sin’ behaviours—sex, drinking and other drugs, bad diets, popular entertainment and gambling. It is always the lower classes and less powerful who are the target of paternalism while elites are left alone; threats are hyped to justify dramatic action; the need to protect children is always invoked; remarkable powers of manipulation are attributed to external agencies; serious impacts (now called ‘social costs’) are asserted without evidence; women and the young are targeted for special restriction. And in their rhetoric, modern-day public health advocates are hard to differentiate from the panicked middle classes of Hanoverian England; moralists who want to censor the internet are indistinguishable from the groups who railed at silent movies; anti-pornography campaigners are hard to tell apart from the Athenians who executed Socrates.

Part of the impetus for paternalism now comes from progressives, who since the success of liberal economics in the 1980s and 1990s have embraced forms of paternalism as the primary tool of social engineering. Having, in effect, conceded the fight on basic economics in favour of liberal capitalism, sections of the left now look to achieve progressive goals not through economic reforms to restore the sort of communitarian government control lost with the economic reforms of the 1980s, but through changing the ‘choice architecture’ people face in consumption decisions. Rather than wage and price fixing, tariff barriers, high taxation and a fixed currency, we have price signals, regulation and ‘nudge’ policies that seek to amplify the impact of social norms on people in deciding how they spend their money and time.

This was vividly demonstrated in a much-cited 2013 Lancet paper, ‘Profits and pandemics: Prevention of harmful effects of tobacco, alcohol, and ultra-processed food and drink industries’, by some of Australia’s most senior public health figures. The paper argued that large corporations were in effect deadly viruses themselves:

The term industrial epidemic has been used to describe health harms associated with various goods including tobacco, alcohol, vinyl chloride, asbestos, cars, and the food and drink industries. In industrial epidemics, the vectors of spread are not biological agents, but transnational corporations. Unlike infectious disease epidemics, however, these corporate disease vectors implement sophisticated campaigns to undermine public health interventions.

This is an elegant example not merely of how paternalism refracts an economic tradition hostile to liberal capitalism, but of the now-common technique of pathologising what paternalism opposes. Thus, drinking at home becomes ‘pre-loading’; pornography becomes a ‘sexual addiction’; and corporations become vast, world-straddling disease vectors that, like good B-movie villains, fight back against the plucky heroes trying to save humanity.

This has the effect of reinserting the state into the regulation of low-income citizens’ economic choices, from which liberal capitalism had withdrawn it, albeit via different, more subtle mechanisms. Low-income earners are perceived by paternalists to be less capable of making informed, competent decisions about their consumption than paternalist decision-makers in academia and government. In particular, they are seen as more prone to being manipulated by corporate interests—poorer Australians, apparently, are easily swayed into becoming addicted to nicotine, they’re gulls for the clubs industry which wants them to throw away their money on poker machines, they’re prone to drinking too much because of the alcohol industry’s incessant advertising, they look at the ‘crack cocaine’ of online pornography too much and they eat too much bad food sold by multinational companies. Liberal capitalism has delivered greater wealth even for low-income earners compared to two or three decades ago, and lifespans continue to increase, but, like their nanny-statist ancestors warning about the wiles of Satan, paternalists believe that low-income earners are incapable of navigating liberal capitalism for themselves, that they are unable to resist the dark arts of the marketing industry and that they are thus in need of a forceful hand to guide them in how they spend their income.

But in arguing that we should construct ‘choice architecture’ to encourage people to be healthy, happy participants in capitalist society, maximising their productivity and capacity to consume and, in the case of women, bear the next generation of healthy workers and consumers, progressive social engineers turn out to closely resemble conservative policymakers. Much of the economic agenda of the right in Australia has been based on transforming low-income workers into aspirational, shareholding, private education-and-healthcare small business owners. They have been offered financial incentives to have their children schooled privately and to use private health care; ‘mum and dad’ investors were encouraged to acquire shares in major government privatisations; the taxation system was structured to encourage them to shift from being employees to ‘independent contractors’; they were encouraged to shift from having mere employment-based superannuation into ‘wealth management’. At the same time, the left wanted to encourage them, via regulation and ‘nudging’, to be healthy consumers who disdain the right sins (drugs, gambling, bad food), live long, economically productive lives, bear children the right way and then nurture them properly.

For both sides, inside every working-class person is a bourgeois just like them, waiting to be set free. As a consequence of their joint efforts, the average human type will rise to the heights of a healthy small business owner, a nonagenarian self-funded retiree, or a McMansion resident with a vegie patch.

And above this ridge, new peaks of Stupidity will rise.

BK


*     Similar logic is to be found in the warning of the head of the Australian Security Intelligence Organisation that the internet allowed ‘unfettered ideas’ to radicalise people in their ‘lounge rooms’, conjuring a nightmare scenario in which you might be relaxing watching television or enjoying some time with your family, when you’d suddenly be transformed into a jihadist ready to wage war on the unbeliever.


*     Named after Jan Hus, who was executed by the Church after being given a promise of safe conduct to the Council of Constance—promises to a heretic didn’t need to be kept, it was decided. That turned out to be probably the most expensive broken promise in history, resulting in fifteen years of war in Czech lands, the complete desolation of Bohemia and the Church being forced to accept the Hussites until the Reformation overtook events in the sixteenth century.


*     There are the kids again!


*     Older readers might remember the seventies factoid that ‘the average American child will have watched 8000 murders on television by the age of twelve’.


*     The Australian Bureau of Statistics has expressed concern about the accuracy of suicide statistics, in particular those from before 2007, but the issue relates to underreporting. That is, the likely level of suicide was even higher in the 1990s and early 2000s compared to now.


*     The reader will have noted that ‘epidemic’ is a recurring word in paternalist-speak.


*     Again, this is unlikely to be uniform—women from indigenous backgrounds, women with disabilities and women with abusive partners are more likely to suffer sexual abuse.