I have already praised Marshall McLuhan for his most famous cryptic pronouncement—that “the medium is the message.” He intended this aphorism to convey the idea that the nature of different information technologies affects the sensibility and nervous system of recipients more profoundly than the content of messages carried over various communication channels. Armed with this insight, McLuhan emphasized the mesmerizing effects of television and the way it was transporting the children who were growing up with it into a never-never land detached from the world of adults.
Such an understanding of the effects of that new medium could lead to the conclusion that television in and of itself provided stimulation that was much more compelling than anything children and adolescents had experienced earlier. TV screens can, indeed, be seen as “supernormal stimuli” which evoke a much more powerful physiological response than any natural objects or events.1 In recent decades, these screens have become bigger and brighter, and the intensity of the stimulation they provide has grown. This effect is only intensified by the shift to high-definition broadcasts and TV sets2 as well as the addition of 3D, internet connectivity, app support, and other add-ons to newer models. As communications professor Byron Reeves has noted, a big-screen TV “may turn up the volume on whatever emotional responses would have been experienced with a standard presentation.”3 Since focusing on something requires that the brain automatically turn down its reaction to everything else, lasting supernormal stimulation can make sustained attention toward what matters much harder.4 Computer and gadget screens are smaller, yet interaction with them is a lot more intimate. Over extended periods of time, they mediate forms of social stimulation and access to curious trivia, often preoccupying every spare minute in their owners’ hectic schedules. Moreover, use of digital devices tends to supplement extensive exposure to television, with the two media creating an increasingly compelling virtual cocoon.
According to neuroscientist Gregory Berns, what the brain really “needs” most may be information, pure and simple. In his words, “neurons really exist to process information… If you want to anthropomorphize neurons, you can say that they are happiest when they are processing information.”5 And, in pursuing this kind of info-grazing “happiness,” we may indeed come to “resemble nothing so much as those legendary lab rats that endlessly pressed a lever to give themselves a little electrical jolt to the brain.”6 Like them, we are sensitive not only to the substances or experiences providing stimulation, but also to the cues we have learned to associate with these—and the latter can be dissociated from the former, and come to provide heightened sensory and affective stimulation in and of themselves.7
Once staring at glittering screens, particularly those providing access to the internet and various forms of virtual interaction, turns into a compulsion, affected individuals would need ever more intense and prolonged exposure to the constantly refreshed images, sensations, and information generated by screens of various sizes. They would seek to recapture the earlier excitement—but would be unable to derive lasting satisfaction, with the “real world” feeling slightly underwhelming against the backdrop of compulsive self-stimulation.8
A few years ago, psychologist Philip Zimbardo described in a controversial TED talk a more sinister version of this curse. His impassioned argument was followed by a TED e-book and a paperback (both co-authored with Nikita Coulombe) expanding on his initial points.9 Zimbardo believes too many boys and young men succumb to a broadly defined “arousal addiction.” They become hooked on video games and online pornography, and are in constant need of novelty and variety in these forms of gratification. Moreover, the hyperarousal associated with video games and online pornography is the tip of a larger iceberg of pervasive stimulation and virtual indulgence, all made possible by our enchantment with digital technology. Once young brains learn to crave and expect such overstimulation, the ability of their owners to feel similar excitement from books, ideas, and even physical objects (or human bodies) would tend to fade.
But even the combined power of television and screen-based digital attractions may not in itself suffice to explain their hold over masses of “users,” particularly children, adolescents, and young adults. Interacting with screens of various forms and sizes may become even more appealing for individuals whose nervous system has additionally been “softened up” by the compulsive consumption of a deluge of “supernormal stimuli.” Neuropsychologist Deirdre Barrett, who popularized this term, has pointed to super-tasty fast food, television, and pornography as some of the most obvious examples of such excessive levels of stimulation.10
The syndrome she describes, however, may be even broader. As Peter Whybrow has pointed out, our brains are evolutionarily adapted not just to weaker stimuli, but also to general scarcity. They are therefore poorly equipped to deal with the overabundance of goods and experiences churned out by a technologically advanced market economy. These can hijack and throw out of balance the dopamine system of the human brain and induce a state of “mania,” or a permanently altered state of consciousness. This overall effect is reinforced by exposure to, and consumption of, “superpalatable” foods engineered to provide “supernormal” temptation and facilitate addiction,11 novel “cool” products, excessive and often unpredictable social rewards, and so on.12
This overall picture should also include the pervasive, incessant generation of enticing sensory cues through marketing and advertising. All these inputs, plus a deluge of screen-mediated essential and trivial information, almost constant access to sensory and social self-stimulation, other forms of real or virtual thrill seeking (including video games with in-built addictive properties), and overall stress form a powerful torrent of unrelenting overstimulation of the human brain’s motivation system.13
Psychologist Fred Previc has similarly described a mass “hyperdopaminergic syndrome” linked to chronic overstimulation of the human brain’s dopamine pathways and related epigenetic adaptations. In his view, such neurophysiological adaptations to the onslaught of modern living underlie more specific symptoms like emotional cooling and detachment, individualism, relentless goal-directedness, excessive risk-taking, utilitarian decision-making, multiple chemical and behavioral addictions, as well as the increase in recognized psychiatric conditions like autism, schizophrenia, hyperactivity, obsessive-compulsive disorder, bipolar disorder, mania, delusions of grandeur, etc.14
Previc’s concerns have in fact received support from many recent studies which have contradicted earlier, more optimistic conclusions. Journalist Tony Dokoupil has reviewed some of these less sanguine findings indicating that our increasing digital immersion may lead to mental illness and clinical delusion.15 He has noted wryly that “the black crow is back on the wire.” He has also warned his readers not to kid themselves since “the gap between an ‘Internet addict’ and John Q. Public is thin to nonexistent.” Previc, however, believes that even within the “normal” psychiatric range manic or hyperdopaminergic tendencies may lead to “delusional behavior and, in a milder sense, rationalization and denial.”16 Like Whybrow, he sees diagnosed mental illness as the tip of a much larger iceberg, posing more obvious problems on a spectrum of developmental skewing.
Previc’s concerns are indirectly corroborated by studies that are different from those reviewed by Dokoupil. For example, survey data show that a growing proportion of American adolescents classified as “overweight” do not perceive their weight as abnormal.17 According to other surveys, the vast majority of Millennials are optimistic they will be as or more prosperous than their parents—despite the hardships they face and dire predictions from experts.18 In a 2011 Gallup poll, 64 percent of respondents identified “big government” as the biggest threat to the United States, with “big business” coming a distant second at 26 percent.19
Such findings do seem to capture some larger tendencies—which are corroborated by more impressionistic observations. For example, one can wonder at the overall persistence of the “American dream,” or the idea “that the greatest economic rewards rightly go to society’s most hard-working and deserving members,” amidst fast-rising inequality (to levels last seen before the Great Depression) and the progressive hollowing out of the middle class.20 The fusion of reality show and politics in the bizarre persona of Donald Trump, and the grassroots resonance of the image he projects, seem similarly puzzling. Moreover, his main Republican opponent was someone who claimed to stand for America’s humiliated and insulted—while he himself had been educated at Princeton and Harvard, had held senior government positions before he was elected to the Senate, and is married to a Goldman Sachs executive. The list could go on and on—including, of course, the continued flooding of the United States with military-grade assault rifles amid signs of growing mental disturbances among young males and a diffuse terrorist threat.
I do fear the changes in neural functioning induced by chronic stimulation can be analogous to those associated with recognized forms of addiction to chemical substances.21 They can also be related, however, to an overall dependence on overstimulation (or self-stimulation) of the motivation system of the human brain—as opposed to any specific substance abuse or specific compulsive behaviors. Part of this broader syndrome is what psychologist Stephanie Brown has described as an addiction to speed and ever more hectic modes of work and lifestyles.22 Such a “meta-addiction” is at the heart of consumerism and various other forms of continual sensation-seeking. Over the past two decades or so, this compulsion has been compounded by an intensified drive to induce new, potentially limitless and largely “virtual” needs and desires in millions of future consumers and money-makers from a very young age.23 With the explosive growth of information technology, these tendencies have been supplemented by an ever intensifying “infomania.”
The fatal attraction of a broad variety of “infomanic” behaviors, however, raises an obvious question. Why have so many of us, and of our children, succumbed to these if the dysregulation of the nervous system they induce has obvious existential downsides? As in the case of more specific addictions, focusing on the inherent addictiveness of chemical substances or compulsive behaviors may be misleading. It is widely recognized that many individuals who sample such substances and behaviors do not develop a full-blown addiction.
In the early 1970s, psychiatrist Lee Robins discovered that about 20 percent of American soldiers sent to Vietnam had become addicted to heroin. The majority, however, were able to kick the habit on their own after they returned home. Apparently, the high levels of stress in a combat environment had facilitated the development of what had been considered a purely chemical addiction; and once “addicts” returned to civilian life, most no longer needed the substance they had abused. Only a minority of former soldiers relapsed: perhaps those whose nervous systems had been compromised by previous experiences or more traumatic insults, and who had found adaptation to “normal” life too overwhelming.24
The role of stress in facilitating substance abuse has also been demonstrated by—what else—clever experiments with animals. As it turned out, the tendency of the proverbial lab rats to self-administer drugs to the point of complete exhaustion and even death was not entirely natural; it seemed to owe much to their isolation in small, barren cages. In the late 1970s, Canadian psychologist Bruce Alexander set out to test this hypothesis by building what he and his colleagues called “Rat Park”—a giant enclosure offering a colony of lab animals a habitat as similar as possible to their natural environment. It provided sufficient space for free movement, exploration, socializing, and mating, and contained plenty of food and “playground” equipment.
Placed in this environment, most rats had very little appetite for sipping morphine-laced water as lab animals had done when kept in isolation in small, bare cells. Alexander concluded that under “normal” circumstances most animals and humans would not develop an addiction to the substances they would abuse under conditions of severe stress and deprivation.25 But does human life in contemporary society resemble rodent life in Alexander’s Rat Park? And could most individuals inhabiting such increasingly stressful and overstimulating social milieus ignore potentially addictive substances if offered virtually unlimited access to these? Or curb compulsive behaviors if these are spurred by powerful technological and socioeconomic forces? The obesity epidemic spreading around the world since the 1980s, which straddles the divide between substance dependence and compulsive behaviors, does not inspire much confidence in such human temperance26—and it is only one poignant example.
Of course, even many traditional communities have failed to develop the habitual sobriety Alexander might have predicted for them. They have long used mind-altering substances as a gateway to an altered state of consciousness and communion with the spirits inhabiting their worlds. Still, modern living seems to have induced a level of overall stress that produces a much more acute need for self-medication and stronger behavioral compulsions—a tendency which has markedly accelerated in recent decades. Such generalized distress is related to the overall increase in social complexity, pace of life, density of interactions, material abundance, technological saturation, and information overload associated with “late modernity.” It is made even less bearable by the growing fragmentation and “disenchantment of the world,” the loss of communal support and of spiritual horizons which can help most individuals withstand and make sense of the existential maelstrom engulfing them on a daily basis.
The suspicion that modern societies and technological progress create a deeply unnatural and potentially unhealthy habitat27 is hardly new, nor was it confined to Romantic and conservative writers and thinkers. Nietzsche was particularly wary of the neural overstimulation individuals seemed to experience in the hectic, noisy, and overcrowded cities of the time. He worried about the “massive influx of impressions” that “press so overpoweringly—‘balled up into hideous clumps’—in the youthful soul; that it can save itself only by taking recourse in premeditated stupidity.”28
Many other intellectuals shared Nietzsche’s testy premonitions as they observed the increased levels of audiovisual pollution and overcrowding in the growing industrial cities of the 19th century. In medicine, the increasingly frequent diagnosis of “neurasthenia” (or “nervous exhaustion”) reflected similar concerns. The resonance of those worries, however, gradually weakened as the hectic bustle of the big city became the new normal. When at the turn of the twentieth century German sociologist Georg Simmel raised similar concerns, he was a lot more sanguine about the trends he analyzed.29 He described how urban inhabitants needed to develop an emotional distance from their daily experiences and from fellow city dwellers. In his view, such a “blasé” attitude allowed them to keep their psychological balance and function relatively undisturbed in a variety of social roles involving multiple social interactions.
In the 1960s, Simmel’s more benign interpretation was questioned by Marshall McLuhan, who described a degree of “auto-amputation” as a necessary adaptation to information overload.30 In the 1970s, futurologist Alvin Toffler raised a slightly different concern. He argued that an unprecedented degree of sensory and cognitive overstimulation was causing a “future shock”—a state of mental confusion and “blurring of the line between illusion and reality.”31
These impressionistic arguments received empirical support from a series of shrewd experiments designed in the 1970s by social psychologist Stanley Milgram. He observed patterns of communication among city dwellers, their willingness to help strangers, and other social behaviors. He concluded that the swarming of masses of diverse inhabitants in large modern cities created a psychological overload. As a result, individuals placed in such urban environments would be less likely to show compassion and extend help to others. In the scientific jargon he used, he defined overload as “a system’s inability to process inputs”—either because there are too many or because they come too fast (or both).32 After Milgram, many other sociologists adopted the view that urban settings produce an information overload which could account for much of the malaise of modern living. Yet, they rarely put into serious doubt the ability of most individuals to strategically adapt to a potentially unhealthy urban environment.
With hindsight, the apprehensions of Nietzsche and other modernization skeptics seem far more prescient. I am often reminded of Henry David Thoreau, who sought refuge in nature from Concord, which must have been a rather tranquil village in Massachusetts. Or Emily Dickinson, who imprisoned herself in her family’s estate. Or Marcel Proust, who a bit later sought to protect himself from the bustle and dust of Paris in a room insulated with cork paneling, and with windows covered by heavy curtains. Or Edvard Munch, who expressed his own mental exhaustion in a series of paintings which came to be seen as the paradigmatic expression of modern existential angst. The extreme reactions of such intellectuals, writers, and artists can be easily dismissed as the overblown, deeply idiosyncratic antics of a few disturbed minds. Certainly, Proust, Munch, and others did succumb to severe mental and physical maladies, and many more showed various subclinical symptoms. I tend to see them, however, as the proverbial canaries in the pit, whose heightened sensitivity made them more vulnerable to the generalized overload of modern civilization.
These stresses, meanwhile, had a much broader influence on intellectual and artistic life. They probably contributed to the erosion of formerly rigid authority structures, standards of proper behavior, and even artistic tastes. Conservative Spanish philosopher José Ortega y Gasset once described how in the late 19th century the young European intelligentsia quite suddenly lost their taste for representational art, harmonious music, rhyme, and the conventional narrative of the great novels.33 With a nod to this shift in tastes and attitudes, a few decades later Virginia Woolf (who eventually drowned herself) famously quipped that “on or about December 1910 human character changed.”34 That change found its most visible expression in modernist art, music, literature, and architecture, which all sought to break free from social conventions and contexts.
New artistic and literary fads were closely related to various quirks in personality and behavior—for example, the rise and disappearance of the urban flâneur;35 or unceasing attempts to shock the bourgeoisie not just artistically, but also through daily provocations—like taking a turtle or a lobster for a walk. Meanwhile, new “social sciences” like psychology, sociology, economics, and political science sought to mimic the scientific method developed in the natural sciences in search of similarly “objective” knowledge. Even philosophy was swept by a similar trend with the rise of “logical positivism” and “analytic philosophy.” In the 1960s, such modernist pursuits were disrupted by the rise of “post-modernist” artistic and intellectual trends. Despite their cultivated diversity, those were united by a common rejection of the search for objective truths and deep essences, repudiation of artistic and social hierarchies, preference for random sampling from different styles and epochs, an overall sense of ironic detachment, and praise for social disinhibition and the subversion of arbitrary norms.36
These changes in modes of artistic expression and knowledge paradigms are often attributed to an evolution in styles or ideas; or to the spread of particular new technologies like photography (which ostensibly prompted artists to leave behind attempts to represent aspects of “reality”), or the typewriter (which seemed to encourage a more telegraphic writing style). To my mind, changing artistic and intellectual fashions in the late 19th and early 20th centuries were more closely related to shifting sensibilities—reflecting modifications in patterns of neurophysiological functioning under the influence of urbanization, industrialization, mass literacy, increased social complexity, technological change, etc. These trends have long been recognized as benchmarks of social “modernization.”37
At the heart of these neurophysiological adaptations has arguably been a growing affective-visceral desensitization and dissociation (mostly sub-clinical) under the impact of sensory and social overstimulation. That was essentially the process observed by Simmel and Milgram among inhabitants of big cities. Further changes in sensitivity were also measured by the researchers who tested German children several decades ago and postulated the development of a “new brain.” And in recent years neuroscientists have observed additional modifications in the brain wiring of “digital natives.”38
It was Tocqueville, however, who saw it all coming long, long ago. Reflecting on the social and psychological changes associated with the waning of “aristocratic society,” he noted that “the bond of human affection is extended, but it is relaxed.”39 As a result, we would sympathize with the victims of an earthquake in Haiti, a deadly epidemic in western Africa, ceaseless fighting in the Middle East, risky voyages in the Mediterranean or Aegean, or vicious terrorist attacks. We may even make a modest or, in the case of someone like Bill Gates or Ted Turner, a sizable financial contribution to help alleviate acute problems in distant lands or closer to home. But we may not be moved too deeply by the personal misfortune of a neighbor or of a cousin living on the other side of town.
While Tocqueville admired many aspects of the “democratic” society he found in the United States, and had little doubt that it was the wave of the future for France and other countries, he was also suspicious of the potentially excessive individualism it fostered. He thought such individualism was qualitatively different from the selfishness which had existed even within tightly knit medieval communities. While that older selfishness was an expression of self-love, a strong emotion, individualism was colder and more calculating. As the new kind of society that had developed in the United States was based on civil equality and market exchange, it induced in individuals a false sense of self-reliance. In Tocqueville’s view, such a social context could “make every man forget his ancestors” since it “hides his descendants and separates his contemporaries from him; it throws him back forever upon himself alone and threatens in the end to confine him entirely within the solitude of his own heart.”40
This radical social detachment has also included the loss of the larger ontological or existential horizon associated with transcendent religious belief. In this broader sense, it was later noted and theorized by many other intellectuals who gave it different names: alienation, estrangement, disenchantment, desacralization, profanization, etc.; or wrote about an overall social malaise, anomie, ennui, civilization’s “discontents,” a loss of purpose and meaning by a “homeless” modern mind, etc. These concepts have slightly different connotations, but they all point to an existential void which opens as modern individuals become detached from previously secure communal or spiritual moorings.
This sort of disconnect is highlighted memorably in Leonardo DiCaprio’s documentary, The 11th Hour. In it, a string of respected biologists and other scientists ponder essentially the same question: If we humans are so obviously part of nature, why have we forgotten this fundamental truth and adopted a blindly exploitative (and potentially self-destructive) attitude toward the natural world? Ironically, the answer to this all-important question may be partly rooted in our own biology. Onetime internet proselytizer Douglas Rushkoff has similarly lamented the disconnect he sees as pervading all of American (and not just American) life, attributing it mostly to a conquest by an omnivorous “corporatism.”41 Taken to an extreme, such tendencies can produce a nagging sense of dissociation and “unreality.” Such sensations may be weaker than those resulting from acute trauma or overwhelming stress,42 or may not even present themselves as abnormal or problematic. They can still be seen, however, as a diffuse syndrome—which can probably explain why the first Matrix movie struck such a chord.
At the heart of such existential detachment probably lies a common neuropsychological syndrome—a gradual numbing of the senses and gut feelings as a necessary adaptation to a more cacophonous, hectic, and technologically saturated social environment. Modernization has, indeed, often been associated with a degree of emotional cooling—a notion captured perhaps most strikingly by Marshall McLuhan’s observation that “the Westerner appears to people of ear culture to be a very cold fish indeed.”43 Meanwhile, neuroscientists have come to recognize that the meaning or significance of particular experiences, events, and narratives is largely derived from their affective resonance and the general coupling of thought with physiological arousal.44 But the link between affective-visceral desensitization and the loss of transcendence and meaning in the modern age has not been commonly made.
I do suspect the link between our overall desensitization and the loss of existential moorings (or fetters, in the more optimistic interpretation) reflects a general state of “meta-addiction” and pursuit of overstimulation—resulting in chronic existential (in psychiatric jargon, “allostatic”) overload. This syndrome has developed under the influence of a social and technological milieu that has become increasingly overwhelming for human brains and organisms that are evolutionarily adapted to life in a mostly natural environment marked by general scarcity and face-to-face interactions within small communities.
Such a jaded sensibility is exacerbated by the ubiquity of screen-mediated access to information, communication, and gaming. It probably underlies most expressions of indifference or ironic detachment, desperate attempts to recapture a slipping sense of vitality and authenticity in contemporary societies, and increased apathy among students at various levels. This set of challenges inspired Bill McKibben to call at the dawn of the new millennium for a new kind of “mental environmentalism”45—reminiscent of Neil Postman’s earlier notion of “media ecology.”46