15

The Last Oasis

At this point, I would likely be expected to offer a solution to the problems I have described. Both Whybrow and Previc, while warning about the dangers of chronic stimulation and dopamine overload in the brain, emphasize that the neurophysiological changes they describe can be reversed.1 Of the two, Whybrow is notably less skeptical that such a reversal is achievable. He opens his latest book with an unsettling question he kept asking himself: “But why had the madness become so pervasive in Western culture?”2 Then, in the second part of the book, he offers a comprehensive plan for taking personal responsibility, living well, building a “well-tuned” brain, and resolving the existential problems plaguing contemporary civilization. Whybrow has also dedicated a chapter to changes in education that could help defuse the manic-addictive tendencies he describes.3 He believes that, “ideally, formal education should be an extension of early parenting”—building upon ties of care and attachment, and tapping the “young students’ maturational drive to forge and hone the adaptive skills that will serve lifelong autonomy and self-directed growth.”4 Other authors have also felt a need to devise personal solutions to the similar problems they describe.5

Zimbardo and Coulombe have offered a more systematic approach. They hope the dangers they highlight can be addressed with the joint efforts of government agencies, schools, media and technology companies, parents, and young men and women made aware of the new challenges.6 Even neuroscientist Susan Greenfield ends her latest book on “mind change,” a far-reaching shift in the human mind and brain (analogous to “climate change”), with a four-step action plan.7

A Credentialed Digital Cassandra?

Professor Baroness Susan Greenfield has drawn the ire of many neuroscientists with her dire warnings regarding the overall impact of digital technology, particularly on young brains and minds. The musings of amateurs like Nicholas Carr (or myself) can be easily dismissed—not so those of a fellow brain “scientist.” Or perhaps they can? The British Medical Journal has published an editorial faulting Greenfield for making baseless alarmist speculations in the media—rather than publishing her “claims in the peer reviewed scientific literature, where clinical researchers can check how well they are supported by evidence.”8 One of the editorial’s authors, Vaughan Bell, had previously written a scathing review of Greenfield’s book, Mind Change.9 In it, he had accused Greenfield of systematic “confusion between correlation and causation and … apparent inability to distinguish which sorts of studies can provide the best evidence for each.” As a result, her whole argument had “the tone of a bad undergraduate essay in which a series of randomly encountered scientific findings are woven together with a narrative based on free association and a burning desire to be controversial.” By this account, Greenfield may be an esteemed neuroscientist, yet she lacks the basic research skills expected of a decent undergraduate student and is bent on alarmist sensationalism. Iain McGilchrist, a prominent psychiatrist and author, faced similarly dismissive rebuttals after he warned that many children were displaying borderline “autistic” behaviors as they spent so much time interacting with screens.10

My hunch is that such unguarded accusations reflect the predispositions of their authors more than the qualities and qualifications of the person being castigated.11 In her book, Greenfield deflects such criticism by claiming the right to draw broader conclusions from multiple studies and experiments. She also points to a common logical fallacy that should be familiar to her detractors (and perhaps even to undergraduate students): “absence of evidence is not evidence of absence.”12 The Baroness could have added that Cassandra’s curse, after all, was to tell the truth. In any case, is this fundamental disagreement a matter of differing competence or intelligence? More likely, it reflects disparate unconscious predispositions deriving from differences in affective-visceral processing, and the extent to which the latter is integrated into higher-order thinking.13

Many experts, meanwhile, do not seem to recognize that adaptation to a more demanding socio-technological environment may have significant downsides—since, in their view, there is no conclusive evidence for this. Neuroscientist Elkhonon Goldberg has even speculated about a political evolution analogous to the evolution of the brain. He has suggested that the world is undergoing a “transition from a world order built of a few large autonomous geopolitical units to a network of many small, highly interdependent geopolitical units.”14 In Goldberg’s view, this network could come under some sort of global authority—mimicking the integration of the brain through signals from the frontal lobes.15 Curiously, he retained that hope even after the September 11 attacks and the violent disintegration of Iraq.16

I do believe “we” are now facing an unprecedented existential crisis. Yet I would rather avoid the “last chapter problem” identified by professor of history and journalism David Greenberg. He has noted that almost every book analyzing social or political problems, “no matter how shrewd or rich its survey of the question at hand, finishes with an obligatory prescription that is utopian, banal, unhelpful or out of tune with the rest of the book.”17 I would rather not become one of the countless social critics who, in Greenberg’s words, have “succumb[ed] to the hubristic idea that they can find new and unique ideas for solving intractable problems.”18 In fact, I suspect the solutions most authors have offered reflect mostly their own “positivity bias” and faith in human agency, tendencies that are strongest among the educated elite, particularly within “WEIRD” (Western, educated, industrialized, rich, and democratic) cultures. Though these predispositions seem weaker in Greenfield, even she cannot quite escape the biocultural force field in which she is suspended.19

Free of any similar inclinations, I cannot in good faith offer a grand solution of my own. I suspect no amount of deliberate “interventions” can counter on a mass scale the perfect storm that engulfs us all. Alas, this storm seems set to gather strength given the competitive pressures of global capitalism, the relentless and explosive growth of information technology, the pull of the supernormal stimuli it generates, and the rapid proliferation of companies whose business plans are premised on subverting our neurophysiological balance and reflective self-control.20 As already noted, the effects of all these noxious forces are strongest on children and adolescents, whose brains (and bodies) are still exceedingly plastic. So I am hardly surprised by evidence that programs designed to boost executive control and nonacademic abilities like “grit” in elementary school students have had little impact on overall academic achievement.21 Or by studies which have found that a large proportion of high school students (or the majority in some high-pressure schools) display symptoms of mental illness.22

I believe, though, that we should look into the abyss and grasp the full extent of the challenges facing technologically advanced societies and the different generations inhabiting them. In my view, the trends I have described require a more evocative label, the way “global warming” is more resonant than the emotionally cooler term “climate change.” We could refer to a “Learning Hijack Syndrome” resulting from chronic overstimulation and dopamine overload. Or, if a more technical term still seems preferable, perhaps “Hyperdopaminergic Learning Syndrome” would fit the bill (with a nod to Previc’s notion of an overall “hyperdopaminergic syndrome” affecting late modern society23).

From this more holistic perspective, I have come to doubt that student success (or lack thereof) is primarily a function of pedagogy. I also suspect educational “attainment” is undermined mostly by the invasion of information technology and the introduction of more stimulating activities into the classroom—including digital self-stimulation by students accessing “social media,” sending and receiving text messages and email, surfing the web, playing games, etc. on their smartphones, tablets, or laptops. In a recent survey, American college students admitted to spending, on average, 20 percent of class time using digital devices for activities unrelated to class tasks or content.24

The goal of reading-centered learning and appropriate neurosomatic maturation (to say nothing of any solution to the broader problems I have addressed) may already be out of reach at the societal level. The tide of the future is quite clearly flowing in the opposite direction, and I do not quite see how it can be stemmed or even weakened. It seems far stronger than anything educational “interventions” can accomplish—no matter how much they struggle to make a virtue out of necessity. Moreover, many such schemes to enhance learning and build an ability to focus (on the basis either of neuroscientific research or of experiments seeking to establish practically “what works”) may only reinforce the sensory overstimulation students face outside of school. As noted earlier, some “evidence-based” interventions and cutting-edge techniques and technologies might help the most disadvantaged or least motivated students, or those with learning disabilities. They could also benefit many budding “analysts” with various interests and career prospects. But such interventions could be detrimental to students who still have the potential to achieve deeper immersion in reading as the central learning activity, and to become keenly attuned to larger social issues.

The difficulties which make any hopes for a long-delayed educational revolution unrealistic start from birth, if not earlier. From the womb, children are often exposed—and remain exposed—to an array of noxious influences. Too many are born by C-section25 or are not breastfed (or are breastfed for only a short period).26 A large number may not receive sufficient warm care and attention from parents who are overworked and frequently spaced out—glued to their own digital devices and not fully “there” for their children. When this is the case, behavioral addictions can become a “poor substitute for love” and mutually attuned interactions.27

Growing up in truly “nuclear” families, similar partnerships, or single-parent households, children are typically deprived of regular contact with grandparents and other extended family members. They are also handed over to various ersatz forms of caregiving when both parents need (or prefer) to work for pay. And too often children come from an early age to rely on screens for stimulating interaction and entertainment. This cocktail of noxious influences has only thickened since former primary school teacher and literacy expert Sue Palmer sounded the alarm about a “toxic childhood” syndrome a decade ago.28 These influences can be particularly detrimental to the highly sensitive children and adolescents described by psychologist Elaine Aron29 and journalist David Dobbs, among others.30

Excessive screen time became an issue for students of all ages long before the arrival of broadband internet access and hand-held digital devices. As I noted earlier, Marshall McLuhan believed that growing up in front of TV screens (like the protagonist of the 1990s sitcom Dream On) helped open the gap between the younger and older generations back in the 1960s. Then, in the 1980s and 90s, there was an apparent increase in the degree of understanding between parents and children who had grown up in a similar media environment. With the coming of age of the Millennials, and of each succeeding microgeneration immersed ever more intensely in a screen-mediated environment, some psychologists have described a new “generational divide in cognitive modes”31—even if many “digital natives” maintain regular, friendly communication with their parents.

Increasingly, schools at all levels are contributing to the deepening of these trends. Until recently, schools served to establish some balance in the lives of students by requiring them to work with the help of textbooks, general-purpose books, notebooks, and other “real” instructional materials. With the proliferation of faster and uninterrupted internet access, laptops and tablet computers handed out by schools, instructional video games, etc., the balance between screen time and all other activities has tilted even further toward screens. Those countless hours spent staring into a screen entail a mostly sedentary, indoor-bound lifestyle, unhealthy snacking, insufficient sleep, and other unhealthy habits—a form of “co-morbidity” which can hardly be beneficial.

Once children are hooked on this constant stream of self-stimulation, books or even the “real world” are bound to hold a lot less attraction and to become mostly a source of boredom.32 Even when forced to read from a real page (or, increasingly, from a screen), many adolescents struggle to achieve the deep-reading immersion which is essential for learning and brain development. Alas, this is a syndrome which for the majority of students is likely to continue into college and beyond.

As Martha Herbert and others have argued, all these potentially harmful influences can add up and trigger unhealthy neurophysiological adaptations.33 In some cases, these adaptations will be expressed in the form of recognizable psychiatric symptoms, specific learning disabilities, or other “disorders.” But they can also be diffuse, and associated with disruptions in learning and neurosomatic maturation even in the absence of a clear psychiatric diagnosis. If these difficulties prevent children from developing the ability and desire (or at least willingness) to read, more complex forms of thinking, and a capacity to navigate their dizzying social and technological environment, then the learning and overall developmental “outcomes” of education are likely to disappoint. Over a decade ago, a Canadian study found that fewer than half of the sixth- and ninth-graders who participated appeared developmentally mature for their age.34

Making these observations, I could easily be accused of repeating age-old complaints about the young, and blaming them for their perceived failings. As I look at the political and social clouds thickening around the globe, I keep wondering if the older concerns were entirely misplaced. To the extent they were, my fear is that this time it is, indeed, different—and we may be at a civilizational breaking point. A point where the long-term transformation sociologist Norbert Elias dubbed the “civilizing process” (the progressive internalization of appropriate social inhibitions accompanying the rise of the nation state and the market economy)35 is increasingly unhinged at the neurophysiological level.36

Of course, these troubling trends will be shrugged off by observers who, like Virginia Heffernan, Hanna Rosin, or Steven Pinker, are not easily disturbed by any cultural permutations. If, however, the shifts in aptitudes and sensibilities I have described are to be taken seriously, children and young people can hardly be blamed for them. Young brains and bodies need to adapt to an environment that requires and fosters new skills at the expense of others. We would hardly blame penguins for losing their ability to fly while developing other capacities in order to survive in their harsh Antarctic habitat. Children and adolescents have found themselves in an analogous predicament affecting mostly their mental—but also physical—aptitudes.

If anyone carries responsibility for the adaptation I have described, it is “we,” the older generations. In Bernard Stiegler’s harsh judgment, we have abandoned our obligation to take proper care of the young. Yet the French philosopher does not ultimately blame the older generations themselves. In his view, they now inhabit “a society that has become structurally incapable of educating its children”—as it has been reshaped by larger socioeconomic and technological forces.37

This conclusion reflects a more holistic and—by extension—more fatalistic outlook.38 It will ring hollow to readers and various experts with more analytic and optimistic predispositions. Unlike such critics, I have few illusions that the perfect storm of unhealthy influences and activities I have described could somehow be extinguished or curtailed. I do recognize that my own typically Bulgarian mindset may be overly apprehensive. But I suspect chronic optimism has its own pitfalls. Such an upbeat attitude can take you to the Moon and to victory in major conflicts like World War II and the Cold War. Incidentally, it can also get you into Vietnam, Iraq, and Afghanistan—or into a spiral of economic and technological innovation with much unforeseen socioeconomic, political, and neurosomatic fallout. Indeed, Cassandra was right. As were the mongers of fear and moral panic who later warned that Rome could be sacked.

In any case, the self-stimulation made possible by digital devices and screens of various sizes seems just too attractive for children and young people, and even for many of the “digital immigrants” charged with their upbringing and education (in the broadest sense). Adults and adult institutions have in effect lost much of the authority and will to force or induce children, adolescents, and college students to persevere with learning activities that are not immediately rewarding or stimulus-driven. If parents, educators, children, adolescents, and young adults do make an effort to resist the pull of “virtual reality,” they are pitted against powerful commercial interests and a technological crusade aimed at turning all “users” into digital junkies.39 As I already noted, these forces are now invading schools, education more generally, and life at all levels, without facing broad resistance. Moreover, they are frequently welcomed as the harbingers of forms of learning and student engagement best suited to the 21st century (despite the lingering concerns of many teachers).

As I see it, a public policy response to the current crisis in learning, reading, and neurosomatic development would require that the capitalist and technological juggernaut which feeds on existential burnout and a meta-addiction to overstimulation be defanged and tied down. But it can hardly be, barring some larger technological, social or environmental catastrophe—at which point the time for effective collective action may already have passed. Hopefully, deep reading, engaged learning, interest in larger social issues, and more sophisticated forms of thinking will not be extinguished as predicted by Idiocracy, the provocative dystopian movie released a decade ago. But these are likely to become niche preoccupations practiced by a relatively small reading elite mentally detached from the social mainstream.

As Birkerts warned over two decades ago, “the overall situation is bleak and getting bleaker.”40 He made this grim observation with reference to the kind of reading he cherishes, but it can also be applied to education in the broadest sense. With large-scale social institutions around the world facing a general crisis, it is hard to imagine how education can become the one effective island in a largely dysfunctional sea. As German sociologist Ulrich Beck has noted, we now live in an age when individuals are impelled to seek “biographical solutions” to deep systemic problems.41 Or when, as a mock Soviet slogan proclaimed back in the 1920s, “the rescue of the drowning lies in the hands of the drowning themselves.”42

What then can “we” (as parents, caregivers, and educators), as well as young people, do in order to stay sane and push back against the overwhelming social and technological forces I have described? The first step would be to grasp the predicament we all, and our children in particular, face. Unfortunately, the neurosomatic adaptations our socio-technological milieu demands make this an almost impossible, and largely unwanted, task. What seems true, real, or significant depends largely on the degree to which emotional and visceral attunement infuses higher-order reasoning. And it is this existential involvement with the larger social world and “significant others” that seems most severely weakened as we speak, giving way to a technologically induced altered state of consciousness.

What would I advise readers who can still take my broodings seriously, or sense in them a (perhaps stronger) echo of their own doubts and struggles? I would first call on them to show all the determination they can muster and, if they have not already made some steps in this direction, stop spending most of their waking hours immersed in digital imagery and information—so they can break out of the vicious circle of virtualizing their own existence. This would also free them to invest their relationships with the children and young people in their care with sufficient warmth. The rapid rise of “social media” has made maintaining this essential emotional bond much harder but even more imperative—a point made eloquently by Dr. Gabor Maté and developmental psychologist Gordon Neufeld.43

As I know from personal experience, this is easier said than done. I am someone who does get an easy high from reading an impressive feature article or non-fiction book. I am also keenly aware of, and spend much time ruminating on, the dangers I have described. Yet I have struggled to achieve in my own life a semblance of the balance I am advocating. Fortunately, my wife and I are blessed with a daughter who is extremely impressionable and eager to love us back even when we fail to give her the attention she seeks. She is also easily excited by seemingly trivial aspects of the “real world”44; is strongly attached to her grandparents; seeks out a limited number of deep friendships; and has been an exceptional student at a demanding high school (praised not just for her academic achievements, but also for her radiant presence in the classroom and willingness to help others). Gali is not glued to her phone texting, has not begged for an internet connection on it or for a tablet, and in the summer of 2014 willingly spent over a month (in several installments) without any internet access. Yet she also finds it difficult to follow consistently my unrelenting admonitions to limit the time she spends in front of her laptop (mostly searching for information on various issues related—or often unrelated—to school assignments and planning her college education).

In my own teaching, I try to structure courses like The Matrix, providing a mixture of intellectual and more immediate stimulation. I use a mix of articles and book chapters—some academic, some potentially more engaging (for example, Michael Lewis’s reports on the financial crisis—one of which provided the material for the movie The Big Short45), some in-between (like essays from Foreign Affairs, Foreign Policy, The Atlantic, and other high-brow publications). I usually select texts which address larger issues related to structural or cultural trends that are often concealed behind the coverage of current events and developments—for example, changing understandings of freedom and power in the writings of major thinkers, or various implications of the rise of the market economy since the late 18th century. I also introduce some narrower concepts (for example, different explanations for extreme nationalism and related atrocities), and use bare-bones PowerPoint outlines to provide a degree of structure to classes and the whole course.

While I dearly hope such intellectual exploration can be sufficiently stimulating for some of my more curious students, I also include in my class sessions a few “action” sequences. I still show some brief video segments or distribute short articles and other handouts we can read and discuss in class. These are intended to provide more vivid illustrations or interpretations of the larger issues addressed in assigned readings, and to offer an opportunity for more engaging activities—particularly for students who would otherwise be bored. I also ask students to write frequently, from one-paragraph responses they compose individually or in small groups in class, to longer integrative papers. Luckily, few students use laptops in the classroom, so rather than imposing an outright ban I simply banish those laptops to the back of the room, where they are less likely to distract others.

While I do make an effort to expose students to this learning mix, I still believe that the road to more sophisticated thinking and better writing passes mostly through “deep reading” and the neurophysiological maturation it spurs. In addition to selecting sufficiently complex yet emotionally engaging texts meant to facilitate this process, I also try to alert students to the indispensability of “real” reading, and the need to limit the time they spend online. More generally, I prod them to recognize the challenges of dealing with the information overload we all face. My overall goal is to encourage and help students attune themselves to gradually increasing levels of intellectual and moral complexity. I also try to make evident the benefits of further intellectual growth by exposing them to more sophisticated forms of thinking and making connections, sometimes between seemingly unrelated issues—as demonstrated in readings and comments from more intellectually advanced students (and sometimes from me).

Developing such an ability to think in context, relate concrete facts and developments to broader concepts and trends, and connect at a personal level to larger social issues is a sine qua non for achieving the goals of liberal education. Without such an overall aptitude, the knowledge students acquire can hardly become cumulative and “transfer” to new problems. Nor can they develop a more holistic understanding of the world and a capacity for lifelong, self-directed learning. Since I believe these abilities are primarily a matter of neural maturation and the more intricate forms of intellectual and emotional processing associated with it, I try to introduce students to some basics of brain development. I also give them tips for brain and physical fitness. I try to discuss these problems and offer illustrations mostly in good humor, as I am wary of offending students by implying that they are, as Mark Bauerlein put it in the provocative title of his book, just “dumb.”46 I try to frame issues related to brain development as “areas for improvement,” and to emphasize the potential benefits of the mental gymnastics students can do in and outside of class. I also try to reach out to individual students who show curiosity, intellectual humility, and potential as earnest readers.

Despite all my efforts, too many of my students remain skeptical and aloof as I try to take them on this mental tour. Ultimately, their success as lifelong learners will depend on their own willingness and ability to muster sufficient intellectual and emotional resources to move ahead on their own. I do wish them much luck, since in the dizzying world we inhabit this has become an almost Herculean task. As I noted at the outset, it appears that many of my students are facing long odds—and they hardly seem exceptional in this respect. Personally, I am very much opposed to efforts to “disrupt” and “unbundle” the liberal arts model of higher education. Yet it may well turn out that the majority of college students cannot fully benefit from this model and develop the holistic understanding and critical stance it is intended to foster.

My deeper worry is that we all—as individuals, parents, teachers, scholars, students, workers, entrepreneurs, leaders, etc.—have reached a point where we have become too jaded, even delusional, as a result of the information and existential overload we suffer.47 So we cannot collectively recognize—with sufficient clarity and urgency—the neurophysiological crisis we are facing, to say nothing of mustering the almost inhuman determination and human commitment needed to deal with it.

This is the recurring nightmare I have been unable to get out of my mind. It evokes a form of “technocracy” which, in the words of Scottish information society theorist Alistair Duff, should not be interpreted merely as “the rule of experts.” Rather, it is a regime that involves “the rule of information technology, the domination of information technology over human beings, and the subordination of people to a technological imperative.”48 The kind of “technocracy” Duff describes seems to shape too many of us into restless cogs whose labor and life energy can be almost constantly tapped, and who are engrossed in abundant forms of self-stimulation. We are thus helping the technocratic system in which we are caught extend its tickling tentacles into every crevice of our social and personal lives—while often dreaming dreams of freedom and self-empowerment. This is a role for which “the best and the brightest,” or the cognitive 1%, are richly rewarded—until one day even their input is perhaps rendered obsolete.49

Over eight decades ago, Aldous Huxley dreamed up a most frightening future society. In it, human fetuses are manipulated prior to their artificial “hatching” in order to stunt the natural development of some and thus create castes of individuals fitted for different social roles.50 They live lives devoid of strong attachments or emotions, saturated with opportunities to instantly gratify various desires—so these will not be pent up to the point of bursting and causing social disruption. Suspended in this milieu, most individuals feel free and happy—unlike previous generations or the recalcitrant “savages” on the edge of “civilization.”

In my nightmare scenario, the socio-technological “matrix” we inhabit has developed a more subtle way of shaping our proclivities, abilities, and attitudes. It achieves this by providing sources of chronic stimulation, tapping the exceptional plasticity of the human brain, and provoking longer-term neurophysiological modification. As English professor Edward Mendelson notes, “Virginia Woolf’s serious joke that ‘on or about December 1910 human character changed’ was a hundred years premature. Human character changed on or about December 2010, when everyone, it seemed, started carrying a smartphone.”51 This was perhaps the tipping point in a longer-term process of adaptation to social and sensory overstimulation. As a result of this adjustment, we have progressively turned into “mental penguins”—beings who have shed some deeply human qualities in order to acquire a more adaptive yet narrower “skill set.”

Huxley Was Right?

Over three decades ago, Neil Postman was nagged by a troubling thought—“the possibility that Huxley, not Orwell, was right”52 in describing the totalitarianism of the future. He introduced his book, Amusing Ourselves to Death, with a premonition that ran counter to the spirit of Ronald Reagan’s “morning in America”:

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy.

As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions.” In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.53

If only Huxley had realized that what we crave most may be information as such.54 In any case, since the time of his and Postman’s dire warnings, the information-carrying caravan has only picked up speed.

I am under no illusion that what I write can make the smallest dent in the larger logic of the zeitgeist. The musings of skeptics like Neil Postman, Jane Healy, Sven Birkerts, Nicholas Carr, or Susan Greenfield, or more balanced coverage of our digitally enhanced mental environment (for example, The New York Times series “Your Brain on Computers,” published in 2010), have obviously failed to achieve this. So I can hardly hope to do better, or even to be part of some larger awareness-raising effort.

As Gabor Maté has noted, childhood development problems and compulsive behaviors are hardly a matter of personal or parental failure. Rather, they reflect “a social and cultural breakdown of cataclysmic proportions.”55 Alas, I cannot imagine a set of institutional or technological adjustments that can reverse this seismic shift. The only realistic form of resistance I envision is an effort to create small sanctuaries for ourselves and our loved ones, and to reach out from within these to a few kindred souls. It is particularly important to reach out, before it is too late, to the young learners in our care (in the broadest sense of these words)—and help them grow as best we can, so they can develop a fuller grasp of the “matrix” they inhabit—and of their own unique mix of strengths and vulnerabilities.

My more upbeat critics will no doubt dismiss all these premonitions as the flawed arguments of an uninformed amateur or the alarmist delusions of a troubled mind—or both. They will point out that worries about new technologies have always been proven wrong—as humankind has adapted by developing new aptitudes and leaving behind some outdated ones. It may appear that even real penguins in Antarctica do not quite miss the stereotypical bird traits they have lost. Until perhaps a giant iceberg cuts them off from the ocean,56 or global warming threatens their food supply.57 For our children’s sake, I hope against hope that the optimists are at least partly right.58