1

A Brief History of Distraction

OUR FEARS ABOUT the distractibility of students exist within the context of larger social fears about increasing distractibility in the digital era. With screens available everywhere now, even on watches and glasses, we wonder whether our devices are doing permanent damage to our attention. We tend to express that fear most vocally in relation to our children and students, perhaps because we see them so frequently interacting with their screens in ways that seem trivial to us: texting one another, watching videos, tracking celebrity social media feeds. But most adults I know also fear, or at least wonder about, the extent to which distraction might harm our attention. A couple of years ago, while visiting another campus, I had dinner with an administrator in his mid-sixties, and the topic of attention and distraction arose. “I can’t seem to pay attention anymore like I used to,” he said, shaking his head sadly. “I can barely get through a few pages of a novel before I find myself checking my phone.” The resignation in his voice struck a chord with me, as it seemed to capture well the dominant tone of our current conversations about distraction and attention. Something has been lost, and we regret it; we know the identity of the culprit, but we are helpless in the face of its power. Two assumptions buried within that administrator’s statement underpin this story of attention lost: the first is that we once had a greater ability to pay attention; the second is that our devices are to blame.

Before we turn specifically to the challenges of attention in education, then, it would be useful to put our assumptions about attention and distraction into a broader historical context. As we shall see, rumors about the recent death of attention have been greatly exaggerated. We can trace concerns about our distractible minds just about as far back as we can trace the development of written philosophical, religious, and literary texts. The history of human reflection on the problem of attention encompasses writing from both Western and Eastern traditions; works of ancient philosophy and sacred scripture; novels, poetry, and nonfiction; instruction manuals for polite behavior; and screeds against new technologies. We need to acknowledge this history before we consider how to address our contemporary educational concerns, because it can shape both the intensity and nature of our response. When we recognize the extent to which anxiety about distraction has a long history, we can dial back the sense of panic that infects much of today’s discourse about students and their short attention spans. But we can also acknowledge that today’s technologies are presenting some particularly complex new challenges to our students—and to ourselves.

Wandering Minds and Fly-Catching Lizards

Considerations of the problem of distraction date at least back to Greek and Roman antiquity. In the Ethics, Aristotle argued that distraction arises from a clash between activities which are more and less pleasant to us. “People who are passionately devoted to the flute,” he explains, “are unable to pay attention to arguments if they hear someone playing a flute, since they enjoy the flute-playing more than the activity that presently occupies them.” Thus we are distracted away from challenging tasks (such as attending to arguments, paying attention in class, or grading papers) when we encounter the prospect of something more pleasing (such as listening to music, chatting with a friend, or checking Twitter). In a following passage, Aristotle explains that the problem can arise not only from the potentially pleasant nature of the distraction, but also from the unpleasant or boring nature of the experience we are having: “When we are [only] mildly pleased with things of one sort, we do things of other sorts; for instance, people who eat snacks in theaters do this most when the actors are bad.”1 In this statement we get not only a description of Aristotle’s theory of distraction, but also a glimpse of the challenges that ancient theaters faced in the form of their distracted audiences (a problem to which we will return in greater detail in Chapter Five). But we see even in these earliest writings about distraction the dual nature of its power over us: we can be pushed toward distraction by unpleasant or difficult experiences (listening to arguments, watching bad theater) and pulled toward it by the prospect of something more pleasing (listening to flutes, eating snacks).

The Latin theologian Augustine of Hippo, who penned the Confessions in the late fourth century, offered less of an analysis and more of a lamentation of the distractions that beset our mind. For Augustine, as for many religious writers, the proper focal point should be God; he was thus disturbed by the inability of the human mind to stay focused on prayers. In the Confessions, he bemoans the ways in which his contemplation of God and truth could be broken by the smallest and most everyday distractions:

In the same chapter of the Confessions, Augustine explains that he can try to structure his life in order to minimize distractions: for example, he can avoid going to the circus to watch dogs chasing rabbits. But even his best intentions and efforts won’t stop his attention from being diverted if he sees a dog chasing a rabbit while he’s out taking a walk. No matter how we try to corral our disobedient attention toward higher matters, it will defy us in the end.

We find these same laments about our distractible minds in the religious and wisdom traditions of other cultures. Huston Smith and Philip Novak explain in Buddhism: A Concise Introduction that the core insight of the Buddha was the impermanence of all existence; our failure to reconcile ourselves to this impermanence represents the source of our suffering. That impermanence applies not only to the physical objects of this world that come and go, but to the thoughts and emotions that spin incessantly through our mind. In Smith and Novak’s summary of the insights of the Buddha, “Every mental and physical state is in flux; none is solid or enduring… We have little control over our mental states and our physical sensations, and normally little awareness of them.”3 Practitioners of Buddhism begin their journey toward a more mindful existence by acknowledging what the Buddhist tradition often refers to as the “monkey mind”—jumping, swinging, and howling in unceasing motion. That monkey mind, subject to constant distraction and wanderings, represents the starting point for the long and arduous journey toward enlightenment.

From the monkey mind to the blissful state of enlightenment, from fly-catching lizards to contemplation of God: the ancient texts of attention and distraction put the two terms in a clear hierarchy, with attention above and distraction below. Attention enables us to contemplate the truth, to focus on what matters, to achieve peace and wisdom. Distraction scatters the mind, deters us from right thinking and behavior, and brings unhappiness. We speak about attention and distraction in these same terms today, as we admonish our children and students to pay attention at the dinner table or in the classroom. We feel Augustine’s unhappiness and irritation when we are distracted by trivial matters—pulled in many directions, true to the Latin roots of the word dis-traction, which means to drag something apart. We chide ourselves for the hours we spend on our phones, knowing that we might have spent that time in more productive ways: in study or scholarship, in exercise or prayer, in conversation with our children, parents, or friends.

These ancient texts establish another important binary, though—one that is essential to understanding our contemporary concerns about distraction. We see in the range of early writing about attention that we are subject to both internal and external distractions. For the Buddhist, the source of the problems lies within us. Minds jump and wander, disobey and distract; no matter how hard we work to harness and sustain our attention, it slips from our grasp. But Aristotle and Augustine point out that our susceptible minds are also drawn from their focus by external forces. Flute players, lizards, and spiders intrude on our attention and pull it away from the places we wish it to be. The English poet and cleric John Donne, writing in the early seventeenth century, describes in a funeral sermon the way that his efforts at attention to prayer are beset by these twin demons of internal and external distraction:

Given all of the distractions that Donne describes—from the noise of a fly to the straw beneath our legs (external distractions), from our memories of yesterday to our fears for tomorrow (internal distractions)—it begins to seem miraculous that we are ever capable of sustained attention of any kind.

Theorists of attention will continue to express concern about both internal and external causes of distraction right up to the present day, but in eighteenth-century Europe a new formulation of the relationship between internal and external distraction emerges, one that raises a scary prospect.

Coffeehouse to Computer Screen: The Destructive Power of Distraction

That lovely hot beverage that you might be enjoying as you read these words, the source of so much energy and delight among students and teachers alike, first became popular in the Ottoman Empire in the sixteenth century. It was discovered by Europeans through both travel and trade, and was promoted for its stimulating qualities and for its many reputed health benefits. The first coffeehouse in England was opened in 1650 at Oxford University, thus initiating a long and still vibrant association between learning and coffee. The first coffee shop in London opened just two years later, and then coffeehouses spread with startling speed throughout the English capital. Fifty years later, there were more than two thousand coffeehouses in London. They became sites of leisure, political discussion, and commercial work. The insurer Lloyd’s of London originated in a coffee shop, as did an early version of the London Stock Exchange. These buzzing sites of social, commercial, and political interaction drew people together in major cities across England and beyond, including Vienna, Paris, and Amsterdam.5

The manic atmosphere in the coffeehouses created energy and excitement, but at the same time drew the concern of intellectuals who feared that it was interfering with the ability of coffeehouse patrons to put their heads down and focus on serious work. Tom Standage, author of Writing on the Wall: Social Media—the First 2,000 Years, cites the words of an Oxford don in 1677 who argued that “solid and serious learning” was declining as a result of people wasting their days in coffeehouses.6 A lawyer from Cambridge had made the connection between coffeehouses and a decline in focus slightly more specific a few years earlier: “Who can apply close to a Subject with his Head full of the Din of a Coffee-house?”7 The rise of the coffeehouse—and the concerns raised by these scholars about its role as a new source of distraction—echoes the fears of Augustine and John Donne about the ways in which external distractions can intrude on thinking.

In eighteenth-century discussions of the coffeehouse, a newfound focus on the problem of distraction raised a dreadful prospect: sustained exposure to external distractions can degrade our internal capacities for attention. Isaac Watts was an English clergyman and writer of Christian hymns, including that Christmastime classic “Joy to the World.” In 1741, he published a self-help book for freethinking folk called The Improvement of the Mind, and in it he cautioned readers that spending too much time in distraction-filled environments would eventually create an easily distractible mind:

The use of the word “habit” in this passage reflects a fear that what we have previously acknowledged as an unhappy feature of the mind—its tendency to wander away in spite of our wishes—can be exacerbated or made more permanent by too much time spent in the company of external distractions. In other words, the more time you spend being distracted, the more you will become an easily distractible person.

The formulation that we see in Watts’s book has become the dominant way of theorizing the problem of distraction, right down to the present day. Writers will no doubt continue to reflect upon and lament the internal wanderings of their own minds, echoing eighteenth-century writer Samuel Johnson’s remark that “with or without our consent… the mind will break, from confinement to its stated task, into sudden excursions.”9 But this problem seems much less pressing to writers as we progress into the twentieth century and beyond. The locus of concern rests more and more squarely, with each passing century, on the newly developed technologies or media that steal our attention, chip away at our cognitive powers, and destroy our ability to pay attention to one another. A 1906 cartoon from the British magazine Punch depicts two well-dressed Edwardians seated under a tree, facing away from each other, each staring at a telegraph receiver on their lap. “These two figures,” the caption reads, “are not communicating with one another. The lady receives an amatory message, and the gentleman some racing results.” The figures, mesmerized by their quaint devices, bear an uncanny resemblance to students today, sitting under a shady tree on the quad, transfixed by the phones in their hands.10

The specific fear of external technologies diminishing attention capacities arises again with the arrival of the radio. As this new device found its way into people’s homes in the 1920s and ’30s, according to historians Luke Fernandez and Susan J. Matt, “many grappled with the meaning and place of the radio in their mental lives. Could they take in all it had to offer without sacrificing other mental powers? Was it a force of enlightenment or a source of distraction and dissipation?” A diary writer from the time notes that she “spent a stupid and useless morning at home did not even get the papers read. The radio interferes with my intellectual life very much.”11 Note the connection here again from short- to long-term: the concern is less about the “stupid and useless morning” than it is about the ability of such mornings to destroy her “intellectual life.” The more she listens to the radio, with its short bursts of entertainment, the less she can pursue an intellectual life of sustained thinking.

Both radio and its successor, television, raised special concerns about their impact on young people and their developing brains. In the late twentieth and early twenty-first centuries, those concerns included a special focus on the ways in which screen exposure could negatively impact the attention spans of young people. In 2004, a research study appearing in the journal Pediatrics presented data showing a correlation between high rates of television viewing among very young children and attentional problems of the kind associated with attention deficit hyperactivity disorder (ADHD) at age seven.12 “This study,” explained the lead author in a news release, “suggests that there is a significant and important association between early exposure to television and subsequent attentional problems.”13 Two years later, the same journal published a second study calling these findings into question. Using a different set of research tools to approach the issue, a pair of researchers found that “effect sizes for the relationship between television exposure and symptoms of ADHD were close to zero and not statistically significant.”14 But the media had jumped on the original study and broadcast its findings widely, cementing the proposition in the minds of the public that too much attention to screens could cause permanent damage to the attention spans of children.

And who could forget the ongoing panic about whether video games are destroying our attention spans? Michael Z. Newman documents the ways in which the rise of video-game culture in the 1980s “prompted educators, psychotherapists, local government officeholders and media commentators to warn that young players were likely to suffer serious negative effects.”15 Those negative effects of course included damage to the ability to pay attention. As recently as 2010, news outlets were covering an alarming study, also published in Pediatrics, that purported to show that video games could be destroying the attention spans of children.16 “Playing video games may make it harder for some children to pay attention in school, a new study suggests,” reads the opening sentence of a Canadian news report on the study.17 What the study actually revealed was that children who spent more than two hours per day on video games and television were more likely to be flagged by teachers as having attention problems. The caveats were buried in the article: there was no way to determine whether the video games were causing the attention-span problems, and in fact the causality could just as easily have run in the other direction—it could have been the case that children with attention problems were more likely to gravitate toward video games. Or perhaps there was no meaningful connection at all—the real culprit might have been the fact that children were gaming beyond their bedtimes, and lack of sleep was creating the attention problems reported by the teachers.

Contemporary Concerns

With all of this historical context in mind, we can now take a fresh look at the arguments being made by contemporary critics of distraction, like Nicholas Carr. Carr’s best-selling book The Shallows: What the Internet Is Doing to Our Brains, a finalist for the Pulitzer Prize in 2011, points the finger at the internet and the devices that enable it: computers, tablets, smartphones. But the underlying logic is the same as the one articulated by Isaac Watts: too much time spent in the company of distractions ultimately changes who we are—for the worse. Carr posits the existence of a “linear mind,” one that can proceed logically and deliberately from one thought to the next, that is being slowly supplanted by a more scattered version of itself: “Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts—the faster, the better.”18 Like previous writers, Carr posits that this change occurs because of the way external forces—coffee shops, telegraphs, radios, television screens—continually draw our attention in multiple, divided ways. “The division of attention demanded by multimedia,” Carr writes, “strains our cognitive abilities, diminishing our learning and weakening our understanding.”19 Carr tells a story of innocence lost, or perhaps innocence deliberately destroyed. When our brains were living in simpler times, they worked in simpler ways: slower, more focused, more attentive. Now that they are living in highly distracting times, they are becoming highly distractible organs: shallow, surface oriented, in constant search of novelty and distraction.

Carr’s presentation of this argument seems more convincing than those previous iterations because it incorporates a well-established principle of neuroscience: neuroplasticity, or the ability of our brains to adapt, grow, and evolve throughout our lifespan. We have plenty of evidence that both external circumstances and internal mental activity can alter the landscape of our brains in substantive ways. When the brain encounters a situation over and over again, and responds to it in the same way, it strengthens the neural pathways involved, whether they are newly formed or slight modifications of existing ones. This is, of course, what happens when we learn something deeply. We encounter a new piece of knowledge or develop a new skill, rehearse it repeatedly in different contexts, and then that fact or skill takes its place within our existing mental structure. Casual descriptions of this process will often say that our brains have been “rewired,” which can be a helpful but also misleading metaphor. It implies that once we have created or reinforced a neural pathway, it remains fixed in place. But, as Carr rightly points out, “our brains are always in flux, adapting to even small shifts in our circumstances and behavior.”20 That flux occurs in response to myriad circumstances: when we learn something deliberately, when we experience strong emotions that are seared into our memories, and when we are first navigating our way through an unfamiliar context.

As Carr argues, such flux also occurs when we engage repeatedly in behaviors that are shaped and conditioned by technological devices. So, the argument goes, if we are repeatedly swiping from one thing to the next on our phone, never pausing to sit and read a long-form article or a book, or never sitting quietly with our thoughts and reflecting on the meaning of life, we are changing our brains into organs that have lost the ability to pay sustained attention. As Carr writes in the conclusion to The Shallows, the shallow but endlessly shifting surface of the internet, made concrete and always accessible in the form of our phones, “reroutes our vital paths and diminishes our capacity for contemplation.” Worse still, “it is altering the depth of our emotions as well as our thoughts.” To be fair to Carr, he does acknowledge, as our history of distraction might suggest, that “the natural state of the human brain… is one of distractedness. Our predisposition is to shift our gaze, and hence our attention, from one object to another, to be aware of as much of what’s going on around us as possible.”21 His point is that our phones intensify and solidify this predisposition of our brains toward distraction, whereas many of the best parts of ourselves as humans—our grand achievements in art and architecture, politics and conversation, religion and community—have sprung from our ability to focus, attend, and contemplate.

The arguments made by Carr and others who tell this story of attention dispossessed by technology look much less convincing when they are placed within the larger historical narrative of concerns about this issue. The history of distraction shows us that we have never lived in some prelapsarian state of attentional grace, in which we focused effortlessly with our calm, linear minds. For one thing, brain cells connect in rich networks rather than in clean lines. But more important, we learn from these historical voices that we have always been distracted. The difference between us and our nineteenth-century cousins is not that our attentional capacities have somehow been permanently diminished, as Carr would have it, but that the people and devices that seek our attention have become better at soliciting it from us. If you feel yourself more distracted than you used to be, or than you would like to be, that feeling may well reflect reality. The pull of our digital distractions is very strong today. You find yourself on social media at times when you could be grading papers or attending to your child. Perhaps you have taken social media fasts, or created digital-free times or zones in your life, and failed to abide by them. You might also perceive significant changes in your students or children, who seem more distracted than they used to be. All of those experiences and observations might still have you wondering about the extent to which our modern distractions might be doing some kind of permanent damage to our brains or fundamentally changing who we are as humans.

I can conclude this chapter with the hopeful note that the current reporting from people who study the brain, and especially attention and the brain, is that we don’t yet have any conclusive evidence to support the notion that human attention has suffered some architectural diminishment in today’s technological era. Just as we did with television and video games, we can ask people about their smartphone use and then test their attentional capacities. If we see high smartphone use and low attention spans, we can draw the conclusion that smartphones have diminished their attention. But we can just as plausibly draw the conclusion that people with shorter attention spans love to use their smartphones (and watch television and play video games). In 2019, the news and culture site Vox posed to multiple scientists the following question: “How is our constant use of digital technologies affecting our brain health?” For the most part, while the scientists all acknowledged the short-term impact that technologies can have on our brains (for example, multitasking can interfere with the effectiveness of studying or learning in the classroom), none of them were willing to assert conclusions about a permanent diminishment of our brains. Stanford psychologist Anthony Wagner explains, with respect to the relationship between multitasking and working memory,

Other respondents to the question argued that we can’t draw such conclusions because the impact of digital technologies on the brain might depend on how we use them. University of Wisconsin’s Heather Kirkorian points out that we should not be collapsing “video-chatting with a grandparent versus watching an educational TV show versus playing a violent video game versus using a finger-painting app.”22 The implications of Kirkorian’s argument are clear: we should not be arguing that digital technologies in general have diminished our brain capacity without acknowledging that we interact with them in hundreds of different ways, each of which may have a differing impact on our brain.

Cognitive scientist Daniel Willingham, who writes frequently to debunk myths about education and to promote research-based perspectives, comes to a similar conclusion in the New York Times. “Although mental tasks can change our brains,” he writes, “the impact is usually modest… Attention is so central to our ability to think that a significant deterioration would require a retrofitting of other cognitive functions. Mental reorganization at that scale happens over evolutionary time, not because you got a smartphone.”23 Our attention systems, in other words, work in coordination with many other parts of the brain. It would not be possible, Willingham argues, for a system so fundamental to suffer degradation without seeing impacts in many other areas, or without other areas changing dramatically to compensate for the losses. His argument also notes that our attention systems evolved over hundreds of thousands of years; it would be almost miraculous to believe that they could change radically within the space of a generation. A comprehensive review of the state of the research investigating the relationship between technology and attention, published in 2017, stated that “while there is clear evidence that engagement with smart devices can have an acute impact on ongoing cognitive tasks, the evidence on any long-term impacts of smartphone-related habits on attentional functioning is quite thin, and somewhat equivocal.”24 In other words, while your phone use certainly can distract you from the pursuit of your goals (thus having an “acute impact”), the evidence for “long-term impacts” in the form of a permanently degraded attention span, in individuals or in the human species as a whole, just isn’t there at this time.25

My review of this research has led me to the conclusion that the brains of your students, like the brain inside your own skull, remain capable of the kind of sustained attention that leads to learning. How to invite and support that sustained attention will be the work of much of this book. But before the light at the end of this tunnel can begin to appear, we have to move from the general considerations of attention and distraction to the specific environment about which every reader of this book likely cares, and which seems especially besieged by distractions these days: the classroom.
