What is the reason to cultivate and devote one’s single-minded attention? Is this kind of attention even still a possibility? Was it ever? In the years after Adderall, these were the questions I often thought about.
I approached from all angles. Walking the loop in Prospect Park, I listened to attention self-help books through my headphones, books such as Deep Work by Cal Newport and Hyperfocus by Chris Bailey. I was listening not in order to help myself (or so I believed), but, rather, to get a sense of the latest advice, and the language in which attention was now commodified. Bailey, speaking in existentially unruffled tones, offered many useful suggestions: Leave your phone in the other room when you need to get work done. Drink more caffeine. “We are what we pay attention to,” he reminded us. Then he said something that surprised me: “Letting your attentional space overflow affects your memory.”
Indeed, I soon discovered that this is a classic finding of memory research, known for decades: distraction breeds forgetting. To say it another way, the way the neuroscientists say it, interrupting someone’s attention by introducing a “secondary task” (responding to a text message, for example) means this person will not “encode” their present circumstance in all the rich, associative detail necessary for a memory to form and hang around awhile. Attention, it turns out, does not concern only our present circumstance. It bears directly on both our past and our future. What will fail to make it into my memory bank because I’m too busy scanning headlines and replying to text messages to pay attention to my life? And yet, even in the midst of that very train of thought, I go ahead and pull my phone out of my pocket, for no particular reason.
That’s how it is. We have entered a situation in which the gadgets we carry around with us—and the cognitive rhythm they dictate—are pitted against the possibility of deep engagement, or thorough “encoding.” They ask us to be anywhere but here, to live in any moment but now. What struck me was this: we treat such changes as inevitable, even while we lament them, seek antidotes and alternatives, enroll in meditation classes, digital detoxes, silent retreats. I wanted to understand why we choose to pixelate our own attention spans, then hungrily search for ways to patch ourselves back together.
I found that I was still asking such basic questions as: What do we mean when we talk about attention? Perhaps it was inevitable to ask such questions now, in our Silicon age, glued to our screens as we are, our attention in pieces, forever divided among the countless demands our devices make on it. In any event, these were the questions I found myself asking, found myself stuck with. In the years after Adderall, they became the quest I embarked upon.
In the beginning, I did not see how desperately personal this whole thing really was. After all, what is the question of attention really about, if not this: What is worth paying attention to? Hanging on to? What matters?
You would think that on my quest to comprehend the elusive force we call attention, I would have turned without further delay to that subject’s great philosopher king. William James, the nineteenth-century thinker, brother of the novelist Henry James, son of the theologian Henry James, Sr., godson of Ralph Waldo Emerson, wrote the seminal texts on attention that still, inevitably, garnish so much of the contemporary writing on the subject, despite being 130 years old.
It is true that I duly went to Amazon.com and ordered a copy of James’s The Principles of Psychology, volume one, knowing full well his place in the attention literature, even regularly quoting fragments of his famous statements on the subject, such as “Everyone knows what attention is. It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought…” Yet when the paperback version of his tome arrived at my door, I found myself unable even to open it.
At college, I might have swallowed a blue pill to attend to James’s thoughts on attention. I was without those blue pills now, and worse, much worse, I was a citizen of the year 2017, with all of its base distractions. How could my brain really hope to tangle with William James when it had been so thoroughly retexturized by the frequency with which I checked Instagram? (What was I even looking for there?) Not to mention the headlines that now reached me of their own volition, arriving on the screen of my phone at an unpredictable rate, leading me to other headlines, other articles, which I couldn’t even hope to finish, due to the urgent pull of fresh text messages and emails, or even the mere knowledge of the replies I owed, to say nothing of the instant messaging dispatches that often interrupted those emails, most of them typed by my brother somewhere in an office in Los Angeles.
By the time William James arrived on my stoop, therefore, it had become all too clear to me that my brain had not come through the Great Interruption unscathed. Apparently, it was no longer the relatively studious organ it had been with the pills, but really also without them, before them, back when I had not yet possessed an email address or heard the name Mark Zuckerberg. And so my copy of William James sat on my desk, its softback cover displaying his bearded countenance in sepia tones, alienating me even further from the realities of his day.
But I inched toward him, obliquely. Through biographies and essays, I began to rake in a miscellaneous collection of evocative details. I learned that as a young man, he’d had a nervous breakdown while traveling through Berlin, that his first book wasn’t published until he was forty-nine years old, that he experimented with nitrous oxide, that his father had a wooden leg and wrote long, unappreciated theological tracts and tried to bully his son into a career in the sciences, though William himself had wanted to be an artist and had studied painting before giving up and going to Harvard Medical School. Intriguing details, all of them. And, as well, this observation: “One finds James when one needs him.” So writes Jessa Crispin, in her book The Dead Ladies Project. And finally, I did. I needed him.
The first thing I discovered was that he understood, in his nineteenth-century way, my twenty-first-century plight. “Most people probably fall several times a day into a fit of something like this: The eyes are fixed on vacancy, the sounds of the world melt into confused unity, the attention is dispersed so that the whole body is felt, as it were, at once, and the foreground of consciousness is filled, if by anything, by a sort of solemn sense of surrender to the empty passing of time,” James wrote. “Every moment we expect the spell to break, for we know no reason why it should continue. But it does continue, pulse after pulse, and we float with it, until—also without reason that we can discover—an energy is given, something—we know not what—enables us to gather ourselves together, we wink our eyes, we shake our heads, the background-ideas become effective, and the wheels of life go round again.”
If this was not the most vivid and thorough description of my relationship to Instagram, I know not what could possibly be. The vacancy of my gaze and, more broadly, my mind as I passively surrendered to the scroll of images generated by other people’s lives was exactly as James had said: a spell, one I could not comprehend, least of all when I was deep inside it. For the fact was, this particular kind of capture brought none of the upsides that I experienced when absorbed in reading and writing, or swept up in a great conversation, to say nothing of music, running, sex—all pursuits largely if not entirely unmediated by the young gazillionaires of Silicon Valley. Instead of aliveness, such technology-induced spells left me feeling sad, inadequate, and, most of all, uninspired.
There are endless metaphors for attention, waxing and waning according to the moods of the day. In twentieth-century scientific research, attention is, we often hear, a “spotlight” or “searchlight,” whose beam we use to select one feature of our environment to focus on at a time. For James, and others of his era, a more compelling analogy came from water: James compared mental activity itself to a “stream of thought,” with attention as the single force capable of freezing the rushing tide. “Without selective interest,” he wrote, “experience is an utter chaos.”
It is worth emphasizing that the concept of selection has been part of the conversation since the modern study of attention began. Attention is by definition a trade-off. As the sociologist Émile Durkheim put it: “We are always to a certain extent in a state of distraction, since the attention, in concentrating the mind on a small number of objects, blinds it to a greater number of others.” So that, then, is the rub, the ongoing predicament. What to choose?
When William James enrolled at Harvard’s medical school in 1864, it was far from the elite institution it is today. In the 1860s, leeches were still on the curriculum: applied to the anus, the young doctors learned, they were the cure for liver problems. Five years before James got to Harvard, Darwin had published On the Origin of Species, and his theory of evolution was the subject of a great deal of debate on both sides of the Atlantic. Before entering Harvard’s medical school, James had apprenticed himself in the studio of William Morris Hunt, pursuing his first great love: painting. Later, perhaps thinking of his hours under Hunt’s auspices, he would write, “It is only the great passions, which, tearing us away from the seductions of indolence, endow us with that continuity of attention to which alone superiority of mind is attached.”
A kind of forced indolence was in fact often his fate: in his younger decades, especially, he was frequently unwell, as was his brother Henry. This was their great shared subject: their ailments. They sent each other endless letters across the Atlantic, Cambridge to Rome, William describing his dark moods and his faulty vision; Henry, his digestive woes. It wasn’t until the 1870s, late in the decade, that James stabilized enough to make his real entrance on the intellectual scene with his much-read ideas on consciousness, will, and attention.
Attention was not a new idea, of course. Nor was it primarily a Western one. One could begin virtually anywhere, and at any point in time, telling the history of attention: as meditation, as prayer. But to me, one of the most intriguing stories was that of the tiny cult of European naturalists who popped up in the eighteenth century and became entranced with their own powers of attention. They studied bugs, but it was no mere nine-to-five pursuit. The historian Lorraine Daston has chronicled their efforts in The Moral Authority of Nature: Take Charles Bonnet, naturalist from Geneva, who spent twenty-one consecutive days, from 5:30 in the morning to 11:00 at night, staring at a single aphid. As the weeks passed, he began referring to it as “mon puceron,” recording every painstaking detail in his journals. His French counterpart René Antoine Ferchault de Réaumur once spent fourteen hours counting the number of bees that left their hive. (Eighty-four thousand.)
Naturalists such as Bonnet justified their endless hours of attending to the wing of a fly or the guts of a worm with a kind of Aristotelian cry: “There be gods even here.” After all, these too were creatures of God, reflections of God’s “exquisite handiwork.” These men believed that with the force of their attention they could bestow on lowly insects something like divine worth, redeem them from their repulsive irrelevance. Yet where they had embarked on a kind of holy mission of focus and attention, the very act of attending changed their relationship with the object of study. “Originally motivated by piety, unwavering attention directed to humble objects became an end in itself, infusing them with aesthetic and sentimental value,” Daston writes. Bonnet was crushed when he lost track of his aphid one day. Réaumur, for his part, was so moved by the spectacle of the bees he’d been observing trying to save their half-drowned queen that he rewarded them with honey for their “good intentions.” “The naturalists came to regard their bees and aphids and even insects extracted from horses’ dung with wonder and affection,” Daston notes. As well, they derived enormous pleasure from the very act of paying close attention, noting that, absorbed in their observations, they could forget their own discomforts, their vertigo, their foot pain, the broiling summer heat.
Among themselves, the naturalists revered the ordealism of sitting for endless hours with probing eyes, but, within society at large, their behavior approached the taboo. “Too much attention paid to the wrong objects spoiled one for polite society as well as for the sober duties imposed by family, church and state,” Daston explains. Satirists made fun of them. Conversation manuals warned of the terrible rudeness of “ostentatious learning.”
Ever been stuck at a dinner table next to someone like this? I seem to remember an interminable conversation about “the cloud” in the early days of cloud computing. My dinner companion went on and on, schooling me on the baroque technicalities of remote data storage. Those eighteenth-century conversation manuals would have come in handy. He did not know it, but his faux pas linked him to the naturalists of yore. “The pathology of misdirected attention,” Daston writes, “rendered its victims oblivious to the social cues of age, rank, sex, vocation and education that skilled conversationalists effortlessly registered and to which they adapted their themes and manner accordingly.”
“Ultimately,” she notes, “only God was a fitting object for such rapt attention.”
It was only toward the end of the nineteenth century that attention as a secular force, attention in its everyday, moment-to-moment forms, began gaining currency in scientific circles. Jonathan Crary, attention historian, documents its trajectory. Attention was emerging from the intellectual background to become the target of laboratory-dwelling researchers, who were trying to answer some of the most basic, concrete questions. Among the unknowns: How many things could we pay attention to simultaneously? Indeed: Could we pay attention to more than one thing at the same time? How did a person attend to some things and not others? Was attention voluntary or automatic? In his Leipzig lab—the first in the world to study human psychology experimentally—Wilhelm Wundt was busy measuring his subjects’ reaction times to various stimuli he presented to them, trying to figure out what happens when the mind is confronted with two signals arriving at the same instant. Wundt was on a mission to make a science of self-reflection.
William James reflected upon Wundt’s studies, those precise, fine-grained, nuts-and-bolts pursuits, but he also “complained about the tedium of German psychology, its immersion in monotonous details, and its failure to uncover even a single elementary law,” according to one James biographer, Gerald Myers. In his own writing on attention, James elevated the subject to loftier heights, introducing a philosophical grandeur and a poetic sensibility to the inquiry. In considering what grabs our attention and what doesn’t, James writes, “A faint tap per se is not an interesting sound; it may well escape being discriminated from the general rumour of the world. But when it is a signal, as that of a lover on the window-pane, it will hardly go unperceived.”
The general rumour of the world… It is phrases like these that made James irresistible. Through the thicket of reaction-time charts and graphs to be found in his pages, and over the toxic drumbeat of my own iPhone lying facedown across the room, luring me with its potential unseen messages, I read on. James separated involuntary attention—as when responding to an outside force too compelling to ignore—from the willed kind, as when we concentrate, with effort and determination, on intellectual work. “No one can possibly attend continuously to an object that does not change,” he wrote. “The conditio sine qua non of sustained attention to a given topic of thought is that we should roll it over and over incessantly and consider different aspects and relations of it in turn. Only in pathological states will a fixed and ever monotonously recurring idea possess the mind.”
In James’s treatment, attention was the means by which a person could claim agency and create a meaningful life. Attention, for James, was precious, essential, and often difficult. “My experience is what I agree to attend to,” he famously wrote, but he never pretended that “agreeing to attend” was a simple and effortless act. Voluntary, purposeful attention in a world of constant distraction was difficult, even then. It is perhaps the most difficult of all the tasks that confront us. “Whether the attention come by grace of genius or by dint of will, the longer one does attend to a topic the more mastery of it one has,” James wrote. “And the faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.”
In his personal life, James took a fire-hose approach to attention. Whatever came at him, he embraced. He believed this was how it should be done. “There can be no real doubt that William James, in his heart of hearts, embraced and welcomed chaos, cataclysm, change, Zerrissenheit,* impulse, and chance,” writes Robert Richardson, a James biographer. “He required himself to meet every demand…James needed constant challenges and perpetual demands, if only to prove that the inner well hadn’t run dry.” James was often to be found on a ship heading to Europe, compelled to accept the many invitations that arrived at his door: to lecture, to contribute a chapter, to dine, to travel, to hike. His world perpetually expanded.
James began an enthusiastic friendship with a new, like-minded correspondent, the French philosopher Henri Bergson, seventeen years his junior. Bergson and James shared a taste for the territory outside the strict rationalism of the scientific method: what might be called the mystical, or the occult. James, for his part, visited mediums and experimented with mescaline. He was interested in all of human behavior, all forms of human consciousness. Bergson too was unafraid to venture beyond the strict boundaries of science. He wrestled with the idea of attention and perception in his book Matter and Memory, writing that attention always proceeds on two different axes, the present and the past. Man was more than an automated response to his present surroundings—his memories were constantly in play, enriching his connection to the present. “The more I think about the question, the more I am convinced that life is from one end to the other a phenomenon of attention,” Bergson wrote to James, just after the turn of the century.
The timing of all this was not a coincidence. James didn’t just happen to take up the subject of attention when he did; neither did Bergson, Wundt, Théodule Ribot, Max Nordau, Edward B. Titchener, Henry Maudsley, or the rest of the thinkers engaged with the subject in the last years of the nineteenth century and the first of the twentieth. Jonathan Crary argues in his dense, erudite way that attention was thrust forward as a whole new kind of problem in these particular decades. It was the era of mass-scale industrialization, when workers were expected to stand in factories all day and somehow maintain perfect vigilance throughout. Indeed, their inattention could imperil the whole scheme. “Part of the cultural logic of capitalism demands that we accept as natural switching our attention rapidly from one thing to another,” Crary writes in Suspensions of Perception. The sociologist Gabriel Tarde described the impositions of the “machinofacture” economy as forcing on workers “an exhaustion of attention [that] is a new subtler form of torture, unknown to the crude purgatories of earlier times.”
And yet, if a capitalist economy had forced the shape of attention to change, what, exactly, was attention’s native state? With no interfering environments, no factory assembly lines to monitor, no iPhones in our pockets, what would our natural attentional rhythms look like? In The Shallows, Nicholas Carr argues that human attention had gone through a drastic restructuring long before industrialization: with the invention of the book. “The natural state of the human brain, like that of the brains of most of our relatives in the animal kingdom, is one of distractedness,” Carr writes. “To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object.” The reshaping of human attention by literature was, in its way, as violent as the relentless churn of factory life, strange as it is to say.
“The problem of attention is essentially a modern problem,” Titchener observed, shortly before the turn of the century. As more and more of these experimentalists and philosophers lent their minds to the problem of attention, it became clear, Crary writes, that attention was not the stable force it had formerly been understood to be, but rather volatile, entirely inseparable from the distraction that inevitably followed. “Attention always contained within itself the conditions for its own disintegration, it was haunted by the possibility of its own excess—which we all know so well whenever we try to look at or listen to any one thing for too long.”
What’s more, Crary points out, the newly urgent focus on attention brought an inherently destabilizing realization: if, as was now becoming clear, attention differed so dramatically from one person to the next, if we are each, indeed, constantly selecting a narrow slice from an infinity of options, then the illusion of one shared reality is shattered. And where and when could this be more obvious than here and now, as we move through our shared public spaces while visibly, flagrantly, consumed by the private realities playing out on our screens?
Sometimes, in the history of science, a subject that is passionately taken up for a decade or two can gradually be put down again, evaporating from the forefront of inquiry when the vogue for thinking in the old way succumbs to the new. This is more or less what happened to attention. “The physiology of attention is still a dark continent,” George Herbert Mead’s seminal Mind, Self, and Society would note in 1934. Attention had faded from the central concerns of the day as the nineteenth century bled into the twentieth. In psychology, the behaviorists rose to power, stressing their radically simplified model of stimulus and response to explain human behavior. Attention—implying agency, implying individual variation—didn’t even come into it. “The term ‘attention’ was effectively banished from the vocabulary of scientific psychology: the dominant theorists of the day found it useless,” wrote Daniel Kahneman, future Nobel laureate, in his 1973 tome Attention and Effort. But the 1960s, and the rise of cognitive psychology, had brought attention back from the darkness. It was, once again, the question that so many were trying to answer. Indeed, the most basic, fundamental mysteries about the nature of attention remained. Perhaps the most fundamental of all: Can attention ever truly be divided? If we are having one conversation, are we also following the substance of a second? If we are driving and talking, are we lending our minds to both at once, or cycling back and forth between them?
Though William James had so famously promised “everyone knows what attention is,” it turned out, well over a century later, that this really wasn’t the case. There was even, I discovered, a term in use by academics to label the overstepping: “Jamesian confidence.” I came across this phrase in a paper titled “There Is No Such Thing as Attention,” a paper with a certain kind of renown in its field. According to its author, Britt Anderson of the University of Waterloo, we have lost our bearings in the quest to make headway into the central mysteries of attention. Anderson names a few central culprits, first and foremost researchers’ tendency to “binarize,” to claim that attention is either one thing or the other. “We try and shoehorn everything into being either this or that,” he writes. For example, the question of “pre-attentive” versus “attentive” processing: How much is the conscious act of paying attention actually preceded and made possible by a kind of unconscious, involuntary scan? A preparing for the act of paying attention? For a long time, Anderson says, these two acts were accepted as distinct, pre-attention preceding attention in crisp serial fashion. Many studies were conducted to support this point. “A million trials later we conclude that in fact there are not two distinct kinds of searches,” Anderson notes.
Beyond the false binaries, attention research is muddied by “plurality”: there are too many different meanings of the word, too many different references. There is visual attention, and auditory. There is local versus global spatial attention. There is attentiveness in the sense of being prepared, there is vigilance that can shade over into a pathological excess of attentiveness, and there is the opposite pathology, a deficit of attention. There is the mystery of the attention span. And so on and on. And on. “The fact that there are so many variable definitions empowers researchers to create newer, eclectic ones,” Anderson argues. Anderson’s overall point, though, is even broader. Rather than thinking of attention as a specific, concrete force, capable of causing certain effects, we should think of the inverse: “attention” as a word for many different brain states that are themselves caused by different environments, different conditions. “We need the right terms if we are to say something meaningful.”
All of this I was in the slow process of gathering when I took my seat on a hard-backed chair in the elegant, light-filled faculty lounge of New College, Oxford. I still didn’t know exactly what I was after, in this search for attention. Was I trying for intellectual understanding? For self-improvement? For some way to feel okay about the invasion of technology into our attentional fields? For someone to tell me that science said our phones have changed nothing? Or that our phones have changed everything? For some way to accept my own shortcomings?
But there in Oxford, I had come to drink tea with Mark Stokes, the neuroscientist who heads Oxford’s prestigious “Attention Group.” Youngish, dark-haired, and Australian, Stokes, in white canvas sneakers, deftly consumed biscotti beneath an oil portrait of an unidentified Oxford luminary, a man who had once likely sat in this same faculty lounge himself. It was inevitable, I knew, that I would come face-to-face with neuroscience, the discipline currently in charge of setting the terms by which we humans seek to understand ourselves. My first question to Stokes was the one I most wanted to know the answer to: Has science convincingly defined “attention”?
“Certainly not,” Stokes replied. “I think one of the difficulties is that ‘attention’ is a common term. It’s a normal, everyday term that carries a lot of baggage. It’s not a scientific term; it’s something borrowed from normal language, which means that everyone who comes into the field comes with all that sort of folk psychology.”
Though attention might be an idea that belongs to all of us, Stokes said, what so often occurs in laboratories devoted to studying attention is a scenario that many outside of science would not recognize as having much to do with the thing itself. For instance, the research paradigm now accepted as the gold standard requires subjects to sit in a contraption that doesn’t allow them to move their eyes, as stimuli are presented in their peripheral vision. With this fixed eye position, researchers believe they can escape the “confounding” influence of eye movements on attention, isolating the relevant brain processes themselves.
“People agreed: the real science is looking at what happens when you don’t move your eyes,” Stokes said. “But of course then you end up in a weird situation. Because of course we do move our eyes, and that’s important. We are not just a brain in a jar.” Can this level of artifice—an essential part of precise, scientific reductionism—ever be relevant for those of us seeking to understand that multifaceted, waxing and waning, lived experience we know as attention?
“The jury is out,” Stokes told me. “The field is so dominated by visual attention. It’s important, but the lay idea of attention is really much more conceptual than that. You’ve got those classic quotes from William James, how attention is the ‘bringing into mind in clear and vivid form,’ and all this stuff, and that’s really what people think of attention—it’s focus, it’s concentration—and that’s really not been studied very much. You open a textbook on attention and it’s all, like, looking at orientation patches in visual fields of fixated monkeys. Not what the average person on the street thinks about when they think about attention.”
“Or cares about,” I added, pointedly. “Especially in our era when we’re living through the disintegration of attention.”
“Why?” said Stokes.
I knew we were now wandering away from the thrust of the conversation: I had come here to ask Stokes about the state of attention science in general, and his lab’s focus in particular. Not the more anecdotal, subjective, controversial, and all-around difficult-to-pin-down lived phenomenon of human attention in the age of Silicon Valley.
“Well, with the invasion of iPhones,” I said.
“But again, that depends on what you mean by ‘attention.’ With the Internet, it’s splintered concentration. You’re attending more. It’s not very sociable to sit at a dinner table attending to your phone, but informationally, it’s actually richer. There’s lots of attention going on, but it’s just on the phone.”
“But it’s not sustained attention,” I replied.
“Well, no, that’s another aspect of attention that’s different to the experimental notion of attention,” he said. “Concentration is the strongest lay idea of attention that’s not very well studied in our field. I think we do concentrate a lot now, it’s just not on the right stuff.”
“Yes, and we all know it feels wrong,” I said, knowing how such a statement would sound to a hard scientist, trained to gather data, to think empirically, not to make such childish statements about how something “feels.”
“Because we’ve got a higher goal in mind. We know we should be working on that book or whatever,” Stokes said. “But in terms of computational neuroscience, it’s all the same. There’s nothing better about focusing on the book than there is on focusing on our phone.”
Stokes told me, as if I couldn’t already tell, that when it comes to technology, he’s an optimist. Lucky him, I thought. What am I, exactly?
We said goodbye and I wandered back out onto the cobblestone, fairy-tale Oxford streets. I felt that my own brain had been powerfully affected by the technology I carried with me in my pocket at all times; I knew my relationship with literature had shifted, that my concentration had frayed, that my own sense of optimism on behalf of us all was never lower than when looking over to see that every person around me was squinting into their tiny screens. It was as if, with our eyes cast phone-ward, we were all telling one another, all the time, “You—whoever you are—are no longer worth my real attention.”
And yet, I couldn’t walk for ten minutes through the magisterial Oxford streets without taking out my phone to consult my “Moves” app, which tracked my steps, to monitor incoming texts from back home in the States, and, of course, to engage in the life-draining blur of the infinite Instagram scroll. This was all to say nothing of our political situation: with the election of Donald Trump to the White House the year before, it had become in some real sense our civic duty to be all the more glued to our phones, assiduously tracing every new headline, every new tweet, so that we might—at the least—bear witness to what was happening to our institutions and values. Tuning out of the endless news cycle, turning away from our screeching screens, now bore the stigma of political complacency.
At home, an entire shelf of my bookcase creaked under the weight of doomsday literature, title after title concerning our age of distraction. I owned these books, but I read them slowly: I did not seem to have the attention to absorb thousands of pages on the death of attention. I told myself that any writer must know they risked boring their readers by railing against our technology-soaked lives. Life in the age of the iPhone was here to stay, was it not? As well: Who was to say that this technological revolution was fundamentally different, larger, or more significant than the great upheavals that organize history, the ones that bring change and wreak havoc? We’ve all heard the argument before: each such disruption, from the printing press forward, has provoked hand-wringing over the future of mankind. Has the Internet spawned a truly new breed of catastrophe?
And yet, wandering Oxford’s ancient byways, the site of centuries of scholastic concentration, I came back to what I always came back to: the breathtaking casualness with which I had succumbed to the new habits and rhythms my phone asked of me. The nonchalance with which I had given up deeper, sustained engagement in favor of perpetually splintered focus. And I knew it wasn’t just me, of course. I wasn’t the only one acting as if my ill-defined, ever-elusive, utterly precious attention didn’t really matter at all.
* Torn-to-pieces-hood.