INTRODUCTION: ATTENTION AS A CULTURAL PROBLEM
The idea of writing this book gained strength one day when I swiped my bank card to pay for groceries. I watched the screen intently, waiting for it to prompt me for the next step. During the following seconds it became clear that some genius had realized that a person in this situation is a captive audience. During those intervals between swiping my card, confirming the amount, and entering my PIN, I was shown advertisements. The intervals themselves, which I had previously assumed were a mere artifact of the communication technology, now seemed to be something more deliberately calibrated. These haltings now served somebody’s interest.
Such intrusions are everywhere. Taking a flight recently to Chicago, I pulled down the tray from the seat back in front of me and discovered that the entire tray top was devoted to an advertisement for Droid, the multimedia smartphone. At O’Hare International Airport, the moving handrail on the escalator was covered with an endlessly recurring message from the Lincoln Financial Group: You’re In Charge.® When I got to my hotel, I was handed a key card that was printed on one side with an advertisement for Benihana, the restaurant. Somehow, the fact that such a key card presents about five square inches for inevitable eyeballing had gone unnoticed, or rather unmonetized, until recently. Capitalism has gotten hip to the fact that for all our talk of an information economy, what we really have is an attentional economy, if the term “economy” applies to what is scarce and therefore valuable. As these last examples illustrate, the pertinent development here is a social technology, not something electronic. Turning unavoidable public surfaces into sites of marketing isn’t inherently “digital.”
We have developed methods for tuning out commercial messages, for example by inserting earbuds or burying our faces in our devices, and advertisers have responded by escalating. Bus riders in Seoul, South Korea, find themselves at a new frontier: they have advertising squirted into their noses. A smell resembling that of Dunkin’ Donuts coffee is released into the ventilation system as a Dunkin’ Donuts advertisement plays over the bus’s sound system, shortly before the bus stops outside a Dunkin’ Donuts store. An announcer points out the fact, in case it has somehow been missed. This kind of advertising is especially aggressive and indiscriminate, yet it is also exquisitely well targeted: morning commuters are primed to want coffee at the very moment they are exposed to the advertisement, and there it is, right next to the bus stop! The advertising agency responsible was rewarded by its peers with a Bronze Lion award for “best use of ambient media.”1
There remain many areas for further progress. The homework, report cards, permission slips, and other minor communications that a teacher sends home with students are in many school districts still blank on the back. Here is a gross offense against the efficient use of space. One forward-thinking school district in Peabody, Massachusetts, now sells advertising space on the backs of these slips of paper.
But intrusive advertising is just the tip of a larger cultural iceberg; some of the positive attractions of our attentional environment are no less troubling than the unwanted aspects. It’s hard to open a newspaper or magazine these days without reading a complaint about our fractured mental lives, diminished attention spans, and a widespread sense of distraction. Often the occasion for such a story is some new neuroscience finding about how our brains are being rewired by our habits of information grazing and electronic stimulation. Though it is in the first place a faculty of individual minds, it is clear that attention has also become an acute collective problem of modern life—a cultural problem.
Our susceptibility to being buffeted by various claims on our attention is surely tied to the “intensification of nervous stimulation” that the German sociologist Georg Simmel identified with the metropolitan environment over a hundred years ago. Think of the corporate manager who gets two hundred emails per day and spends his time responding pell-mell to an incoherent press of demands. The way we experience this, often, is as a crisis of self-ownership: our attention isn’t simply ours to direct where we will, and we complain about it bitterly. Yet this same person may find himself checking his email frequently once he gets home or while on vacation. It becomes effortful for him to be fully present while giving his children a bath or taking a meal with his spouse. Our changing technological environment generates a need for ever more stimulation. The content of the stimulation almost becomes irrelevant. Our distractibility seems to indicate that we are agnostic on the question of what is worth paying attention to—that is, what to value.2
To answer this question freely requires shelter; a space for seriousness. The moralist will say that one has to carve out this space for oneself resolutely, against the noise, and that to fail to rise to this task of evaluation is to give oneself over to nihilism, in which all distinctions are leveled and all meaning gives way to mere “information.”
A sociologist might go easier on us and locate our difficulty not in our individual moral failures but in a collective situation, pointing out that there aren’t many limits on our mental lives of the sort that prevailed before we had immediate access to the world beyond our own narrow horizon of experience. That horizon has been exploded; all manner of once-weird stuff is now a click away. There are so many enticements, but just as important, there is little in the way of authoritative guidance of the sort that was once supplied by tradition, religion, or the kind of communities that make deep demands on us.
The moralist and the sociologist are both right. The question of what to attend to is a question of what to value, and this question is no longer answered for us by settled forms of social life. We have liberated ourselves from all that. The downside is that as autonomous individuals, we often find ourselves isolated in a fog of choices. Our mental lives become shapeless, and more susceptible to whatever presents itself out of the ether. But of course these presentations are highly orchestrated; commercial forces step into the void of cultural authority and assume a growing role in shaping our evaluative outlook on the world. Because of the scale on which these forces operate, our mental lives converge in a great massification—ironically, under the banner of individual choice.
Our mental fragmentation can’t simply be attributed to advertising, the Internet, or any other identifiable villain, for it has become something more comprehensive than that, something like a style of existence. It is captured pretty well in the following satirical news item from The Onion.
GAITHERSBURG, MD—While cracking open his second beer as he chatted with friends over a relaxed outdoor meal, local man Marshall Platt, 34, was reportedly seconds away from letting go and enjoying himself when he was suddenly crushed by the full weight of work emails that still needed to be dealt with, … an upcoming wedding he had yet to buy airfare for because of an unresolved issue with his Southwest Rapid Rewards account, and phone calls that needed to be returned.
“It’s great to see you guys,” said the man who had been teetering on the brink of actually having fun and was now mentally preparing for a presentation that he had to give on Friday and compiling a list of bills that needed to be paid before the 7th. “This is awesome.”
“Anyone want another beer?” continued Platt as he reminded himself to pick up his Zetonna prescription. “Think I’m gonna grab one.”
Platt, who reportedly sunk into a distracted haze after coming to the razor’s edge of experiencing genuine joy, fully intended to go through the motions of talking with friends and appearing to have a good time, all while he mentally shopped for a birthday present for his mother, wracked his brain to remember if he had turned in the itemized reimbursement form from his New York trip to HR on time, and made a silent note to call his bank about a mysterious recurring $19 monthly fee that he had recently discovered on his credit card statement.3
I think most of us can recognize ourselves in Mr. Platt. Is “modern life” really so burdensome? Yes it is. But Mr. Platt seems to have a deeper difficulty as well: joy can get no grip on him. The sketch seems to be about the little tasks that claim his attention, but at the center of it is an ethical void. He is unable to actively affirm as important the pleasure of being with friends. He therefore has no basis on which to resist the colonization of life by hassle.
Clearly, no single discipline or body of thought is adequate to parse the crisis of attention that characterizes our cultural moment. There is a rich literature on attention in cognitive psychology, extending from William James’s work of a century ago to the latest findings in childhood development. There are scattered treatments in moral philosophy, and these are indispensable. Less widely noticed is the fact that attention is the organizing concern of the tradition of thought called phenomenology, which offers a bridge between the mutually uncomprehending fields of cognitive psychology and moral philosophy. What is required, then, is a highly synthetic effort—we can call it philosophical anthropology.
Through this inquiry I hope to arrive at something like an ethics of attention for our time, grounded in a realistic account of the mind and a critical gaze at modern culture. I should note here that I am using the term “ethics” in its original sense—not primarily as an account of what we are obliged or forbidden to do, but as a more capacious reflection on the sort of ethos we want to inhabit. Nor do I wish to join the culture wars surrounding “technology”—as being either an apocalyptic force or a saving one that heralds the arrival of a new global intelligence, etc. I want rather to tunnel beneath that intellectual cul-de-sac and trace the subterranean strata—the historically sedimented geological structures—of our age of distraction, the better to map our way out of it.
An ethics of attention would have to begin by taking seriously, and trying to make sense of, the qualitative character of first-person experience in our contemporary cognitive environment: by turns anxious, put-upon, distracted, exhausted, enthralled, ecstatic, self-forgetting. The thing is, we are very sophisticated. As the inheritors of layers of theorizing about the human person, we find it no trivial task to recover a more direct access to our own experience. In the course of trying to do that, I have found it necessary to scrutinize certain background assumptions about the self that shape our experience. It has been said (by Iris Murdoch) that man is the animal that makes pictures of himself, and then comes to resemble the pictures.
Such pictures come to us from various departments of the human sciences. In ways that bear directly on our theme, these sciences continue to be informed by the agenda of the Enlightenment. (I will have more to say about that shortly.) This agenda shaped a very partial view of the human person, one that we have been operating with for centuries but has become in various ways poorly suited to our circumstances. My hope is that a fuller picture will be both truer and more serviceable for us in finding a way through our current predicament of attention.
But I have gotten ahead of the argument. Allow me to simply describe some further dimensions of that predicament.
THE ATTENTIONAL COMMONS
We have all had the experience of sitting in an airport with an hour to kill and being unable to escape the chattering of CNN. The audio may be turned off, but if the TV is within view, I, for one, find it impossible not to look at it. The introduction of novelty into one’s field of view commands what the cognitive psychologists call an orienting response (an important evolutionary adaptation in a world of predators): an animal turns its face and eyes toward the new thing. A new thing typically appears every second on television. The images on the screen jump out of the flow of experience and make a demand on us. In their presence it is difficult to rehearse a remembered conversation, for example. Whatever trains of thought might otherwise be pursued by those in the room give way to a highly coordinated experience: not the near-simultaneous turning of a troupe of macaques to face the python that has appeared, but the involuntary glances of weary travelers toward the “content” on offer.
Alternatively, people in such places stare at their phones or open a novel, sometimes precisely in order to tune out the piped-in chatter. A multiverse of private experiences is accessible after all. In this battle of attentional technologies, what is lost is the kind of public space that is required for a certain kind of sociability. Jonathan Franzen wrote, “Walking up Third Avenue on a Saturday night, I feel bereft. All around me, attractive young people are hunched over their StarTacs and Nokias with preoccupied expressions, as if probing a sore tooth … All I really want from a sidewalk is that people see me and let themselves be seen…”
A public space where people are not self-enclosed, in the heightened way that happens when our minds are elsewhere than our bodies, may feel rich with possibility for spontaneous encounters. Even if we do not converse with others, our mutual reticence is experienced as reticence if our attention is not otherwise bound up, but is rather free to alight upon one another and linger or not, because we ourselves are free to pay out our attention in deliberate measures. To be the object of someone’s reticence is quite different from not being seen by them; we may have a vivid experience of having encountered another person, even if in silence. Such encounters are always ambiguous, and their need for interpretation gives rise to a train of imaginings, often erotic. This is what makes cities exciting.
Psychologists have suggested that attention may be categorized by whether it is goal-driven or stimulus-driven, corresponding to whether it is in the service of one’s own will or not. A teacher taking a head count on a chaotic school bus is engaged in the first, “executive” kind of attention. By contrast, if there is a sudden bang outside my window, my attention is stimulus-driven. I may or may not go to the window to investigate, but the claim on my attention is involuntary.
The orienting response requires of us a concerted effort of executive attention if we are to resist it, and our capacity for such resistance is finite. Of course, in my airport example, one can simply shift in one’s seat and avert one’s gaze from the screens. But the fields of view that haven’t been claimed for commerce seem to be getting fewer and narrower. The ever more complete penetration of public spaces by attention-getting technologies exploits the orienting response in a way that preempts sociability, directing us away from one another and toward a manufactured reality, the content of which is determined from afar by private parties that have a material interest in doing so. There is no conspiracy here; it’s just the way things go.
When we go through airport security, the public authority makes a claim on our attention for the common good. This moment is emblematic of the purpose for which political authority in a liberal regime is originally instituted—public safety—and rightly has a certain gravity to it. But in the last few years, I have found I have to be careful at the far end of the process, because the bottoms of the gray trays that you place your items in for X-ray screening are now papered with advertisements, and their visual clutter makes it very easy to miss a pinky-sized flash memory stick against a picture of fanned-out L’Oréal lipstick colors.
I am already in a state of low-level panic about departure times, possible gate changes, and any number of other contingencies that have to be actively monitored while traveling, to say nothing of the fact that my memory is tapped out with detailed concerns about the talk I am going to have to give in front of strangers in a few hours. This fresh demand for vigilance, lest I lose my PowerPoint slide show, feels like a straightforward conflict between me and L’Oréal.
Somehow L’Oréal has the Transportation Security Administration on its side. Who made the decision to pimp out the security trays with these advertisements? The answer, of course, is that Nobody decided on behalf of the public. Someone made a suggestion, and Nobody responded in the only way that seemed reasonable: here is an “inefficient” use of space that could instead be used to “inform” the public of “opportunities.” Justifications of this flavor are so much a part of the taken-for-granted field of public discourse that they may override our immediate experience and render it unintelligible to us. Our annoyance dissipates into vague impotence because we have no public language in which to articulate it, and we search instead for a diagnosis of ourselves: Why am I so angry? It may be time to adjust the meds.
In the main currents of psychological research, attention is treated as a resource—a person has only so much of it. Yet it does not occur to us to make a claim for our attentional resources on our own behalf. Nor do we yet have a political economy corresponding to this resource, one that would take into account the peculiar violations of the modern cognitive environment. Toward this end, I would like to offer the concept of an attentional commons.
There are some resources that we hold in common, such as the air we breathe and the water we drink. We take them for granted, but their widespread availability makes everything else we do possible. I think the absence of noise is a resource of just this sort. More precisely, the valuable thing that we take for granted is the condition of not being addressed. Just as clean air makes respiration possible, silence, in this broader sense, is what makes it possible to think. We give it up willingly when we are in the company of other people with whom we have some relationship, and when we open ourselves to serendipitous encounters with strangers. To be addressed by mechanized means is an entirely different matter.
The benefits of silence are off the books. They are not measured directly by any economic measure such as gross domestic product, yet the availability of silence surely contributes to creativity and innovation. They do not show up explicitly in social statistics such as level of educational achievement, yet one consumes a great deal of silence in the course of becoming educated.
If clean air and water were no longer the rule for us, the economic toll would be truly massive. This is easy to grasp, and that is why we have regulations in place to protect these common resources. We recognize their importance and their fragility. We also recognize that absent robust regulations, air and water will be used by some in ways that make them unusable for others—not because they are malicious or careless, but because they can make money using them this way. When this occurs, it is best understood as a transfer of wealth from “the commons” to private parties.
A notable feature of the gangsterish regimes that rule in many formerly Communist countries is the apparent absence, or impotence, of any notion of a common good. Where communism was established by coercion, its later collapse, and the assertion of private interests that followed, made it clear that there was no well-established intellectual foundation for defending such shared resources as clean air and water. Many citizens of these countries now live in the environmental degradation that results when privatization has no countervailing force of public-spiritedness. We in the liberal societies of the West find ourselves headed toward a similar condition with regard to the resource of attention, because we do not yet understand it to be a resource.4
Or do we? Silence is now offered as a luxury good. In the business-class lounge at Charles de Gaulle airport, what you hear is the occasional tinkling of a spoon against china. There are no advertisements on the walls, and no TVs. This silence, more than any other feature of the space, is what makes it feel genuinely luxurious. When you step inside and the automatic airtight doors whoosh shut behind you, the difference is nearly tactile, like slipping out of haircloth into satin. Your brow unfurrows itself, your neck muscles relax; after twenty minutes you no longer feel exhausted. The hassle lifts.
Outside the lounge is the usual airport cacophony. Because we have allowed our attention to be monetized, if you want yours back you’re going to have to pay for it.
As the commons gets appropriated, one solution, for those who have the means, is to leave the commons for private clubs such as the business-class lounge. Consider that it is those in the business lounge who make the decisions that determine the character of the peon lounge, and we may start to see these things in a political light. To engage in playful, inventive thinking, and possibly create wealth for oneself during those idle hours spent at an airport, requires silence. But other people’s minds, over in the peon lounge (or at the bus stop), can be treated as a resource—a standing reserve of purchasing power to be steered according to innovative marketing ideas hatched by the “creatives” in the business lounge. When some people treat the minds of other people as a resource, this is not “creating wealth”; it is a transfer.5 The much-discussed decline of the middle class in recent decades, and the ever greater concentration of wealth in a shrinking elite, may have something to do with the ever more aggressive appropriations of the attentional commons that we have allowed to take place.
This becomes especially pertinent in the era of big data, when we find ourselves the objects of attention-getting techniques that are not only pervasive, but increasingly well targeted. There is currently much talk of a right to privacy in our digital lives. Apart from the usual concerns about online security and identity theft, I have to confess that I am not terribly worried about keeping particular facts about myself hidden from the data-mongers—until they use that data to make a claim on my attention. I think we need to sharpen the conceptually murky right to privacy by supplementing it with a right not to be addressed. This would apply not, of course, to those who address me face-to-face as individuals, but to those who never show their face, and treat my mind as a resource to be harvested by mechanized means.
* * *
Attention is the thing that is most one’s own: in the normal course of things, we choose what to pay attention to, and in a very real sense this determines what is real for us, what is actually present to our consciousness. Appropriations of our attention are then an especially intimate matter.
But it is also true that our attention is directed to a world that is shared; one’s attention is not simply one’s own, for the simple reason that its objects are often present to others as well. And indeed there is a moral imperative to pay attention to the shared world, and not get locked up in one’s own head. Iris Murdoch writes that to be good, a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”6
Consider the person talking on his cell phone while cruising through a crowded suburban commercial district, with a motorcyclist in the lane next to him. Driving while talking on a cell phone impairs performance as much as driving while legally drunk.7 It doesn’t matter whether the phone is hands-free or not; the issue is that having a conversation uses attentional resources, of which we have a finite amount. It especially impairs our ability to notice and register novel things in the environment; psychologists call this inattentional blindness. Pedestrians who walk while talking on a cell phone weave more, change direction more, cross the street in a riskier way, are less likely to acknowledge others (that is, be sociable), and, in the findings of a recent experiment, are less likely to notice the clown on a unicycle who just rode past.8 Put a person with this level of impairment behind the wheel of a two-ton, two-hundred-horsepower car and his blindness becomes an apt topic in discussions of what we owe one another. In the attentional commons, circumspection—literally looking around—would be one element of justice.
One of the more interesting findings to come out of the research on distracted driving is that, while having a cell phone conversation impairs driving ability, having a conversation with someone present in the car does not. A person who is present can cooperate by modulating the conversation in response to the demands of the driving situation.9 For example, if the weather is bad he tends to be quiet. A passenger acts as another pair of eyes on the situation he inhabits with the driver, and tends to improve a driver’s ability to notice and quickly respond to out-of-the-ordinary challenges.
The idea of a commons is suitable in discussing attention because, first, the penetration of our consciousness by interested parties proceeds very often by the appropriation of attention in public spaces, and, second, we rightly owe to one another a certain level of attentiveness and ethical care. This language of public space and mutual obligation rightly puts us in a political economy frame of mind, if by “political economy” we can denote a concern for justice in the public exchange of some private resource.
THE ASCETICS OF ATTENTION
The French philosopher Simone Weil and the psychologist William James both suggested that the struggle to pay attention trains the faculty of attention; it is a habit built up through practice. Grappling with a problem for which one has little aptitude or inclination (a geometry problem, say) exercises one’s power to attend. For Weil, this ascetic aspect of attention—the fact that it is a “negative effort” against mental sloth—is especially significant. “Something in our soul has a far more violent repugnance for true attention than the flesh has for bodily fatigue. This something is much more closely connected with evil than is the flesh. That is why every time that we really concentrate our attention, we destroy the evil in ourselves.” Students must therefore work “without any reference to their natural abilities and tastes; applying themselves equally to all their tasks, with the idea that each one will help to form in them the habit of attention which is the substance of prayer.”
It should be duly noted that Weil was a mystic who (some say) deliberately starved herself to death, and indeed her dismissal of natural inclinations in the young suggests she was more infatuated with self-mortification than she was seriously concerned with how students might best learn. Yet Weil’s existential melodrama shouldn’t prevent us from appreciating her point that the ascetic disposition has an important role in education. To attend to anything in a sustained way requires actively excluding all the other things that grab at our attention. It requires, if not ruthlessness toward oneself, a capacity for self-regulation.
And reciprocally, the ability to control oneself in the face of some temptation is greatly enhanced by, indeed seems simply to be, the ability to direct one’s attention toward something else. In a classic psychology experiment, Walter Mischel and E. B. Ebbesen gave children the option of having one marshmallow immediately or, if they were able to wait fifteen minutes, two marshmallows.10 Left alone with the marshmallow at hand, some broke down and gobbled it immediately, others after a brief struggle. But about a third of the children succeeded in deferring gratification and getting the bigger payoff. Those who succeeded were the ones who distracted themselves from the marshmallow, by playing games under the table, singing songs, or imagining the marshmallow as a cloud, for example. In a follow-up study of the same children a dozen years later, their initial performance on the self-regulation task was more predictive of life success than any other measure, including IQ and socioeconomic status. The researchers’ interpretation of their results is that it isn’t willpower (as conventionally understood) that distinguishes the successful children; it is the ability to strategically allocate their attention so that their actions aren’t determined by the wrong thoughts. Self-regulation, like attention, is a resource of which we have a finite amount, and the two resources are intimately related: if someone is tasked with controlling her impulses for some extended period of time, her performance shortly thereafter on a task requiring attention is degraded.
Without the ability to direct our attention where we will, we become more receptive to those who would direct our attention where they will—to the omnipresent purveyors of marshmallows. To the extent that the power of concentration is widely attenuated, so too is the power of self-regulation. We become more easily suggestible and buy more stuff. I suppose this is good for economic growth. But if consumer capitalism can go on only by continuing to accelerate the “intensification of nervous stimulation,” there would seem to be a fundamental antagonism between this form of economic life and the individual who inhabits it. That is, we may have a problem.
INDIVIDUALITY
The media have become masters at packaging stimuli in ways that our brains find irresistible, just as food engineers have become expert in creating “hyperpalatable” foods by manipulating levels of sugar, fat, and salt.11 Distractibility might be regarded as the mental equivalent of obesity.
The palatability of certain kinds of mental stimulation seems to be hard-wired, just as our taste for sugar, fat, and salt is. When we inhabit a highly engineered environment, the natural world begins to seem bland and tasteless, like broccoli compared with Cheetos. Stimulation begets a need for more stimulation; without it one feels antsy, unsettled. Hungry, almost.
One consequence of this is that we are becoming more alike. I open a book of Aristotle and try to read a page of his choppy, gnomic Greek. After a few lines I start to shift my weight in the chair and drum my fingers on the table. It is Tuesday night, after all. I turn on Sons of Anarchy, and share the experience with 4.6 million of my closest friends. The next day, I have some basis for chitchat with others. I am not a freak. If I had gotten absorbed in the Nicomachean Ethics, my head would still be turning in a spiral of untimely meditations that could only sound strange to my acquaintances.
There is, then, a large cultural consequence to our ability to concentrate on things that aren’t immediately engaging, or our lack of such ability: the persistence of intellectual diversity, or not. To insist on the importance of trained powers of concentration is to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.
What sort of ecology can preserve a robust intellectual biodiversity? We often assume that diversity is a natural upshot of free choice. Yet the market ideal of choice and the attendant preoccupation with freedom tend toward a monoculture of human types: the late modern consumer self. At least the market seems to have this effect when we are constantly being addressed with hyperpalatable stimuli. What sort of outlier would you have to be, what sort of freak of self-control, to resist those well-engineered cultural marshmallows?
According to the prevailing notion, to be free means to be free to satisfy one’s preferences. Preferences themselves are beyond rational scrutiny; they express the authentic core of a self whose freedom is realized when there are no encumbrances to its preference-satisfying behavior. Reason is in the service of this freedom, in a purely instrumental way; it is a person’s capacity to calculate the best means to satisfy his ends. About the ends themselves we are to maintain a principled silence, out of respect for the autonomy of the individual. To do otherwise would be to risk lapsing into paternalism. Thus does liberal agnosticism about the human good line up with the market ideal of “choice.” We invoke the latter as a content-free meta-good that bathes every actual choice made in the softly egalitarian, flattering light of autonomy.
This mutually reinforcing set of posits about freedom and rationality provides the basic framework for the discipline of economics, and for “liberal theory” in departments of political science. It is all wonderfully consistent, even beautiful.
But in surveying contemporary life, it is hard not to notice that this catechism doesn’t describe our situation very well. Especially the bit about our preferences expressing a welling-up of the authentic self. Those preferences have become the object of social engineering, conducted not by government bureaucrats but by mind-bogglingly wealthy corporations armed with big data. To continue to insist that preferences express the sovereign self and are for that reason sacred—unavailable for rational scrutiny—is to put one’s head in the sand. The resolutely individualistic understanding of freedom and rationality we have inherited from the liberal tradition disarms the critical faculties we need most in order to grapple with the large-scale societal pressures we now face.
The language of preference satisfaction and the attendant preoccupation with freedom seem ill-suited to our current circumstances, if what we want is to preserve human possibilities from going extinct. If you were to regularly air-drop Cheetos over the entire territory of a game preserve, you would probably find that all the herbivores immediately preferred them to whatever pathetic grubs and roots they had been eating before. A few years later, the lions would have decided that hunting is not only barbaric but, worse, inconvenient. The cheetahs would come around eventually—all that running!—and the savannah would be ruled by three-toed sloths. With orange fur.
I recently visited Las Vegas, a place designed for the single purpose of separating you from your money—by tapping into your preferences. The female form is used quite freely there in advertisements, bombarding you from the moment you step off your airplane. These images work just as surely as tying a rope to a person’s neck and giving it a sharp yank. Once the initial excitement wears off, you find yourself in a place that is somehow not a place. No merely local flora can compete for air and light. Nothing subtle—no feeling that isn’t industrial-strength in its urgency and standardized in its appeal—can arise in such a ruthlessly monetized attentional environment.
After a day, I had to get out of there, so I rented a car. Driving through the desert, I stopped at a gas station/slot machine arcade/liquor store/fireworks emporium on an Indian reservation. A few hundred years ago, the fitness of Native Americans for the world they inhabited excited admiration in some European observers: here were natural aristocrats, disdainful of labor, dedicated to war. Unlike European peasants stooped to the grind of agriculture, anxiously accumulating grain against future want, the Indian appeared free because confident of his ability to bear hardship; leisured because tough. Whatever projections this might have involved, whatever need of the European mind was being served by the image of the noble savage, there were real cultural differences here that provided an external point of reference for self-criticism.
Then along came liquor, fast food, satellite television, methamphetamine, and all the rest. Clearly these things tapped into appetites that, before the arrival of the pertinent technologies, had been merely latent in the lifeworld of Native Americans. And clearly these candy-and-narcotics technologies played a role in their conquest and continued pacification. My impression, admittedly superficial, was that the inhabitants of this reservation were in a state of degradation that went beyond economic hardship—and that this little roadside emporium offered a glimpse into the future.
One thing that distinguishes human beings from other animals is that we are evaluative creatures. We can take a critical stance toward our own activities, and aspire to direct ourselves toward objects and projects that we judge to be more worthy than others that may be more immediately gratifying. Animals are guided by appetites that are fixed, and so are we, but we can also form a second-order desire, “a desire for a desire,” when we entertain some picture of the sort of person we would like to be—a person who is better not because she has more self-control, but because she is moved by worthier desires.
Acquiring the tastes of a serious person is what we call education. Does it have a future? The advent of engineered, hyperpalatable mental stimuli compels us to ask the question. The transformation of the Native American lifeworld, like the transformation currently under way in our attentional environment, points up the limitations of the idea of individual self-determination and of exhortations to exert more self-control. We’re in it together. This makes it political.
ACHIEVING A COHERENT SELF
We are wired to attend to our environment, but certain kinds of thinking require that we ignore it. Thus, when trying to recall something from memory, a person will often stare up toward the blank sky, or avert her gaze from the scene before her. Similarly, trying to predict the future and plan for it is an act of imagination that requires getting free of the present. In an influential article in Behavioral and Brain Sciences, Arthur M. Glenberg offers an evolutionary argument for why this kind of thinking feels effortful.
Suppressing the environment is dangerous because features of the environment that normally should be controlling action are ignored. “The effort is a warning signal: Take care; you are not attending to your actions!” Because it is effortful, we use suppression conservatively. Such an account makes sense of certain behaviors. Glenberg observes that “when working on a difficult intellectual problem (which should require suppression of the environment), we reduce the rate at which we are walking to avoid injury.”12
He goes on to make the fertile suggestion that “autobiographical memory arises from suppressing the environment.” Around the age of two or three years, as a child develops language, she learns to use narrative to organize and relate her experiences. By doing so, she starts to develop a coherent concept of self. This requires suppressing environmental input so the child can control what she is thinking about. And reciprocally, the ability to use language supports the ability to suppress the environment and control one’s recollective experience.
While animals certainly have memory and the ability to learn, human beings are thought to be the only creatures who can deliberately recall something not cued by the environment.13 But we do this only in those stretches of time when the environment is not making urgent claims on our attention. It is at these times that we try to find coherence in our experience, or impose it, retroactively. If we are currently facing a culturally and technologically induced trauma to our ability to suppress environmental input, that raises a big question: Is this distinctly human activity of coherence-finding at risk?
I think it is safe to say that our ability to suppress the environment is under greater pressure than it once was. It may be that this pressure is acutely felt only by an adult generation that developed in one attentional landscape and now finds itself inhabiting another, more highly engineered one. Younger people are famously comfortable with it all. The question remains whether we should take comfort in their comfort.
That is to say, is something important to human flourishing at risk or not? How you answer that question would seem to depend on how you understand “rational agency,” to use a term of art from philosophy. Allow me to sketch two positions on this.
According to the first, what we really mean when we say that human beings tell stories and seek coherence is that we do things for reasons. We offer these reasons up to others (and ourselves) in language. This is what it means to be a rational agent rather than a billiard ball that is simply moved by impinging forces, or an animal that lives entirely in the moment. We have this unique tendency to want to justify ourselves, and construct a narrative that conveys the considerations that made an action seem choiceworthy. And sure, this narrative is often self-serving or self-deceptive. But however inept we may be at it, it remains true that we keep trying to “make ourselves, and our proper aspirations, articulate to ourselves,” as the philosopher Talbot Brewer has written.
If Glenberg is right about memory and environmental suppression, it would seem this activity of narrative self-articulation gets under way, developmentally, with the capacity to ignore things. Further, because this self-articulation is something we are never finished with, an ability to ignore things would seem to remain important to the lifelong task of carving out and maintaining a space for rational agency for oneself, against the flux of environmental stimuli. What happens when our attention is subject to mechanized appropriation, through the pervasive use of hyperpalatable stimuli? On this first view, what is at stake in our cultural moment would seem to be the conditions for the possibility of achieving a coherent self.
But there is another position, or family of positions, that would regard this concern with a certain bemusement, because it is convinced that rational agency is an illusion. This stance is evident in a few different departments of the human sciences. Behavioral economics is impressed with psychological findings that suggest that the reasons for our actions are generally opaque to us, not objects of rational scrutiny. Whatever reason-giving we engage in tends to be a post hoc story that we tell ourselves, and is therefore beside the point if we are trying to understand human behavior. And it is indeed behavior that this discipline takes as its subject matter, not the self-understandings that accompany that behavior and give our actions their distinctly human character.
The field of neuroethics pushes this line of argument further: free will is an illusion. The experience we have of deliberating before some important decision is a mere bit of electrical chatter that our brains generate, the effect of which is to obscure from us the fact that our decision was cast before we were even aware of it. This electrical reason-chatter is said to serve some evolutionary function yet to be discovered. But regrettably, claims the neuroethicist, it also gives rise to metaphysical superstitions about the existence of mind.14
On this view, one shouldn’t get too invested in making distinctions between billiard balls and human beings. And there would seem to be no reason for alarm at the transformation of our attentional landscape, as this amounts to a mere change in the array of sensory inputs impinging on the brain. The cherished “coherence” of the self is a myth we ought to grow out of anyway. We can even imagine an especially consistent neuroethicist surveying the airport scene I have described and viewing it with a certain satisfaction: maybe an environment that is sufficiently stimulating will divert us from indulging in reason-giving, that quaint activity by which man clings to the idea that he is somehow special.
Do we have to choose between this scolding antimental view and the alarm that seems warranted if we take rational agency seriously? The problem with the rationalist position as I have sketched it is that it seems too mental—too deliberate and individual. The rare person who has devoted himself to the examined life may consciously struggle to “make himself, and his proper aspirations, articulate to himself.” But the rest of us, standing in line at the Department of Motor Vehicles? It sounds more like a midlife crisis than like something we do day-to-day.
There is another way to think about these things. What if the coherence of a life is in some significant way a function of culture? What if we are situated among our fellows in norms and practices that shape a life? In that case culture matters. That is, the environment matters, in a stronger way than one supposes if one adopts the interior, fully articulate model of rational agency, on the one hand, or the antimental, brain-centered view, on the other.
THE SITUATED SELF
One element of our predicament is that we engage less than we once did in everyday activities that structure our attention. Rituals do this, for example. They answer for us the question “What is to be done next?” and thereby relieve us of the burden of choice and reflection, as when we recite a liturgy. But I want to focus on another sort of activity, one that is neither rote like ritual, nor simply a matter of personal choice. The activities I have in mind are skilled practices.
Cooking an elaborate meal for an important occasion would be one example. Such practices locate the possible answers to the question “What is to be done next?” outside our own heads, in our relations to objects and to other people. They establish narrow and highly structured patterns of attention—what I shall be calling ecologies of attention—that can give coherence to our mental lives, however briefly. In such an ecology, the perception of a skilled practitioner is “tuned” to the features of the environment that are pertinent to effective action; extraneous information is dampened and irrelevant courses of action disappear. As a result, choice is simplified and momentum builds. Action becomes unimpeded.
In a previous book, Shop Class as Soulcraft, I wrote about the de-skilling of everyday life. The core theme was individual agency: the experience of seeing a direct effect of your actions in the world, and knowing that these actions are genuinely your own. I suggested that genuine agency arises not in the context of mere choices freely made (as in shopping) but rather, somewhat paradoxically, in the context of submission to things that have their own intractable ways, whether the thing be a musical instrument, a garden, or the building of a bridge.
A related set of ideas will be elaborated from a different angle in this book, most explicitly in Part I, “Encountering Things.” There I suggest that it is indeed things that can serve as a kind of authority for us, by way of structuring our attention. The design of things—for example, cars and children’s toys—conditions the kind of involvement we have in our own activity. Design establishes an ecology of attention that can be more or less well adapted to the requirements of skillful, unimpeded action.
The terms “submission” and “authority” are jarring to the modern ear. They may be especially unexpected here—haven’t I been making a case for reclaiming our mental autonomy? But in fact, I think the experience of attending to something isn’t easily made sense of within the prevailing Western anthropology that takes autonomy as the central human good.
Understood literally, autonomy means giving a law to oneself. The opposite of autonomy thus understood is heteronomy: being ruled by something alien to oneself. In a culture predicated on this opposition (autonomy good, heteronomy bad), it is difficult to think clearly about attention—the faculty that joins us to the world—because everything located beyond your head is regarded as a potential source of heteronomy, and therefore a threat to the self.
This sounds like an overstatement, perhaps. But it is implicit in the view of the human person we have received from certain early modern thinkers who were working out a new and quite radical notion of freedom. To do justice to the phenomenon of attention, we will have to wrestle with that notion of freedom. This is the explicit theme of the section “Interlude: A Brief History of Freedom.” For now, I will simply alert the reader to be on the lookout for a somewhat paradoxical thread that runs through these pages. The paradox is that the ideal of autonomy seems to work against the development and flourishing of any rich ecology of attention—the sort in which minds may become powerful and achieve genuine independence.
In the chapters that follow we will consider the ways our environment constitutes the self, rather than compromises it. Attention is at the core of this constitutive or formative process. When we become competent in some particular field of practice, our perception is disciplined by that practice; we become attuned to pertinent features of a situation that would be invisible to a bystander. Through the exercise of a skill, the self that acts in the world takes on a definite shape. It comes to be in a relation of fit to a world it has grasped.
To emphasize this is to put oneself at odds with some pervasive cultural reflexes. Any quick perusal of the self-help section of a bookstore teaches that the central character in our contemporary drama is a being who must choose what he is to be, and bring about his transformation through an effort of the will. It is a heroic project of open-ended, ultimately groundless self-making. If the attentive self is in a relation of fit to a world it has apprehended, the autonomous self is in a relation of creative mastery to a world it has projected.
The latter self-understanding is an invitation to narcissism, to be sure. But it also tends to make us more easily manipulated. As atomized individuals called to create meaning for ourselves, we find ourselves the recipients of all manner of solicitude and guidance. We are offered forms of unfreedom that come slyly wrapped in autonomy talk: NO LIMITS!, as the credit card offer says. YOU’RE IN CHARGE. Autonomy talk speaks the consumerist language of preference satisfaction. Discovering your true preferences requires maximizing the number of choices you face: precisely the condition that makes for maximum dissipation of one’s energies. Autonomy talk is a flattering mode of speech. It suggests that freedom is something we are entitled to, and it consists in liberation from constraints imposed by one’s circumstances.
The image of human excellence I would like to offer as a counterweight to freedom thus understood is that of a powerful, independent mind working at full song. Such independence is won through disciplined attention, in the kind of action that joins us to the world. And—this is important—it is precisely those constraining circumstances that provide the discipline.
This claim—about the role of attention in bringing the self into a relation of fit to the external world—is part of a broader anthropological assertion that runs through the book: we find ourselves situated in a world that is not of our making, and this “situatedness” is fundamental to what a human being is.
I will be emphasizing three elements of this situatedness: our embodiment, our deeply social nature, and the fact that we live in a particular historical moment. These correspond to the three major divisions of the book: “Encountering Things,” “Other People,” and “Inheritance.” In these divisions I will reinterpret what are often taken to be encumbrances to the personal will in the modern tradition—sources of unfreedom—and identify them rather as the framing conditions for any worthwhile human performance.
It would be conventional at this point to say that what emerges in the argument is a concept of true freedom as opposed to false freedom. What I want to do instead is simply drop “freedom” as a term of approbation. The word is strained by being made to do too much cultural work; it has become a linguistic reflex that affirms our image of ourselves as autonomous. In doing so, it obscures the sources of our current predicament of attention—by reenacting the central dogma that gave rise to it.
For several hundred years now, the ideal self of the West has been striving to secure its freedom by rendering the external world fully pliable to its will. For the originators of modern thought, this was to be accomplished by treating objects as projections of the mind; we make contact with them only through our representations of them. Early in the twenty-first century, our daily lives are saturated with representations; we have come to resemble the human person as posited in Enlightenment thought. Such is the power and ubiquity of these representations that we find ourselves living a highly mediated existence. The thing is, in this style of existence we ourselves have been rendered pliable—to whoever has the power to craft the most bewitching representations or to control the portals of public space through which we must pass to conduct the business of life.
Autonomy talk stems from Enlightenment epistemology and moral theory, which did important polemical work in their day against various forms of coercion. Times have changed. The philosophical project of this book is to reclaim the real, as against representations. That is why the central term of approbation in these pages is not “freedom” but “agency.” For it is when we are engaged in a skilled practice that the world shows up for us as having a reality of its own, independent of the self. Reciprocally, the self comes into view as being in a situation that is not of its own making. The Latin root of our English word “attention” is tendere, which means to stretch or make tense; to attend to something is literally to stretch toward it. External objects provide an attachment point for the mind; they pull us out of ourselves. It is in the encounter between the self and the brute alien otherness of the real that beautiful things become possible: the puck-handling finesse of the hockey player, for example.
Encountering the world as real can be a source of pleasure—indeed of quasi-religious feelings of wonder and gratitude—in light of which manufactured realities are revealed as pale counterfeits, and lose some of their grip on us. It is not that in becoming skilled one somehow becomes immune to distraction. I do believe this book has therapeutic implications, but they are not so immediately obvious as that. Rather, the cultural crisis of attention provides an occasion to examine the big anthropological picture we have been operating within since the Enlightenment, and to revisit the question of how we stand in relation to the world beyond our heads. Anything less far-reaching would be inadequate to the challenges we face.