In the struggle between mind and body, perpetually reenacted by fitness-seekers, the mind is almost universally conceived as the “good guy”—the moral overdog that must by all rights prevail. Contemporary fitness culture concedes a certain advisory status to the body: We should “listen” to it, since, after all, the body is capable of doing a great many important things on its own, from healing wounds to incubating fetuses, with no discernible instructions from the conscious mind. So if your hamstrings are squealing with pain, it may be time to recalibrate the leg lifts and squats. All-purpose guru Deepak Chopra advises:

Be open to your body. It’s always speaking. Be willing to listen.
Trust your body. Every cell is on your side, which means you have hundreds of billions of allies.1

It’s up to you, of course, to tune in to the body or ignore it. As a health columnist puts it:

Your body pays attention to you. It thinks you’re important! If you’ve spent a whole lot of time ignoring how you feel, just bulldozing along—your body has probably decided you’re not interested in listening to these lines of communication. It hits the mute button. That’s OK, you can turn your volume back on.2

The superiority of mind over body, or, more majestically, spirit over matter, is inscribed in every post-pagan religion and philosophical system. In the Manichean religion of third-century-CE Mesopotamia—which drew on Christian Gnosticism as well as Buddhism—all of cosmology is a “struggle between a good, spiritual world of light, and an evil, material world of darkness,”3 a theme that came to full, dark flower in the medieval Catholic Church, with its celebration of self-mortification—saints, for example, who dined on little more than the dust they found in their monastic cells. To achieve salvation, the spirit had to be freed from the body and all its vile inclinations, including its tendency toward disease and corruption. Today’s Christianity, Islam, and Judaism, while far more permissive, often require adherence to some dietary rules and physical acts of obeisance, like kneeling and prostration in prayer or wearing restrictive clothing. At the very least, the mind or spirit is expected to keep a tight rein on the body’s slothful, gluttonous, and lustful impulses. A twentieth-century anorectic associated her wasted body with “absolute purity, hyperintellectuality and transcendence of the flesh,” adding that “my soul seemed to grow as my body waned.”4

But can the mind be trusted? Surveying today’s fitness culture, a mid-twentieth-century psychiatrist would no doubt find reasons to suspect a variety of mental disorders—masochism, narcissism, obsessive-compulsive disorder, and homoerotic tendencies (which were viewed as pathological until the 1970s)—any of which could indicate the need for professional intervention. Even the untrained eye can detect the occasional skeletal anorectic in the gym, sweating through hours of cardiovascular training, and may start to question the assumed intellectual superiority of the mind. We have come, hesitantly, to respect the “wisdom of the body,” but can we be sure of the wisdom of the mind?

Just within the last decade, a new reason for alarm has arisen. Not only may the mind be twisted by traditional emotional disorders like depression, but its fundamental cognitive powers appear to be dwindling. Teachers, parents, and psychologists have noted a steep decline in the ability to pay attention, among both children and adults. A 2015 study found that the average adult attention span had shrunk from twelve seconds to eight over the preceding dozen years, which makes it shorter than the attention span of a goldfish.5 Something seems to be going very wrong with the human mind, not in its emotional responses to the world, which have always been a bit unreliable, but in its ability to perceive and understand that world. Among the many diagnoses being bandied about are autism, which now occupies an entire “spectrum” of symptoms, Asperger’s syndrome, attention deficit disorder (ADD), and attention deficit hyperactivity disorder (ADHD)—all of which overlap in symptomatology and can markedly affect academic performance. Any parent whose child was performing less than brilliantly in school would be remiss not to seek medical help.

ADD and ADHD are now the most common pediatric diagnoses after asthma, partly for reasons that have nothing to do with any actual epidemiology. In the first decade of the twenty-first century, drug companies started marketing stimulants like Adderall and Ritalin as treatments for ADD/ADHD, often targeting parents and even children directly. One such ad showed a mother hugging a little boy who had gotten a B+ on a test, captioned “Finally, schoolwork which matches his intelligence.”6 Another showed a kid in a monster costume pulling off the monster’s head to reveal a smiling blond boy. “There’s a great kid in there,” reads the text. “Now there’s a new way to help him out.”7 Whether or not the drugs worked to boost grades, affluent parents were discovering that a diagnosis of ADD/ADHD could warrant giving their child additional time to complete in-class tests—a small but possibly decisive advantage in the competition to get into a good high school or college.

It did not take years of laboratory research to get to the likely source of this new “epidemic.” Parents could see what was happening to their own children, who were being drawn to electronic devices—cell phones, computers, and iPads—as if to opium-infused cupcakes. Children stared at the small screens for hours a day, often switching moment by moment between games, videos, and texts to their friends. They had trouble focusing on homework or anything else in “the real world,” even when their devices were forcibly removed. Neuroscientists confirmed that electronic addiction was “rewiring” the human brain, depleting attention span8 and degrading the quality of sleep.9 In fact, as they withdrew from the physical world into their texts and their tweets, adults could see the same things happening to themselves. The term “distracted parenting” was invented to describe the parent who could barely focus on his or her children anymore, certainly not to the degree required to enforce a few hours a day of abstinence from devices. And what good could a parent do when the schools themselves increasingly used laptops and iPads as instruments of learning? The small screens seemed to have swallowed the world.

The Technological Fix

The perpetrator was easy enough to locate—in Silicon Valley or, more generally, the high-tech industry that created the tempting devices and social networks that consume so much of our time. Silicon Valley was not just the source of the problem; it also seemed to be ground zero of the inattentiveness epidemic. A 2001 article in Wired sounded an early alarm: Diagnoses of autism and Asperger’s syndrome were skyrocketing in Santa Clara County, home of Silicon Valley.10 Among the adult population of the Valley, surely something was wrong with Steve Jobs, who alternated between obsessive attention to details and complete withdrawal into himself, between a spiritual aloofness and uncontrolled temper tantrums. Some observers thought they detected a hint of autism in the unblinking, almost affect-free Bill Gates, and the characters in HBO’s Silicon Valley are portrayed as falling well “within the spectrum.” There is even a “Silicon Valley syndrome,” defined, incoherently, by the crowdsourced Urban Dictionary as “a collection of personality traits and physical characteristics specific to individuals residing around the San Francisco Bay Area. The effects of SVS are often confused for autism or Helen Keller [sic].”11 Put that together with Apple’s slogan “Think different,” and you might conclude that Silicon Valley has a problem not only with grammar, but with thinking itself.

Rising concerns about shrinking attention spans should—if anyone had been paying attention—have created a sense of crisis in Silicon Valley. Suppose a company manufacturing a nutritional supplement advertised as “miraculous” were confronted with claims that its product actually enfeebled its users—that was roughly the situation the tech industry found itself in. Not only did Silicon Valley’s corporate culture encourage a “syndrome” of inattentiveness and self-involvement, but its products seemed to spread the same derangement to everyone else. The devices that were supposed to make us smarter and more connected to other humans were actually messing with our minds, causing “Net brain” and “monkey mind,” as well as physical disorders associated with long hours of sitting. As we click between Twitter and Facebook, text and hypertext, one link and another, synapses are being formed and then broken with febrile inconstancy—or so the neuroscientists warn us—leaving the neuronal scaffolding too fragile to house large thoughts. Hence the emergence of “digital detox camps,” where grown-ups pay to live without electronic devices—as well as alcohol, sex, and gluten—in order to “reconnect” with the real world.12

A less arrogant industry might have settled for warning labels on its phones and pads—“Do not use while driving or attempting to hold a conversation,” for example. But Silicon Valley “has an arrogance problem,” tech columnist Farhad Manjoo announced in the Wall Street Journal in 2013, in response to a tech titan’s plea for greater independence from regulation:

For Silicon Valley’s own sake, the triumphalist tone needs to be kept in check. Everyone knows that Silicon Valley aims to take over the world. But if they want to succeed, the Valley’s inhabitants would be wise to at least pretend to be more humble in their approach.13

But humility was not in Silicon Valley’s repertoire. Had they not, in just a couple of decades, transformed—or to use their current favorite verb, “disrupted”—the worlds of entertainment, communications, business, shopping, dating, and just about everything else? In the process, at least fourteen billionaires had emerged in the Valley itself, a count that surely understates the number of tech billionaires nationwide. Wall Street and Hollywood could generate centi-millionaires; only in Silicon Valley could a young man (and it is almost always a man) without a college degree rather suddenly acquire a ten-figure fortune. Silicon Valley, whether in the Bay Area, Austin, Cambridge, or New York’s Silicon Alley, is a setting that breeds megalomania or, as tech critic Evgeny Morozov terms it, “‘solutionism’: an intellectual pathology that recognizes problems as problems based on just one criterion: whether they are ‘solvable’ with a nice and clean technological solution at our disposal.”14

Anything is possible, any problem solvable, with a simple “hack.” Space travel? PayPal cofounder Elon Musk now heads up SpaceX, the first private space travel company. Health? Silicon Valley generates the personal monitoring devices that can continually reveal your inner workings far better than a doctor’s office could. Who needs a doctor anyway? Picking up on the evidence-based critiques of medical practice, Vinod Khosla, “one of Silicon Valley’s most revered venture capitalists,” publicly announced that “healthcare is like witchcraft and just based on tradition” rather than being driven by data.15

Far better to pick up a little biochemistry and proceed to “biohack” your own body. Dave Asprey was, by his own description, “a young, brand-new multimillionaire entrepreneur” when he confronted his own obesity and attempted, unsuccessfully, to cure it by dieting and doing a ninety-minute workout per day. Then he realized that

our bodies and the Internet are not all that different. They are both complex systems with big pieces of data that are missing, misunderstood or hidden. When I looked at my body that way, I realized that I could learn to hack my biology using the same techniques I used to hack computer systems and the Internet.16

Asprey’s lifesaving hack turned out to be “Bulletproof Coffee”—expensive mold-free coffee containing a generous portion of melted butter—which he now markets online and through his cafés. Exercise, he decided, was simply too time-consuming.

For obsessive biohacking no one tops Ray Kurzweil, the futurist, inventor, and bestselling author of a book on the coming “singularity,” when artificial intelligence will become self-improving and overtake the human mind. Like Asprey, Kurzweil sees the body as a machine—in fact a computer—that can be continually upgraded. “I have a personal program to combat each of the degenerative disease and aging processes,” he writes. “My view is that I am reprogramming my biochemistry in the same way I reprogram the computers in my life.”17 The only exercise he undertakes is walking, and his nutritional routine would seem not to leave time for workouts in a gym. Every day he takes “about 250” pills containing nutritional supplements, on top of which he spends a day a week at a clinic where supplements are delivered right into his bloodstream. “Every few months,” he relates, “I test dozens of levels of nutrients (such as vitamins, minerals, and fats), hormones and metabolic by-products in my blood.”18

The goal here is not something as mundane as health. Silicon Valley’s towering hubris demands nothing less than immortality. Kurzweil has transformed himself into a walking chemistry lab in order to prolong his life just long enough for the next set of biomedical breakthroughs to come along, say in 2040, after which we’ll be able to load our bodies with millions of nanobots programmed to fight disease. One way or another, other tech titans aim to achieve the same thing. As Newsweek reports:

Peter Thiel, the billionaire co-founder of PayPal, plans to live to be 120. Compared with some other tech billionaires, he doesn’t seem particularly ambitious. Dmitry Itskov, the “godfather” of the Russian Internet, says his goal is to live to 10,000; Larry Ellison, co-founder of Oracle, finds the notion of accepting mortality “incomprehensible,” and Sergey Brin, co-founder of Google, hopes to someday “cure death.”19

There is, to say the least, a profound sense of entitlement here. Oracle’s Larry Ellison is reportedly “used to getting his way, and he doesn’t see why that should ever stop. ‘Death makes me very angry,’ he has said, explaining why he has spent hundreds of millions to fund antiaging research.”20 If you are one of the richest men in the world, and presumably, since this is Silicon Valley, one of the smartest, why should you ever die?

Controlling Your Mind

With immortality on the agenda, surely the little matter of mass inattentiveness had a solution, and I mean a “solution” in the “solutionist” sense—something convenient, marketable, and preferably available on existing devices. But the solution, when it made its way to Silicon Valley, came from a realm apparently unrelated to digital technology: religion—in this case, Buddhism. Jon Kabat-Zinn, a Zen-trained psychologist in Cambridge, Massachusetts, had already extracted what he took to be the secularized core of Buddhism and termed it “mindfulness,” which he extolled in two bestsellers in the 1990s. I first heard the word in 1998 from a wealthy landlady in Berkeley, who advised me to be “mindful” of the suffocating Martha Stewart–ish décor of the apartment I was renting from her, which of course I was doing everything possible to unsee. The probable connection to Buddhism emerged when I had to turn to a tenants’ rights group to collect my security deposit. People like me—renters?—she responded in an angry letter, were oppressing Tibetans and disrespecting the Dalai Lama.

During the same stint in the Bay Area, I learned that rich locals liked to unwind at Buddhist monasteries in the hills, where, for a few thousand dollars, they could spend a weekend doing manual labor for the monks. Buddhism, or some adaptation thereof, was becoming a class signifier, among Caucasians anyway, and nowhere was it more ostentatious than in Silicon Valley, where star player Steve Jobs had been a Buddhist or perhaps a Hindu—he seems not to have made a distinction—even before it was fashionable for CEOs to claim a spiritual life. Guided by an in-house Buddhist, Google began offering its “Search Inside Yourself” trainings, promoting attention and self-knowledge, in 2007.

Mindfulness went public as a kind of “movement” only in the second decade of the twenty-first century, though, when Soren Gordhamer, a former teacher of meditation to at-risk youth and at one point an aide to Hollywood’s chief Buddhist, Richard Gere, found himself broke, divorced, and in the grip of a terrible Twitter addiction. Something had to be done to counter the addiction to devices, and it had to be something that in no way threatened the billionaires who had lured us into it. As Mindful magazine later pointed out:

The lords and leaders of high tech aren’t about to dismiss new technology as the beginning of the end of humankind—not only because they don’t want to work against their own economic interests, but because they believe in the innovative, interactive world fostered by new technologies.…Yet they also know that technology can be distracting, not only from where we are in any given moment but from where we ought to be going.21

In a stroke of genius, Gordhamer found a way to raise the issue while actually flattering the tech titans. He claims to have discovered that, while the rest of us struggle with intractable distraction, leaders from Google, LinkedIn, Twitter, and other major tech companies seem to be “tapped into an inner dimension that guides their work.”22 He called it “wisdom” and started a series of annual conferences called Wisdom 2.0, based originally in San Francisco, in which corporate leaders, accompanied by celebrity gurus, could share the source of their remarkable serenity, which was soon known as mindfulness.

At the same time, in London, a former Buddhist monk with a degree in Circus Arts, Andy Puddicombe, was trying to figure out how to spread Buddhist meditation techniques within the generally religion-averse business class. He and a partner created a company called Headspace, which at first staged events where large groups of people paid to participate in guided meditation sessions. When the customers demanded more convenient ways of packaging the experience, Headspace started marketing CDs, podcasts, and eventually an app available on both Apple and Android phones. Politically and monetarily, this was another stroke of genius. It catapulted Puddicombe from near destitution to a net worth of £25 million,23 while efforts like Wisdom 2.0 were simultaneously transforming the tech titans from the villains of the inattentiveness epidemic into its putative saviors. There was an “irony,” Fast Company noted, “in using technology to deliver mindfulness coaching to a population that’s more and more tech-frazzled.”24 Bestselling psychologist Daniel Goleman observed, more bluntly, “What a clever way to make money: Create a problem you can then solve.”25

Mass-market mindfulness began to roll out of the Bay Area like a brand-new app. Very much like an app, in fact, or a whole swarm of apps. There are over five hundred mindfulness apps available, bearing names like “Simply Being” and “Buddhify.” Previous self-improvement trends were transmitted via books, inspirational speakers, and CDs; mindfulness can be carried around on a smartphone. Most of these apps feature timed stretches of meditation, some as brief as one minute, accompanied by soothing voices, soporific music, and cloying images of forests and waterfalls.

This is Buddhism sliced up, commodified, and drained of all reference to the transcendent. In case the connection to the tech industry is unclear, a Silicon Valley venture capitalist blurbed a seminal mindfulness manual by calling it “the instruction manual that should come with our iPhones and Blackberries.”26 You might think that the actual Buddha had devoted his time sitting under the Bodhi tree to product testing; the word “enlightenment” never arises in the mindfulness lexicon.

Today mindfulness, in its sleek and secular form, has spread far beyond Silicon Valley and its signature industry, becoming just another numbingly ubiquitous feature of the verbal landscape, as “positive thinking” once was. While an earlier, more arduous version of Buddhism attracted few celebrities other than Richard Gere, mindfulness boasts a host of prominent practitioners—Arianna Huffington, Gwyneth Paltrow, and Anderson Cooper among them. It debuted at Davos in 2013 to an overflow crowd, and Wisdom 2.0 conferences have taken place in New York and Dublin as well as San Francisco, with attendees often fanning out to become missionaries for the new mind-set—starting their own coaching businesses or designing their own apps. A recent Wisdom 2.0 event in San Francisco advertised speeches by corporate representatives of Starbucks and Eileen Fisher as well as familiar faces from Google and Facebook. Aetna health insurance offers its thirty-four thousand employees a twelve-week program and dreams of expanding to include all its customers, who will presumably be made healthier by clearing their minds. Even General Mills, which dates back to the nineteenth century, has added meditation rooms to its buildings, finding that a seven-week course produces striking results:

[Eighty-three] per cent of participants said they were “taking time each day to optimise my personal productivity”—up from 23 per cent before the course. Eighty-two per cent said they now make time to eliminate tasks with limited productivity value—up from 32 per cent before the course.27

It was Silicon Valley, though, that legitimized mindfulness for the rest of the business world. If mindfulness had first taken root in General Mills, it would never have gained the status it’s acquired from Google and Facebook; baking products just don’t have the cachet of digital devices. Silicon Valley is, after all, the “innovation center of the universe,” according to its boosters, home of the “best and the brightest,” along with the new “masters of the universe” who replaced the old ones after the financial crash that temporarily humbled Wall Street. Mindfulness may have roots in an ancient religion, but the Valley’s imprimatur established that it was rational, scientific, and forward-looking.

To the tech industry, the great advantage of mindfulness was that it seemed to be based firmly on science; no “hippie bullshit” or other “woo-woo” was involved. Positive thinking had never gained much traction in Silicon Valley, possibly because the tech titans needed no help in believing that they could do (or hack or disrupt) anything they set out to do. The other problem with positive thinking was that, despite the efforts of PhD-level “positive psychologists,” it had no clear scientific backing and in fact bore a strong resemblance to “magical thinking”—“If I think it, it must be so.” But advocates of mindfulness could always point to a 2004 study by a neuroscientist showing that Buddhist monks with about ten thousand hours of meditation under their belts had altered patterns of brain activity.28 Shorter bouts of meditation seemed to work at least temporary changes in novices. The field of “contemplative neuroscience” was born, and Silicon Valley seized on it for a much-needed “neural hack.” Through meditation, monastic or app-guided, anyone could reach directly into their own moist brain tissue and “resculpt” it in a calmer, more attentive direction. Mindfulness, its promoters assert, fosters—or even “induces”—“neuroplasticity.”

“Neuroplasticity” is an impressively scientific-sounding term, but it names an innate property of neuronal tissue, one that persists whether we make a conscious effort to rewire our brains or not. Everything we experience subjectively, every thought and emotion, produces at least transient physiological changes in the brain. Trauma and addiction can lead to longer-lasting changes; even fleeting events may leave the chemical traces in the brain that we experience as memory. In fact, “plasticity” is a pallid descriptor for the constant, ongoing transformation of brain tissue: Neurons reach out to each other through tiny membranous protrusions called “spines,” which can form or disappear within minutes or seconds. Spines appear to be involved in the formation of new synapses linking neurons, which in turn hold together the ever-changing structure of neural firing patterns. Synapses that are used frequently grow stronger, while inactive ones wither; well-connected neurons thrive while neglected ones die. There is even some evidence that mature animals can generate new neurons.
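The rule invoked here, that busy synapses strengthen while idle ones wither, is essentially what neuroscientists call Hebbian learning, and a few lines of code can make it concrete. The sketch below is my own toy illustration, not anything drawn from the studies cited in this chapter; the firing probabilities, learning rate, and decay rate are all invented for the example.

import random

def hebbian_step(weight, pre_fires, post_fires, rate=0.1, decay=0.02):
    """One update of a single synaptic weight under a Hebbian rule:
    strengthen when the two neurons fire together, decay otherwise."""
    if pre_fires and post_fires:
        return weight + rate * (1.0 - weight)  # potentiation, capped near 1.0
    return weight * (1.0 - decay)              # unused synapses wither

weight = 0.5  # arbitrary starting strength
for _ in range(1000):
    pre = random.random() < 0.6   # presynaptic neuron fires 60% of the time
    post = random.random() < 0.6  # postsynaptic neuron fires 60% of the time
    weight = hebbian_step(weight, pre, post)
print(f"synaptic weight after 1000 steps: {weight:.2f}")

Let the two neurons fire together often and the weight climbs toward its ceiling; silence them and it decays back toward zero. That, in miniature, is the “plasticity” that goes on whether anyone meditates or not.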

What there is no evidence for, however, is any particularly salubrious effect of meditation, especially in byte-sized doses. This was established through a mammoth federally sponsored “meta-analysis” of existing studies, published in 2014, which found that meditation programs can help treat stress-related symptoms, but that they are no more effective in doing so than other interventions, such as muscle relaxation, medication, or psychotherapy.29 There is no excuse for ignoring this study, which achieved worldwide attention. So maybe meditation does have a calming, “centering” effect, but so does an hour of concentration on a math problem or a glass of wine with friends. I personally recommend a few hours a day with small children or babies, who can easily charm anyone into entering their alternative universe. As for Silicon Valley’s unique contribution, mindfulness apps, a recent study concluded that there is

an almost complete lack of evidence supporting the usefulness of those applications. We found no randomized clinical trials evaluating the impact of these applications on mindfulness training or health indicators, and the potential for mobile mindfulness applications remains largely unexplored.30

For an industry based on empirical science and employing large numbers of engineers, Silicon Valley has been remarkably incurious about the scientific basis of mindfulness—probably because the “neuroplasticity” concept is just too alluring. The line of reasoning—or, I should say, analogizing—goes like this: If the brain can be resculpted through conscious effort, then mindfulness is as imperative as physical exercise; the brain is a “muscle,” and, like any muscle, in need of training. The metaphor of mind-as-muscle is almost ubiquitous in the mindfulness industry. For example, one popular and highly rated mindfulness app, called “Get Some Headspace,” advertises itself as a “gym membership for the mind.” Google’s chief motivator, Chade-Meng Tan, whose official corporate title was “Jolly Good Fellow” and who installed the company’s “Search Inside Yourself” training program, later told the Guardian:

If you are a company leader who says employees should be encouraged to exercise, nobody looks at you funny.…The same thing is happening to meditation and mindfulness, because now that it’s become scientific, it has been demystified. It’s going to be seen as fitness for the mind.31

So it’s not “science” that legitimates mindfulness practice. The only thing that science contributed was the notion of neuroplasticity, which morphed into the mind-as-muscle metaphor, which in turn suggested the metaphor of mindfulness as a form of fitness training. The mind can be controlled much as the body can—through disciplined exercise, possibly conducted in a special space, like a corporate meditation room, which, Tan suggests, should be seen as no more outré than the company gym.

Of course, there is a slight metaphysical mystery here: Who is in charge? In the physical fitness case, the duality lies only between the body, thought of as inert matter, and the mind, imagined as an immaterial essence—the site of “I” or “us.” But if the mind has also been reduced to a substance, though fortunately a malleable one that can be molded and controlled, then where is the “I”? This is one of the paradoxes of the endeavor to use the mind, conceived as a conscious agent, to control itself. Ruby Wax, a high-profile British mindfulness teacher and promoter, seems to hint at the problem when she says:

The difficult thing is, your brain can’t tell there’s something wrong with your brain. If you have a rash on your leg, you can look down and see it. But you don’t have a spare brain to make an assessment of your own brain. You’re always the last to know—that’s the bitch.32

But whichever prevails in the mind-body duality, the hope, the goal—the cherished assumption—is that by working together, the mind and the body can act as a perfectly self-regulating machine. Certainly the body had seemed willing to cooperate ever since the 1932 publication of physiologist Walter B. Cannon’s book The Wisdom of the Body, which laid out the delicate mechanisms of homeostasis, through which the body attempts to hold blood sugar, acid/base balance, and body temperature at constant “normal” levels. Now add in the brain, with its ability to send the individual mind ranging out into the collective mind represented by books, experts, and the Internet—bringing back important new information: Eat more vegetables (or turmeric, or whatever is fashionable at the moment); exercise daily; take time to unwind. Combine mind plus body with freshly updated data, some of it perhaps collected on your self-monitoring devices, and act quickly to generate fresh instructions to forestall any looming problems. This, I imagine, is how Silicon Valley “immortalists” spend their time—scanning all the health-related information and instantly applying it—which may seem a small price to pay for eternal life.
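Cannon’s homeostasis is, in engineering terms, a negative feedback loop: measure a variable, compare it to a “normal” setpoint, and apply a correction proportional to the error. Here is a minimal sketch, mine rather than Cannon’s, with an invented gain and a body-temperature setpoint:

def regulate(value, setpoint=37.0, gain=0.3):
    """Nudge a regulated variable back toward its 'normal' level;
    the correction is proportional to the deviation from the setpoint."""
    return value + gain * (setpoint - value)

temperature = 39.5  # start out of balance, as after a fever or hard exertion
for step in range(10):
    temperature = regulate(temperature)
    print(f"step {step}: {temperature:.2f} C")
# The readings converge on 37.0: the body's thermostat in miniature.

Each pass shrinks the deviation by the same fraction, so the variable settles exponentially back onto its “normal” level. The immortalist program simply bolts a second loop on top: gather data, adjust the regimen, repeat.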