“We have met the enemy and he is us.”
Walt Kelly
I was driving at the speed limit on Interstate 40 in Nashville, Tennessee, when a car raced up behind me and began riding my tail. After a few uncomfortable moments, I changed lanes and the car whizzed by. Glancing over at the driver—certain it would be some testosterone-drunk teenager—I beheld a well-dressed, white-haired lady.
Someone’s grandmother!
Our behavior on the highway is a lot like our behavior on the web. There is something about the two settings that brings out our dark sides.
There is a word for people behaving badly on the web: trolls. Like crazy drivers, trolls come in all shapes, sizes, and colors. “These are mostly normal people,” explains Whitney Phillips, author of This Is Why We Can’t Have Nice Things: Mapping the Relationship Between Online Trolling and Mainstream Culture. “You want to say this is the bad guys, but it’s a problem of us.”183
Jessica Moreno, a former Reddit executive, agrees: trolls are us. “The idea of the basement dweller drinking Mountain Dew and eating Doritos isn’t accurate,” she says. In her experience of tracking down people who post offensive comments online, more often than not, she discovered, “They would be a doctor, a lawyer, an inspirational speaker, a kindergarten teacher.”184
So, what explains our bad behavior on the web? There are, I believe, three major reasons.
First, there is our basic, imperfect nature, a subject of endless fascination and mystery for both science and religion. Primatologist Dian Fossey summarizes it this way: “The more you learn about the dignity of the gorilla, the more you want to avoid people.”185
Second, there is the effect of anonymity. In real life, we don’t usually blow up at people who annoy us, for fear of creating a commotion, getting slugged, being fired, or worse. “But in a car—and on the Internet—all bets are off,” says sociologist Anna Akbari. “We have a vehicle for fleeing the scene, for logging off from that session. We can act without social consequence, which brings out the worst in us.”186
Third, the world wide web is so ultrademocratic it borders on being lawless. Like a twenty-first-century Dodge City, it is populated by characters every bit as colorful (and notorious) as Cherokee Bill, Prairie Dog Dave, Fat Jack, and Cockeyed Frank. Rowdies who, as one real-life Dodge City resident recalled, “feared neither God, man, nor the devil, and [were] so reckless they would pit themselves, like Ajax, against lightning, if they ran into it.”187
On the wild wild web, as in the Dodge City of yore, we see the full spectrum of human behavior on display—the good, the bad, the ugly. No one knows it better than today’s teens, for whom the web is like the proverbial water cooler. It’s where they hang out. “Social networking sites have created new spaces for teens to interact, and they witness a mixture of altruism and cruelty,” observes Amanda Lenhart of Pew Research Center’s Internet & American Life Project.188
The combination of our having flawed natures, hiding behind computer screens (at least, we think we’re hiding; more about that later), and not being policed explains most of the bad and truly ugly behavior we see online.
In the book of Ecclesiastes, King Solomon despairs about the human experience, crying out, “Vanity of vanities! All is vanity.” Whatever you think about the Bible, the wise monarch was spot on. Working in the TV and movie industries, I’ve had a front row seat on the timeless spectacle of human self-importance. It’s not pretty.
According to psychologists Jean Twenge and W. Keith Campbell: “Narcissism—a very positive and inflated view of the self—is on the rise.” In their book, The Narcissism Epidemic, they cite a study of 37,000 college students, which shows “narcissistic personality traits rose just as fast as obesity from the 1980s to the present.”189
The web is probably not entirely to blame for the trend. But as we’ve seen, its incomparable star-making power surely does encourage, amplify, and reward self-glorification.
In 2017 LendEDU, an online student loan business, published a revealing analysis of millennials and social media. “They use these platforms,” the study observed, “to boast of their daily tidings, carefully craft their public image, and feed their egos in this interconnected digital age.”190
On the web, egotism gets to play to a global captive audience, a temptation hard to resist. “With just a few filters, a little saturation, and a clever caption,” the LendEDU study points out, “social media can make even the most average joe look like an esteemed socialite.”191
The urge to polish our online images is so irresistible, and doing it is so easy, that the web’s social media experience has become as scripted and phony as a reality TV show. According to LendEDU, only 6 percent of college students’ online accounts are “completely true” depictions of themselves (see WEB: DISRUPTION AND DECEPTION). “The 15 percent that said their social media was ‘not true of me at all,’” LendEDU reports, “know that they are totally fabricating their lives and have not only accepted it but are seemingly fine with it.”192
You might say the web baits us into becoming phony politicians chasing after votes—or likes, to use the proper vernacular. As the LendEDU study explains, “If you post enough artsy, chic pictures of yourself that rack up plenty of ‘likes,’ then real-life accomplishments will not matter because the popularity of your social media accounts will determine your status on the social hierarchy.”
Curiously, LendEDU adds, our online self-centeredness even drives us to do something seemingly illogical—vote for others. “It does not matter if Instagram users genuinely enjoy other Instagrammers’ posts; the only thing that matters is that each insincere expression of emotion from you will lead to your own Instagram page gaining more status.”
Nothing says online narcissism more than the selfie—a photo taken of and by ourselves and usually posted forthwith on the web for all to see. All too often our self-absorption lures us into photogenic but extremely dangerous poses. The result: death by selfie.
The exact numbers are hard to come by—published estimates vary wildly—but everyone agrees the number of selfie deaths is increasing. According to Emerging Technology, the first eight months of 2016 alone saw seventy-three fatalities, an all-time high.193
On May 3, 2018, KRIV-TV aired a story about the death of sixteen-year-old Kailee Mills of Spring, Texas. While riding in a car with friends, she removed her seat belt to take a selfie. Her dad explains what happened next: “The car went off the road. She was ejected, and she died instantly. All the other kids in the car, they had their seat belts on and they all survived with very little injury.”194
Inexplicably, many of the fatalities happen in India.195 For example, according to an article in The Pioneer, the venerable newspaper based in New Delhi, on October 3, 2007, “three young men died on the railway tracks outside Bengaluru while ostensibly attempting to take selfies with an onrushing locomotive as the background.” It also noted, “Earlier this year, in southern Bengal, five young men died trying to save one who was taking a selfie hanging off a railway door.”196
Jesse Fox, an Ohio State University communications technology professor, summarizes the deadly and growing selfie phenomenon this way: “It’s all about me. It’s putting me in the frame. I’m getting attention and when I post that to social media, I’m getting the confirmation that I need from other people that I’m awesome.” She adds, “so who cares if you’re dangling off the side of the Eiffel Tower?”197
Years ago, when I was at Good Morning America, I became good friends with a duo named The God Squad—Monsignor Tom Hartman and Rabbi Marc Gellman. One day on the show, Rabbi Gellman rightly complained that TV was helping vacuous celebrityhood upstage true heroism.
Since then, the web has made things even worse by completely gutting celebrityhood of whatever tiny bit of substance it might have had. It has done so by turning all of us into celebrities—at least, in our own narcissistic minds it has.
We promote ourselves endlessly by posting photos of nothing particularly remarkable: a colorful sunset, a great meal, a roadside attraction, and on and on. Fully “63% of social media,” says Neil Patel, a well-known web marketing guru, “is made up of images.”198
I call it the triumph of the trivial meets the triumph of the visual.
When we browse through the web’s enormous library of images and text, research shows we invariably gravitate to the former—like kids choosing candy over broccoli every time. One survey, for instance, found “four times as many consumers would rather watch a video about a product than read about it.”199
That’s because the brain is wired to handle visual data phenomenally well. How well? The brain can process a full-sized image in just thirteen milliseconds200—ten times faster than the blink of an eye.201
Today’s web-driven visual age reminds me of my visit years ago to Lascaux Cave in southwestern France. The unique cavern is closed to the public, but as ABC News’s Science Editor, I was allowed inside for twenty minutes (without any cameras) and will never forget the experience. Painted on the cave’s earthen walls are crude renderings of humans and animals dating back 20,000 years, long before we had a written language. Such pictographs were a main way paleolithic humans communicated.
I’m not suggesting the web’s age of photo-centricity is hurling us all back to the paleolithic era, but clearly it does appeal to the lizard-brained, preliterate caveman and cavewoman in all of us. In so doing, I fear, it discourages deep thought and reduces our complex world to a series of online slideshows.
Pope Francis addressed this concern during the 2017 World Youth Day. “In the social media, we see faces of young people appearing in any number of pictures recounting more or less real events, but we don’t know how much of all this is . . . an experience that can be communicated and endowed with purpose and meaning.” He admonished young people to spend less time posting photos of their superficial, day-to-day experiences and more time pondering the profound, eternal purpose and meaning of their lives.202
It’s only going to get worse, warns Cisco, the world’s largest computer-networking company. “Globally, IP video traffic will be 82 percent of all consumer Internet traffic by 2020, up from 70 percent in 2015.” In other words, on the web, pictures—videos, in particular—are leaving words in the dust.203
At a conference I recently attended, Bishop T. D. Jakes recounted his experience writing an online guest column for The Washington Post. Reading it, he was aghast at the hundreds of poisonous comments posted by readers. Many of the rants were not even on point, venturing off on tangents having absolutely nothing to do with the column’s subject.
I, too, write guest columns for major online publications—among them, Fox News, U.S. News & World Report, and The Christian Post—and have long since stopped reading the comments section. It’s not that I’m thin-skinned; I simply don’t enjoy seeing people—ordinary, presumably decent people—behaving so badly.
Sadly, vituperation has never had a more accommodating, more powerful ally than the web. It’s so bad we’ve even coined a special term for it: cyberbullying.
George Carlin, the late comedian who rose to fame in the 1970s, would be especially heartsick over this development. One of his most popular routines went like this:
“Another plan I have is World Peace Through Formal Introductions. The idea is that everyone in the world would be required to meet everyone else in the world, formally, at least once. . . . My theory is, if you knew everyone in the world personally, you’d be less inclined to fight them in a war . . .”204
Carlin, I believe, was on to something. As a correspondent, I’ve gotten to know people in dozens of countries. To this day, whenever I read a disturbing news story about one of the nations, I empathize deeply with my friends who live there.
What Carlin had in mind is not exactly what passes for relationships on social media today. On Facebook, Instagram, or Snapchat, we do not, to quote Carlin, “look the person in the eye, shake hands, repeat their name, and try to remember one outstanding physical characteristic.”205
Nevertheless, in this age of global connectedness—of online friends and likes—surely the web represents a step in the right direction. Surely we’ve increased global comity by enabling far-flung strangers to greet one another as never before, no?
Facebook CEO Mark Zuckerberg, for one, thinks so; he’s been saying it from the very beginning. “People sharing more—even if just with their close friends or families—creates a more open culture and leads to a better understanding of the lives and perspectives of others,” he wrote in 2012, when taking Facebook public. “We believe that this creates a greater number of stronger relationships between people and that it helps people get exposed to a greater number of diverse perspectives.”206
Notwithstanding Zuckerberg’s idealism (or crafty marketing spiel), social media’s actual effect on us looks rather disastrous. In bringing us together, the web is not making the world a smaller, more understanding, more loving place. It appears to be driving us apart—like a football stadium, where rivalries are amplified, not dampened.
One such deafening amplification swamped the 2016 US Presidential campaign. Hateful posts, messages, and tweets from all sides—Republicans, Democrats, Greens, Libertarians—amped up our differences so much that today, many months after the election, we are still at each other’s throats. As a Baby Boomer, I’ve never witnessed anything like it.
Perhaps we will find salvation in the soaring popularity of messaging via apps such as WhatsApp, WeChat, Telegram, and the granddaddy of them all, Facebook’s Messenger, which claims 1.2 billion active users.207 After all, messaging, being one-on-one communication, does come closer to approximating Carlin’s vision.
Telegram’s founder, Pavel Durov, for one, is lobbying for messaging to completely replace social media. “It’s pointless and time-consuming to maintain increasingly obsolete friend lists on public networks. Reading other people’s news is brain clutter. To clear out room for the new, one shouldn’t fear getting rid of old baggage.”208
I tend to agree with Durov’s position. But I would not count on messaging becoming our ticket to world peace—for one simple reason.
Think about it. We live in relative peace and harmony with neighbors whose views on hot-button topics—say, abortion, homosexuality, and religion—are unknown to us. Yet, the minute we learn about them, our benign relationship is immediately placed in danger of turning into an ugly war of words.
A great deal of published research affirms this common-sense observation. In the book Privacy Online: Perspectives on Privacy and Self-Disclosure in the Social Web, coauthor and media psychologist Sabine Trepte explains: “With the advent of social media . . . it is inevitable that we will end up knowing more about people and also more likely that we end up disliking them because of it.”209
As the old saying goes, familiarity breeds contempt.
Let me tell you about an incident that happened on February 28, 2017. Though not villainous, it illustrates precisely why the web is so vulnerable to malefactors.
I became aware of it early in the day, while sitting in my quiet office in Nashville, Tennessee. There was no big boom or bright flash of light; I simply noticed my favorite online news sites suddenly struggling to work properly. I kept getting error messages.
Within minutes the web was consumed by a mushroom cloud of apocalyptic-sounding complaints from around the world. Tens of thousands of websites were malfunctioning—from Apple and Airbnb to News Corp and the Securities and Exchange Commission. Also misbehaving were countless everyday devices and services that rely on the so-called Internet of Things (IoT). Everything from web-controlled light bulbs, thermostats, and home security systems to satellite radio networks, smartphone apps, and even TV remotes and computer mice.210
People were nonplussed, wondering: What in the heck is going on?!
Soon, people started directing blame at the world’s largest cloud computing network, Amazon Web Services (AWS)—in particular its Simple Storage Service, or S3. Yes, Amazon doesn’t just sell books and stuff; it sells time and space on sixteen monster-sized computer facilities scattered worldwide.211 Websites of every category use AWS: People & Society (11.55%); Arts & Entertainment (11.48%); Business & Industry (11.07%); Shopping (5.01%); and Others (60.89%).212
The massive outage, affecting a reported 150,000 websites, cleared up after about four hours. Two days later, Amazon came clean with an explanation:
“The Amazon Simple Storage Service (S3) team was debugging an issue causing the S3 billing system to progress more slowly than expected. At 9:37AM PST, an authorized S3 team member using an established playbook executed a command which was intended to remove a small number of servers for one of the S3 subsystems that is used by the S3 billing process. Unfortunately, one of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended.”213
Two cynical tweets pretty well sum up how everyone felt about the news. @chrisalbon remarked, “The S3 crash was caused by a single person typing in a single part of a single line of code wrong.” @RedEarRyan said, “Oh. So a typo took down half the internet this week.”214
The takeaway was unmistakable: the world wide web is super-sensitive to disruption, whether accidental or villainous. There are two reasons for it.
One, the web is like an electric grid that can be crippled by a single lightning bolt—except a billion times worse, because the web is not regional, it’s global.
Two, as the Amazon crash proved, just one person with a computer can disrupt our lives significantly.
It is why governments are now taking cyberattacks seriously, the way they did nuclear attacks right after World War II. Indeed, just days after his election, Donald Trump promised “to develop a comprehensive plan to protect America’s vital infrastructure from cyberattacks.”215
Hackers—cyberbullies of a particularly dangerous sort—date back to the early 1970s. Even before the web was invented, they concocted viruses with names such as Creeper, Brain, Elk Cloner, and Jerusalem to infect computers.216
But the web has exponentially worsened the threat. In 2000, when the web was still young, the I Love You virus—AKA the Love Bug—instantly went viral, deleting files and stealing user names and passwords from more than a million computers.217
Today’s hackers—individuals and nations—have a dizzying number of ways to attack us via the web. In addition to viruses, there are worms, Trojan Horses, shadyware, PUPs, adware, keyloggers, RAM scrapers, botnets, backdoors, rootkits, browser hijackers, ransomware—the list goes on and on.218
In April 2017, a Korean undergrad released into the web a novel kind of malware (malicious software) that takes computer files hostage. To get their files back, victims were required to score more than 200 million points in an impossibly difficult anime video game. He eventually apologized but not before the malware infected his own computer.219
The proliferation of villainy on the web has, predictably, spawned an entire counter-industry offering online security software and services. Worldwide, the cybersecurity market is expected to be worth $170 billion by 2020.220
Still, nothing ever works perfectly—as Webroot, a prominent, Colorado-based cybersecurity firm, was recently reminded. On April 24, 2017, the company’s popular antivirus software system abruptly shut down countless computers around the world. Why? Because it mistook ordinary, essential parts of Microsoft Windows for criminal activity.
Webroot’s own computers became overloaded by the barrage of complaints from users, but soon enough the problem was traced to a faulty rule in the antivirus system’s playbook. “The rule was removed,” the company announced, “and we are in the process of rolling back all of the false positives that reside in the Webroot Threat Intelligence platform.”221
For years, the strident liberal talk-show host Phil Donahue lobbied unsuccessfully to air a prison execution on live TV. “I am on record as being against capital punishment,” he said on the Larry King Show in 1999. “I am making this argument not as an ideologue, but as a person who believes that this free-press establishment has a responsibility to show this issue, which has split families.”222
He cited as precedent a 60 Minutes segment about euthanasia that aired a year earlier. In it, Dr. Jack Kevorkian injects fifty-two-year-old Thomas Youk—beset by Lou Gehrig’s disease—with potassium chloride, the same chemical used on death-row inmates. “All movement stopped,” one newspaper reporter observed, “and Youk sat dead in the chair, his mouth hanging open.”223
“It’s not necessarily murder,” Kevorkian says in the segment. “It could be manslaughter, not murder. But it doesn’t bother me what you call it. I know what it is. This could never be a crime in any society which deems itself enlightened.”
Enlightened.
Today, the web is going where no TV camera has ever gone before, pressing upon us the singular questions raised by Kevorkian’s and Donahue’s rhetoric. Is it a sign of enlightenment to publicize films of people being killed, for any reason? Or is it evidence of our well-known weaknesses for high ratings, propagandizing, voyeurism—and worst of all, outright depravity?
I’m inclined to believe the latter, for reasons I will now explain.
The web’s complicity with our darkest, most gruesome, most degenerate behavior arguably began with a group of Pakistani terrorists in 2002. After kidnapping Daniel Pearl, a reporter for The Wall Street Journal, they beheaded him on camera and posted the video on the web.
Since then, terrorists—most notably members of the Islamic State of Iraq and Syria (ISIS)—have beheaded, torched, and drowned multitudes of innocent people, always promptly posting videos of the heinous executions on the web. Snuff films are nothing new, but the web has enabled them to go mainstream.
Things got even grislier after April 16, 2016, when Facebook launched a new app called Live—designed “to make it easier to create, share and discover live videos.” In typical fashion, Zuckerberg painted a thoroughly rosy picture of the new technology, the quintessence of scientific hype:
“Live is like having a TV camera in your pocket. Anyone with a phone now has the power to broadcast to anyone in the world. When you interact live, you feel connected in a more personal way. This is a big shift in how we communicate, and it’s going to create new opportunities for people to come together.”224
It didn’t take long before Zuckerberg’s newest profit center served up live, streaming videos of a twelve-year-old girl hanging herself from a tree; an eighteen-year-old man being tortured; a thirty-three-year-old aspiring actor shooting himself in the head; a fifteen-year-old girl being gang-raped; a jilted man murdering a random, seventy-four-year-old stranger in cold blood; a father hanging himself and his eleven-month-old daughter . . . The litany is long and sickening.225
On Periscope—the live-video streaming app launched in 2015—a nineteen-year-old French woman livestreamed her suicide; she threw herself under a moving train.226
Like Zuckerberg, Periscope’s founders rationalize their brainchild with high-minded rhetoric—more scientific hype:
“Periscope was founded on the belief that live video is a powerful source of truth and connects us in an authentic way with the world around us. We are fascinated by the idea of discovering the world through someone else’s eyes. What’s it like to see through the eyes of a protester in Ukraine? Or watch the sunrise from a hot air balloon in Cappadocia?”227
Right. Got it.
Enlightenment.
I neither entirely blame nor exonerate the web and its cool, mega-rich hypesters. People like us—not just true monsters—are committing these heinous acts; many of them born of disappointment, despair, hopelessness, and yes, the evil streak that runs through all of us. In the words of Walt Kelly, the beloved American satirist: “We have met the enemy and he is us.”
If you still do not believe it, then ask yourself: Who watches these despicable acts on the web? It is ordinary people, the same rubberneckers who cause horrific traffic jams on the highway because they slow down to gawk at an accident.
After that twelve-year-old girl hanged herself from a tree on Facebook Live, a spokesperson for the local police department said this: “The superintendent was visibly upset when he saw the pictures of the girl and was dismayed when he learned that people were watching the incident live and no one called police.”228
If we are corrupting the world wide web with our bad behavior, then there are at least four ways the web is getting back at us.
Perfect Memory
Like old soldiers, as the saying goes, online content never dies. But neither does it fade away. I just checked YouTube and discovered one copy of the video of the 2014 beheading of James Foley, an American photojournalist—just one copy—has 1,160,304 views.
Even if YouTube were to delete the offensive video, it and countless other gruesome reminders of our flawed nature will always find a home on shock sites, truly creepy websites dedicated to jolting us. This nasty stuff used to be confined to the pages of true crime magazines, but now, thanks to the web, it is out there for all to see—every man, woman, and child the world over.
The web’s eternal memory can also shame us personally. Just ask Mark Driscoll, the former lead pastor of the now-defunct Mars Hill Church in Seattle.
In 2014—as former pastors of the multicampus megachurch ministry were leveling various charges against Driscoll229—critics unearthed some unseemly online posts Driscoll had made back in 2000 under the pseudonym William Wallace II.230 He had already apologized for them in 2006, but now he was forced to reiterate his contrition.
“The content of my postings to that discussion board does not reflect how I feel, or how I would conduct myself today,” he said. “Over the past 14 years I have changed, and, by God’s grace, hope to continue to change. I also hope people I have offended and disappointed will forgive me.”231
But it was all for naught. On October 14, 2014, under pressure, Driscoll resigned as lead pastor and withdrew from the public eye. A few months later Mars Hill—once the third-fastest-growing megachurch in the nation—permanently closed its doors.232
Fast Paced
The web’s lightning speed is helping to increase the already frenetic pace of life to nearly inhuman limits.
I recall in the late 1980s, when I joined ABC-TV, a typical scene in a news segment lasted seven to ten seconds. Also, it wasn’t unusual for my Good Morning America science reports to last five to six minutes, including some Q&A with the anchor. By the time I left ABC News, in 2002, scenes lasted a few seconds at most and Good Morning America segments lasted but a few minutes.
The web is speeding things up even more than TV ever did. In 2015 Microsoft Canada studied what effect today’s instantaneous, multitasking, tweet-sized environment has on the brains and behavior of 2,000 consumers. With the help of EEGs, the researchers found “that increased media consumption and digital lifestyles reduce the ability for consumers to focus for extended periods of time.” The problem is most serious among “heavy social media users,” 65 percent of whom are prone to being distracted and daydreaming.233
How big a detrimental effect are we talking about? According to the study, the average human attention span is now eight seconds long, an historic low—down from twelve seconds in 2000. By contrast, the researchers claim, the average goldfish has an attention span of nine seconds.
Mob Mentality
The web is also striking back at us by threatening the very First Amendment it was designed to protect and glorify. “An anonymous poll of the writers at TIME,” says columnist Joel Stein, “found that 80% had avoided discussing a particular topic because they feared the online response.”234
“I probably hold back ninety percent of the things that I want to say due to fear of being called out,” a college student confessed to Conor Friedersdorf, a writer for The Atlantic. The student went on to describe the ugliness of today’s online call-out culture:
“People won’t call you out because your opinion is wrong. People will call you out for literally anything. On Twitter today I came across someone making fun [of] a girl who made a video talking about how much she loved God and how she was praying for everyone. There were hundreds of comments, rude comments, below the video. It was to the point that they weren’t even making fun of what she was standing for. They were picking apart everything. Her eyebrows, the way her mouth moves, her voice, the way her hair was parted. Ridiculous. I am not the kind of person to be able to brush off insults like that. Hence why I avoid any situation that could put me in that position. And that’s sad.”235
“I feel genuine fear a lot,” explains Lindy West, writer for GQ.com. She is an outspoken feminist who writes about her struggle with obesity and self-image.
“Someone threw a rock through my car window the other day, and my immediate thought was it’s someone from the Internet.” That makes for a tragic irony, she says. “Finally we have a platform that’s democratizing and we can make ourselves heard, and then you’re harassed for advocating for yourself, and that shuts you down again.”236
“If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent—both good and bad,” confesses Samidh Chakrabarti, a product manager at Facebook. “At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy.”237
In all candor, Chakrabarti adds, “I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t.”238
Habit Forming
All told, scientific studies paint a disturbing picture of our relationship with social networking sites (SNS). Especially for young people.
Consider, for instance, these key findings of a major study published in May 2017, by the Royal Society for Public Health and the Young Health Movement:239
• Ninety-one percent of sixteen- to twenty-four-year-olds use SNS.
• Social media has been described as more addictive than cigarettes and alcohol.
• Social media use is linked with increased rates of depression and poor sleep.
• Rates of anxiety and depression in young people have risen 70 percent in the past twenty-five years—with young people saying their favorite SNS “actually make their feelings of anxiety worse.” One of them states, “I’m constantly worried about what others think of my posts and pictures.”
• Overall, the most harmful SNS to young people’s mental health are: Instagram, Twitter, and Facebook, in that order.
In a 2017 report published in the International Journal of Environmental Research and Public Health, psychologists Daria Kuss and Mark Griffiths at Nottingham Trent University confirm those findings, categorically stating: “There is a growing scientific evidence base to suggest excessive SNS use may lead to symptoms traditionally associated with substance-related addictions.”240
Indeed, an October 2017 survey of 5,000 public- and private-school students in England commissioned by the Headmasters’ and Headmistresses’ Conference (HMC) found that fully 56 percent of the eleven- to eighteen-year-olds felt “they are on the edge of addiction.”241
Psychologist Susan Weinschenk, adjunct professor at the University of Wisconsin, explains it in terms of dopamine, a brain chemical that causes us to desire and to seek exciting, unpredictable experiences. “With the internet, twitter, and texting we now have almost instant gratification of our desire to seek,” she says. “We get into a dopamine induced loop . . . dopamine starts us seeking, then we get rewarded for the seeking, which makes us seek more. It becomes harder and harder to stop looking at email, stop texting, stop checking our cell phones to see if we have a message or a new text.”242
Adam Penenberg, a journalism professor at New York University, learned firsthand that other brain chemicals are also involved in SNS addiction. They include oxytocin, the cuddly, feel-good hormone, and the stress-related hormones cortisol and ACTH.
As an experiment, Penenberg had his blood tested while he tweeted merrily away. “My oxytocin levels spiked 13.2%,” he reports, comparable to the hormonal spike experienced by a love-struck groom at a wedding. “Meanwhile,” he adds, “stress hormones cortisol and ACTH went down 10.8% and 14.9%, respectively.”243
The addictiveness of SNS is entirely intentional, says Nir Eyal, author of Hooked: How to Build Habit-Forming Products. It’s the outcome of a tried-and-true formula—one involving triggers, actions, rewards, and investments—used by all successful Silicon Valley titans. They do so because they realize it’s not the company with the best product that necessarily wins. “Instead,” Eyal says, “it’s the company that holds on to the monopoly of the mind—the habit—that wins.”244
Sean Parker, Facebook’s first president and cofounder of Napster, agrees with Eyal. “The thought process that went into building these applications, Facebook being the first of them,” he explains, “was all about: ‘How do we consume as much of your time and conscious attention as possible?’ And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever.”245 He adds candidly, “The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway.”246
According to psychologist Jean M. Twenge, young people born between 1995 and 2012—the iGen generation, she calls them—are being deeply wounded by their addiction to social media. “Around 2012, I noticed abrupt shifts in teen behaviors and emotional states,” she reports. “In all my analyses of generational data—some reaching back to the 1930s—I had never seen anything like it.”247
Compared with previous generations, she explains, iGen-ers spend way more time at home, because “their social life is lived on their phone. They don’t need to leave home to spend time with their friends.”248
As iGen-ers mature, they’re having a hard time leaving the nest, which is not at all healthy. “More comfortable in their bedrooms than in a car or at a party, today’s teens are physically safer than teens have ever been. They’re markedly less likely to get into a car accident and, having less of a taste for alcohol than their predecessors, are less susceptible to drinking’s attendant ills,” explains Twenge. “Psychologically, however, they are more vulnerable than Millennials were: Rates of teen depression and suicide have skyrocketed since 2011. It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades.”249
As our eyes are fully opening to the utter mayhem SNS are causing us, especially young people, Silicon Valley innovators are publicly expressing contrition. It’s reminiscent of the profound remorse many of my fellow physicists felt about having created nuclear weapons, especially after witnessing the death and devastation they caused in Hiroshima and Nagasaki.
Chamath Palihapitiya, a former vice president at Facebook, recently told an audience at the Stanford Graduate School of Business that he feels “tremendous guilt” about his role in creating Facebook. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he says. “No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem.”250
“I don’t know a more urgent problem than this,” says Tristan Harris, a former Google employee. He studied under behavioral psychologist Brian J. Fogg, director of the Stanford Persuasive Technology Lab, whose ideas undergird the addictive formula described by Palihapitiya. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.”251
“It is very common for humans to develop things with the best of intentions that have unintended, negative consequences,” remarks Justin Rosenstein. He co-created Facebook’s famous Like button and is now upset over what it’s become.
So much so, in fact, that he has sworn off using SNS—all apps, actually—and welcomes a candid discussion about what he and his colleagues have intentionally and unintentionally wrought. “One reason I think it is particularly important for us to talk about this now,” he says ruefully, “is that we may be the last generation that can remember life before.”252