Humans adapt. It is what we do better than any other species. Accordingly, our predecessors have always had to embrace a changing world where new inventions and technologies have, in turn, driven lifestyles, insights, tastes, and priorities. So why should this digital age be any different?
The automobile, for example, had vast, life-transforming effects. By this kind of analogy, you could view digital devices as just the latest in a long line of innovations: exciting and disturbing at first, then ultimately incorporated into our lives as the driver of some new development that will always be hard for some traditionalists to accept. Take the printing press: its introduction to Europe by Johannes Gutenberg around 1439 was undeniably a giant milestone in the progress of civilization. It democratized knowledge, and the reactionary forces of the status quo just didn’t like it—a parallel, you might argue, to those who seem to be technological Luddites nowadays. Books began to disseminate insight to ever greater numbers of individuals, who then could, and did, foment social change, which led to personal advancement and universal education. Even fiction invariably raised issues about the human condition that enabled readers to see the world through the eyes of others in other eras and locations, all the better to appreciate and shape their own perspective and self-understanding; how could anything ever be more transformational?
Then there was electricity. Up to the end of the nineteenth century, nighttime brought uncontrollable darkness; the only recourse for our ancestors was candlelight to fend off whatever unknown perils, real or supernatural, might be lurking just beyond that feeble, flickering pool of light. Their experience of daily life would, for much of the time, have been one of half-formed shapes, half-light, and a helpless inability to control their surroundings. Imagine the cataclysmic difference when eventually that dark and sinister world was flooded with electric light. What kind of new thinking and mindset might have followed? Whatever it was, it was clearly a dramatic revision of reality to which our species adapted, and which thereby changed us.
Let’s move to a more recent development: television. From the time it arrived in homes around the middle of the twentieth century, the concern was that television would be bad for children’s brains, that they would get “square eyes” and stop reading and playing outside. However, since television broadcasts occurred only during limited periods in the evening, and since there was at the time a dominant culture of outdoor games, reading, and collective family meals, the TV in fact complemented an existing lifestyle rather than disrupting it. In one sense, rather than being an early forerunner of the home computer, the TV was more like the Victorian piano: a means of cohesive family activity and interaction.
This is not nostalgia for golden days gone by. The middle years of the twentieth century were physically uncomfortable and tough, and turning back the clock, even if it were somehow possible, is not an attractive proposition: who in their right mind would ever opt for an unheated bedroom with uncooperative layers of thin, scratchy blankets? But those were different times. There was at most one TV set to a household, and that only if you were lucky; at first, usually only one home on a street might boast such a marvel, attracting endless visitors to share in the wonderment. And even into the 1960s, watching TV had a communal feel.
Nothing could have been further from the twenty-first-century scenario of a family member rushing in from work or school to sit for hours in voluntary solitary confinement in front of the screen. One of the big differences between the earlier technologies and their current digital counterparts is quantitative: the amount of time the screen monopolizes our active and exclusive attention in a way that the book, the cinema, the radio, and even the TV never have. The futurologist Richard Watson certainly thinks that the degree to which digital technologies are dominating our lives makes the crucial difference: “We’ve always invented new things. We’ve always worried about new things and we’ve always moaned about younger generations. Surely most of [this] is conjecture mashed up with middle-aged technology angst? I think the answer to this is that it’s a little different this time. [Screens] are becoming ubiquitous. They are becoming addictive. They are becoming prescribed.”1
It’s not so much the physical ubiquity of screens that now differentiates the average home from its predecessors, but an invisible feature, inconceivable a decade ago: family members can be constantly connected to people beyond the household, more intimately than to the immediate family members with whom they live in close proximity. Each adult and child now owns multiple digital devices that they use for entertainment, socialization, and information.2
There is a pull toward the cyberspace offered by, say, the isolation of the mobile device or the multifunctional bedroom, and a corresponding push away from the erstwhile epicenter of the family. In the past, bedrooms were places of punishment to which a child would be exiled for bad behavior—a far cry from the havens many young people regard them as today. The warm kitchen or drawing room where the nuclear family sat together was the primary forum for interaction and information, and it provided a framework and a timetable for daily existence. Now the world of the screen, in the bedroom or anywhere else, has in many cases offered an alternative context for setting the pace, establishing standards and values, offering conversations, and providing entertainment, while the nuclear family eating a meal together is becoming less central amid more complex societal trends of divorce and remarriage, as well as more variable and demanding work patterns.
Beyond the all-pervasiveness of digital technologies compared to inventions from previous eras, another difference is the shift from technology as a means to its being an end in and of itself. A car gets you from place to place; a fridge keeps your food fresh; a book can help you learn about the real world and the people in it. But digital technology has the potential to become the end rather than the means, a lifestyle all on its own. Even though many will use the Internet to read, play music, and learn as part of their lives in three dimensions, the digital world offers the possibility, even the temptation, of becoming a world unto itself. From socializing to shopping, working, learning, and having fun, everything we do every day can now be done very differently in an indefinable parallel space. For the first time ever, life in front of a computer screen is threatening to outcompete real life.3
You wake. The first thing you do is check your smartphone (62 percent of us), and in all probability you’ll be checking your phone within the first fifteen minutes of consciousness (79 percent of us).4 In 2013, 25 percent of U.S. smartphone users ages eighteen to forty-four could not recall a single occasion when their smartphone was not within reach or in the same room. After waking, you grab a cup of coffee and a Danish while checking any emails that came in overnight and sending some of your own. Let’s say your job enables you to work from home, as some 20 percent of American professionals do;5 you’ll then get down to business. While you have your tasks up in front of you, you will also have Twitter open to follow your favorite celebrity, along with your Facebook page to ensure that you don’t miss any news. You’ll also need to keep checking your other social networks, such as Instagram and Snapchat, and taking quick photos of what you’re having for lunch (time has flown), all while staying alert for good old-fashioned text messages. Exhausted by all this multitasking while working, you then relax by watching a much-viewed YouTube video or downloading the latest episode of a TV show. Next it’s time to place your grocery order and indulge in more serious retail therapy with some online shopping. In 2011, 71 percent of adult U.S. Internet users bought goods online,6 and the following year 87 percent of U.K. adults ages twenty-five to forty-four were shopping online.7 By 2017, online sales are projected to account for 10 percent of all retail sales in the United States. Needing stimulation, excitement, and escapism after it hits home how much money you’ve just spent, you’ll then immerse yourself in a thrilling videogame, just like some 58 percent of all Americans.8 But now you feel a bit isolated and in need of some company. So you return to social networking, this time looking more closely at online dating sites. U.S. Internet users spend 22.5 percent of their online time on social networking sites or blogs.9 More than a third of couples who married in the United States between 2005 and 2012 reported meeting their spouse online, about half of them through online dating sites and the rest through other online venues such as social networking sites and virtual worlds.10 The real, physical world and what we do in it may be becoming less and less relevant as the traditional constraints of time and space fade. And as each of us adapts to this unprecedented new dimension, what sort of individual might eventually emerge?
For certain, someone who is less attuned to the outdoors. Since 1970, a child’s radius of activity, the area around the home in which children roam freely, has shrunk by an astonishing 90 percent.11 And this restriction on play is unprecedented. In his book A History of Children’s Play and Play Environments, Dr. Joe Frost traces the history of children’s play from the earliest records in ancient Greece and Rome to the present and concludes that “children in America have become less and less active, abandoning traditional outdoor play, work and other physical activity for sedentary, indoor virtual play, technology play or cyberplaygrounds, coupled with diets of junk food.”12 The consequences of play deprivation and the abandonment of outdoor play may well become fundamental issues in the welfare of children.
A screen-based lifestyle is unprecedented not only in how its content shapes thoughts and feelings but also in its corollary effects: not exercising, and not playing and learning outside. While an increasing number of digital aficionados may eventually opt for mobile technologies exclusively, for the time being an appreciable amount of time is still spent sitting in front of a computer screen. In any case, if we’re busy texting or tweeting on our mobile phones, even while out walking, we’re less likely to be taking the more strenuous physical exercise we might otherwise have done. A clear corollary of a sedentary disposition is that we put on weight. Obesity stems from many factors, including the wrong kind and quantity of food, but also from reduced energy expenditure. It is hard to specify a particular order of events: whether a child who doesn’t much like sports is more attracted to the screen, or whether a screen lifestyle has an allure that trumps climbing a tree, is a chicken-and-egg question impossible to resolve here. Rather, we need to look at the whole digital lifestyle: both the increase in time spent in two dimensions and the simultaneous decrease in time spent in three.
For example, I recently received an email from a father of two young children in Australia that sums things up in a really arresting way:
Last weekend I had an eye-opening moment where the children had been lazing around the house, using and fighting over technology. When finally I was able to coerce them out for a short walk, we took bikes and I watched with delight the laughter and fun the kids had purely riding up and down this one particular steep-ish dogleg bend on this quiet country road. The enjoyment, laughter, and giggles from one’s children are truly music to the ears of a parent. I do not ever hear that laughter when they are using technology.
A former teacher, Sue Palmer, flagged this issue back in 2007. Her book Toxic Childhood contained a list of simple activities that a child should have experienced before reaching adolescence, such as climbing a tree, rolling down a really big hill, skipping a stone, and running around in the rain.13 How sad that these childhood activities, which would have been taken for granted a generation or so ago, should now be listed as identifiable goals that might otherwise not be achieved. Meanwhile, a recent National Trust report invoked the term “nature deficit disorder,” not to describe a genuine medical condition but as a vivid expression of an endemic pattern of behavior, suggesting that, for the first time ever, we have become dissociated from the natural world with all its beauty, complexity, and constant surprise.14 Even the most diehard digital zealot cannot escape the simple fact that every hour spent in front of a screen, however wonderful or even beneficial, is an hour spent not holding someone’s hand or breathing in sea air. Perhaps even simply being at ease and happy in total silence could become a rarefied commodity that, instead of being a normal part of the human repertoire, finds itself on a wistful wish list of the future.
Professor Tanya Byron, a British psychologist best known for her work as a child therapist on television, was initially concerned specifically with regulation of the Internet; only two years later, however, she recognized that the issue was not merely one of doing no harm but one of identifying the best possible environment beyond screen experiences. “The less children play outdoors, the less they learn to cope with the risks and challenges they will go on to face as adults,” she wrote. “Nothing can replace what children gain from the freedom and independence of thought they have when trying new things out in the open.”15 In the past, play most usually took place outside, in fields and woods or in urban backstreets. Just look at the many books by the children’s author Enid Blyton, written around the mid-twentieth century, in which the young heroes and heroines were so busy catching smugglers and other shady villains that they only ever went indoors to have tea and to sleep.
At that time, in both fiction and fact, the environment in which you happened to be growing up provided a backdrop and props, not the actual narrative. The story came from inside your head—it had to—and arose from interaction with your friends as you became a cowboy or an Indian. It was the same inside the home, as plots were devised and story lines emerged from playing with dolls or toy soldiers or from dressing up. Trees, drawing pads, and toys (typically along with the cardboard boxes the toys came in) were merely tools and prompts for your game, your story, your internally driven scenario—above all, for your imagination. Sometimes, even quite regularly, you might be bored. But it was that very state of understimulation that impelled you to draw a picture, make up a game, or go outside to play. The point I want to stress is that you were the driver and you would be in control of your own inner world, your own private reality.
But now the screen can be the driver. Admittedly, you have to be mildly proactive in turning the device on and navigating your options, but once you have selected an activity, spectacular cyberexperiences contrived by someone else engulf you. You are now a passive recipient, and even though games such as The Sims, for example, allow you to modify and create worlds, it is always within the secondhand parameters of the game designer’s thinking. I wonder how much of the time that previously would have been spent walking in the fresh air, playing the piano, or having a face-to-face conversation has now been forfeited in favor of a cyberactivity, a completely new type of environment where taste, smell, and touch are not stimulated, where we can be completely sedentary for long periods of time, yet where the ensuing experience trumps more traditional ways of life for appeal and excitement.
It would be simplistic in the extreme to think of the powerful and pervasive new digital lifestyle as either the apotheosis of human existence or the most toxic culture ever. We are being offered an unprecedented and complex cocktail of opportunity and threat, but not everyone is likely to agree on exactly what constitutes which.