CHAPTER 25

THE FOURTH SCREEN AND THE MIRROR OF NARCISSUS

As the first decade of the new century rolled on, something was happening in the streets of America. More and more men and women, most in suits, could be seen compulsively pulling out a device with a tiny screen and craning their necks to stare at it. At that point, there would begin the familiar exercise of rolling a little dial before urgently typing away with both thumbs. In those years, this distinctive habit marked one as part of either the corporate class or the federal government. To everyone else, it looked self-important and faintly ridiculous. Few realized they were beholding an image of their future selves.

This new attentional habit—actually an extension of the check-in—originated in the late 1990s, when, with no particularly grand vision, two young Canadians, Mihalis (Mike) Lazaridis and Doug Fregin, developed what they styled as an improvement on the pager.* Their device, a clamshell of sorts, allowed both the sending and receiving of pages, but also, in a very primitive way, the capacity to read and write emails while on the go. They named it “The 900 Inter@ctive Pager.” The @ sign was meant to position it toward the future, and at the same time to reflect the engineering aesthetic at the heart of the company, Research in Motion, located in Waterloo, Ontario.

The RIM 900 was primitive; its monochrome screen was the size of a piece of bacon on its side, a cooked piece at that. But it was a modest success, leading Lazaridis and Fregin to develop a more advanced version, which they boldly called the “Research in Motion 950 Inter@ctive Pager.” This one was smaller overall, but with a bigger screen, a well-designed keyboard, and the capacity to retrieve emails automatically (at the time, this was heralded as “push” technology). It also ran for weeks on a single AA battery, a feature even current smartphone users can envy.

While no marketing expert—indeed, he had an engineer’s hostility to marketing’s influence—Lazaridis nonetheless sensed that his product’s name lacked pizzazz. He considered, for a while, calling it the “PocketLink” but at the last moment decided to consult some branding experts from California. After some analysis, in a way that might have done Ernest Dichter proud, the experts opined that the letter B conveyed reliability, and the device’s little keyboard looked like a strawberry. Out of that came the “BlackBerry.” Thus rechristened, the device finally began to feel some wind in its sails.

Slowly the new fruit began cropping up across North America and then the world, but while it was ultimately a great success, it was not a mass-market product like the television or radio; it remained the instrument of a certain elite. Indeed, it was marketed not to the general public but rather to corporations keen on keeping their employees in reach, or on call, at all times. Thus it was those corporate types, and also federal government types (most famously Barack Obama), who pioneered an attentional habit that would define the coming century. In comparison, television’s momentous conquest of time and space in the twentieth would seem woefully incomplete.

As we’ve repeatedly seen in chronicling the past hundred years, it is such habits that make and break great attention merchants. The attending public were first captured reading daily newspapers, then listening to evening broadcasts, before they were entranced into sitting glued to the television at key intervals, and finally, over the 1990s, into surrendering some more of their waking time, opening their eyes and minds to computers—the third screen—in dens and offices around the world. Now a new device appeared capable of harvesting the attention that had been, as it were, left on the table, rather in the way fracking would later recover great reserves of oil once considered wholly inaccessible. Of course, the terms of this surrender seemed favorable, as they usually do: email on the go meant you could take your work with you and weren’t stuck at home or the office. By 2015, the fourth screen would be in the hands of virtually everyone, seizing nearly three of the average American’s waking hours. And so it would become the undisputed new frontier of attention harvesting in the twenty-first century, the attention merchants’ manifest destiny. From now on, whither thou goest, your smartphone goes, too, and of course the ads.

As so often happens in tech industries, Research in Motion started the game but did not master it. That job would be left to the world’s two mightiest computing empires, which by the 2010s had clued in to the potential for far broader adoption of what the BlackBerry represented. With their nearly unlimited reserves of engineering and design talent, Apple and Google would go on to create iPhones and Androids, respectively, and thoroughly clobber the Canadians at their own game, offering a much more appealing user interface. BlackBerry had seemed to many invincible in 2007, when the iPhone was first launched, though it then had a mere 9 million subscribers; by 2011, 472 million smartphones would be sold worldwide in a single year. There would soon follow not just an attentional habit but a new social norm, that of never parting from one’s device; of standing and staring at it, as if paralyzed, as the world goes by; of not looking up in the presence of others, except when the urge to take a picture erupts at the strangest moment—autre tech, autre moeurs: it is probably the thing a visitor from a previous century would find the weirdest.

Where the human gaze goes, business soon follows, and by the 2010s virtually everyone in the attention industries was trying to figure out what might be the best way to get a piece of all that attention now directed downward at billions of palms. Google and Apple had the proverbial head start, but there was not a company that did not have at least someone trying to devise what was called at the time “a strategy for mobile,” now that the web was following millions of people everywhere.

Some of the adaptation would be obvious from the start: clickbait, for instance, could certainly travel, if one imagined the bored commuter. Yet these were still early efforts, transplants from other platforms, the way radio shows had been adapted to TV. More specific to mobile were games like Angry Birds, Candy Crush, or Flappy Bird. Designed for passing the time on a little screen, these tended, in their simplicity, to resemble the earliest video games, like Space Invaders, Pac-Man, or Tetris. Facebook earned billions by selling advertisements that encouraged and made easy the installation of other apps. But the first attention-harvesting applications to make full use of the smartphone’s distinctive capabilities were, as usual, unpredictable, yet once they’d arrived, utterly obvious.

Instagram was begun by a soft-spoken, fairly square serial tech entrepreneur named Kevin Systrom, whose inspiration came, in part, from his tenure as president of his high school photography club. He and his coding partner and cofounder, Mike Krieger, were rather typical San Francisco entrepreneurs of the 2010s, if there was such a thing: young, restless, looking for an idea, having already failed at one effort to create a saleable social media app—there was no shame in that; most of them went nowhere. They lacked the chutzpah of a Zuckerberg. Their invention was not at the level of an Edison or Bell; but as we’ve seen, in the attention industries, if the right chord is struck, it does not take much. Their new iPhone app did two things. First, it enhanced the device’s camera with a series of attractive filters that could be applied to photos, creating some immediate utility. Its second innovation, ultimately more influential, was to create a photo-centered social network—a sort of Twitter built on images. On Twitter, one shared a message; a photo was optional. Instagram made the photo mandatory and the message optional. Yep, that was it.

The simplicity of Instagram’s concept was its calling card—people got it—and it gained users quickly: by the spring of 2012, just eighteen months after its launch, it had 30 million. The thing was fun and easy to use, but it was also well timed, for it was the first popular social networking app that truly made full use of the smartphone’s functionality, tying its integrated camera to its Internet connection and its apps. Moreover, it connected those devices with what, for want of a better name, can be called the market for aspiring celebrity. As on Twitter, Instagram users had followers—and the followers had the option, with the click of a button, to “like” photos they enjoyed or otherwise approved. That “like” feature was the heart of Instagram, even more than of Facebook. Every photograph was, in a sense, rated by peers, thus providing a kind of instantaneous feedback.

Some would use Instagram to showcase their photographic skills or the luscious dishes they were tucking into. But far and away Instagram’s best known feature—its killer app—was the self-portrait, or as the Australians called it, “the selfie,” the term that became Oxford Dictionaries’ Word of the Year in 2013. Early on, Instagram reported that it had 53 million photos so designated (with #selfie). The selfie was so important that the form and the platform rose in tandem.

While Facebook was also a place for posting photos, and at some level presenting oneself as one would like to be seen, Instagram allowed for a seamless and more continuous visual narrative. In this way, active Instagrammers “created an Instagram life,” telling their own stories, or at least stylized and idealized versions of them, in images. If the average Facebook post showed a group of friends or a family in good times (perhaps posing with the dog, or holding up drinks at a party), Instagram’s photos would generally be more dramatic, glamorous, and often edgier, for the simple reason that they were posted with the calculated intent to seize attention and to elicit a reaction. This was particularly true for the app’s many younger users, who flocked to it as their grandparents started joining Facebook. A number of them would cultivate multiple narratives with different accounts: the “real” one (Rinstagram), which was the more contrived to impress; and the “fake” one (Finstagram), on which, for a smaller circle, they were content to be more themselves. Instagram thus occupied the territory on which Zuckerberg had originally positioned Facebook: the place to be. And the real-time fix of “likes” and comments would become for untold numbers an addictive form of self-affirmation.

Typically, Instagram feeds looked as much like Facebook as Vice magazine spreads resembled a high school yearbook. The photos were not candid but posed, retouched, art directed. Some users adapted the form to more explicitly imaginative expression, creating themed accounts. For instance, a teenager called Liam Martin posted low-budget re-creations of celebrity glamour shots, usually with himself in drag; and @socalitybarbie artfully chronicled the life of “Hipster Barbie” in a way meant to mock Instagram culture. “Hipster Barbie is so much better at Instagram than you.”1

Like the early blogs, a good Instagram feed was labor-intensive. When she quit, Socality Barbie noted “the insane lengths many of us go to create the perfect Instagram life.”2 A “fashion” Instagrammer named Amanda Miller, who offers an online guide to gaining followers, confesses that “getting to 18.5k followers was A LOT of work.”3 In addition to composing and shooting photos, the feed demands interacting with strangers to make them feel engaged or heard, the way a politician or other public figure might—the way a real celebrity doesn’t have to do. As Miller writes, “Engagement: This is the hardest part because it takes something money can’t buy, and that is TIME. It takes A LOT OF TIME.”

Occasionally, the effort was monetized. Some of the more popular users of Instagram (typically young, attractive, and female) managed to become attention merchants, selling product placements in their photographs. “Companies have realized that one photo on the Instagram account of someone with over 100,000 followers is reaching more people directly than any traditional ad campaign,” explained a new social media ad agency; a popular, microfamous Instagrammer might charge brands $1 per “like” earned by a sponsored photo.4

For the better known—those already in the business of capturing and reselling attention—the utility of Instagram as an additional platform and revenue stream was obvious; of course their publicists or other staff could be expected to do most of the work for these incorporated personalities. In a major attention-capture stunt, Kim Kardashian famously used Instagram in 2014 to send out a picture of her nearly naked backside in an attempt to “break the Internet.” It may not quite have done that, but it paid off in attention earned; it must have, because Kardashian’s management, normally so efficient about getting paid, negotiated no other compensation. It was further evidence of attention’s having become an accepted form of currency. James Franco, a C-list actor under the old metrics, and himself a self-confessed “Instagram addict,” would speak to its broad acceptance: “It’s what the movie studios want for their products, it’s what professional writers want for their work, it’s what newspapers want—hell, it’s what everyone wants: attention. Attention is power.”5

In Franco’s case, tending his image has become a career in itself, more or less, and so he is driven to self-market to maintain his audience. Calling it a business model as opposed to mere narcissism at least provides an excuse, insofar as many careers excuse what would otherwise be considered unhealthy or insane behavior.

But the resale remains the main opportunity. For a sense of the range of possibilities, a tale of two backsides may suffice: one, as just mentioned, was that of Kim Kardashian. At the time of her attempt to “break the Internet,” she was in possession of some 55 million Instagram followers. Such is the power of multi-platform, People-cover-worthy celebrity. Now consider Jen Selter, a twenty-two-year-old fitness advocate. Selter’s renown came about almost exclusively from social media, based on photographs revealing her own assiduously developed posterior. As of this writing, Selter has some 8.5 million Instagram followers, more than enough to attract the kinds of business opportunities—sponsorships, appearances, and the like—enjoyed by Kardashian, if on a more modest scale. Unknown to most, she is nevertheless enough of an Instagram star to be a perfectly successful attention merchant.

But what of all those many among us busily keeping up an Instagram feed with no hope of ever reselling the attention? For most, the effort is an end in itself, and the ultimate audience is the very subject of the camera’s interest. Understood this way, Instagram is the crowning achievement of that decades-long development that we have called the “celebrification” of everyday life and ordinary people, a strategy developed by attention merchants for the sake of creating cheap content.

Let us review our story in brief, as it might relate to Instagram: For most of human history, the proliferation of the individual likeness was the sole prerogative of the illustrious, whether it was the face of the emperor on a Roman coin or the face of Garbo on the silver screen. The commercialization of photography may have broadened access to portraiture somewhat, but apart from wanted posters, the image of most common people would never be widely propagated. In the twentieth century, Hollywood created a cohort of demigods, whose image everyone recognized and many, in effect, worshipped. With the arrival of the smartphone and Instagram, however, much of the power of a great film studio was now in every hand attached to a heart yearning for fame; not only could one create an image to rival those of the old icons of glamour, but one could put it on a platform where millions might potentially see it.

Perhaps a century of the ascendant self, of the self’s progressive liberation from any trammels not explicitly conceived to protect other selves, perhaps this progression, when wedded to the magic of technology serving not the state or even the corporation but the individual ego, perhaps it could reach no other logical endpoint than the self as its own object of worship.

Of course, it is easy to denigrate as vanity even harmless forms of self-expression. Indulging in a bit of self-centeredness from time to time, playing with the trappings of fame, can be a form of entertainment for oneself and one’s friends, especially when undertaken with a sense of irony. Certainly, too, the self-portrait, and the even more patently ludicrous invention, the selfie stick, have become too easy a target for charges of self-involvement. Humans, after all, have sought the admiration of others in various ways since the dawn of time; it is a feature of our social and sexual natures. The desire of men and women to dress up and parade may be as deeply rooted as the peacock’s impulse to strut. Like all attention harvesters, Instagram has not stirred any new yearning within us, merely acted upon one already there, and facilitated its gratification to an unimaginable extent. Therein lies the real problem.

Technology doesn’t follow culture so much as culture follows technology. New forms of expression naturally arise from new media, but so do new sensibilities and new behaviors. All desire, the philosopher and critic René Girard wrote, is essentially mimetic; beyond our elemental needs, we are led to seek after things by the example of others, those whom we may know personally or through their fame. When our desires go beyond the elemental, they enter into their metaphysical dimension, in which, as Girard wrote, “All desire is a desire to be,” to enjoy an image of fulfillment such as we have observed in others. This is the essential problem with the preening self unbound by social media, and the democratization of fame. By presenting us with example upon example, it legitimates self-aggrandizement as an objective for ever more of us. By encouraging anyone to capture the attention of others with the spectacle of one’s self—in some cases, even to the point of earning a living by it—it warps our understanding of our own existence and its relation to others. That this should become the manner of being for us all is surely the definitive dystopic vision of late modernity. But perhaps it was foretold by the metastatic proliferation of the attention merchants’ model throughout our culture.

In the fall of 2015, an Australian teenager, Essena O’Neill, quit Instagram in utter despair. A natural beauty and part-time model, she had become an Instagram celebrity, thanks to her pictures, which had drawn half a million followers. But her Instagram career, she explained, had made her life a torment.

“I had the dream life. I had half a million people interested in me on Instagram. I had over a hundred thousand views on most of my videos on YouTube. To a lot of people, I made it,” she confessed in a video. But suddenly it had all become too much.

Everything I was doing was edited and contrived and to get more views….Everything I did was for views, for likes, for followers….Social media, especially how I used it, isn’t real. It’s contrived images and edited clips ranked against each other. It’s a system based on social approval, likes, validation in views, success in followers. It’s perfectly orchestrated self-absorbed judgement….I met people that are far more successful online than I am, and they are just as miserable and lonely and scared and lost. We all are.6

A survey of Instagram and other social media users by the London Guardian yielded similar responses, suggesting that even among those with relatively few followers the commitment is grim. “I feel anxiety over how many likes I get after I post a picture. If I get two likes, I feel like, what’s wrong with me?” wrote one woman.7 “I do feel insecure if I see girls who look prettier than me,” wrote another, “or if they post really pretty pictures, and I know I won’t look as good in any that I post. I do feel pressure to look good in the photos I put up. I don’t feel anxious about not getting enough likes on a photo but if it doesn’t get enough likes, I will take it down.”

In April 2012, a mere eighteen months after its debut, Instagram was purchased by Facebook for $1 billion. The high-flying start-up’s founders had cashed out without ever having devised a business model. No matter: by November of the following year, the first ads would run in Instagram’s feed, following Facebook’s principles of limited targeting. The acquisition would prove astute. In April 2012 Instagram had 30 million users, but by the fall of 2015 it had 400 million, more than Twitter. And so Facebook would join the ranks of the hoary behemoths with war chests, buying a transfusion of young blood to preserve its status in the uppermost echelon of attention merchants.

As for Instagram, its upward glide portended a future in which the line between the watcher and the watched, the buyer and the seller, was more blurred than ever. The once highly ordered attention economy had seemingly devolved into a chaotic mutual admiration society, full of enterprising Narcissi, surely an arrangement of affairs without real precedent in human history.


* The pager was a portable device used in the 1980s and 1990s that allowed the bearer to receive notifications that a return call was desired.