WE KNOW FROM THE RECENT past that Silicon Valley is not destiny. It’s possible to turn away from its monopolies. There’s an example that illuminates the path: a precedent of consumers rejecting the primacy of convenience and low prices, rebelling against homogenization, and subsidizing artisanal practices that had been deemed doomed. The optimistic prospects for escaping the pull of Facebook, Google, and Amazon come in the form of yogurt, granola, and mâche.
A nontrivial percentage of the populace now stridently cares about what it puts in its mouth, which suggests it can be persuaded to apply the same care to what it ingests through the brain. Caring about the morality and quality of food has become a symbol of social status, which raises the question, why can’t concern for books, essays, and journalism acquire the same cachet?
Back when Stewart Brand was editing the Whole Earth Catalog—peddling the artifacts of an alternative lifestyle to all the hippies who had retreated to the communes—he touted the promise of organic food. Food was as essential to the counterculture as drugs because the hippies were rebelling against a culture that placed food on a pedestal. During the Depression, Americans slipped into their beds feeling hungry. After the war, poverty began to dissipate. It was a world of genuine abundance, new suburbs with new supermarkets featuring a panoply of newly invented items. Food shouted its magical properties—Wonder Bread, Spam (“the miracle meat”), instant breakfast, Minute Rice. Advertisements touted new foods, like Tang, as space age achievements. What made these products so incredible was that they seemed to solve the essential problem of modernity, the crunch of time. At least that was the anxious-making pitch of television commercials, which warned there were no longer enough hours to cook. The freezer case would claw back the day.
It wasn’t hard to see through this marketing hokum, and the counterculture did so with sneering disdain. “A fresh-frozen life in some prepackaged suburb,” the intellectuals of the movement snapped. Butterball, Twinkies, and Jell-O perfectly symbolized everything amiss with postwar America. They were tasteless, conformist, and stamped with the imprint of corporate capitalism. Theodore Roszak wrote about the technocratic evils of Wonder Bread: “Not only do they provide bread aplenty, but the bread is as soft as floss; it takes no effort to chew, and yet it is vitamin-enriched.” If you wanted to pick an object worthy of rebellion, an avatar of alienation, food wasn’t a bad place to begin.
The hippies shoved all the store-bought crap from the plate and replaced it with a vision of goodness. Communes, with their back-to-the-land faith in self-sufficiency, cultivated gardens and raised livestock. Bohemian neighborhoods across the country opened nonprofit cooperatives with aisles of ethically produced food. Vegetarianism—once the relatively esoteric practice of Seventh-Day Adventists, Hindus, and assorted freethinkers—found a large following among the Woodstock set. An entirely new diet emerged, including such novel items as tofu and yogurt. The cultural critic Warren Belasco has written brilliantly about the semiotics of hippie cuisine. “White vs. brown was a central contrast. . . . Whiteness meant Wonder Bread, White Tower, Cool Whip, Minute Rice, instant mashed potatoes, peeled apples, White Tornadoes, white coats, white collar, whitewash, White House, white racism. Brown meant whole wheat bread, unhulled rice, turbinado sugar, wildflower honey, unsulfured molasses, soy sauce, peasant yams, ‘black is beautiful.’”
The counterculture combined austere politics with hedonism, ethical righteousness, and corporeal pleasure. Food was in many ways the apotheosis of the movement. In the shadow of all that radicalism, the so-called Gourmet Ghetto flourished in Berkeley. Alice Waters transferred to the University of California just in time to witness the green shoots of activism there. The school’s Free Speech Movement, with its charismatic leaders and utopian politics, transfixed her. Waters began to host salons, where she cooked for the likes of Huey Newton and Abbie Hoffman. Waters tied her cuisine to the Bay Area zeitgeist. Her ideal was the French food she tasted as an exchange student—food that wasn’t frozen or canned, but intimately connected to the farm, forest, and sea. Through food, it was possible to achieve the holism that Stewart Brand touted, to see our connection to the planet. (“Eating is a political act,” Waters proclaimed.) In 1971, Waters opened Chez Panisse, arguably the most influential restaurant in American history. Through tarragon and escarole, Waters attempted to inject countercultural values into the mainstream.
It’s an oft-told tale, how mainstream American society effortlessly absorbed the ethos of the counterculture. All that rebellious spirit was domesticated and turned into Madison Avenue slogans. Capitalism came to extol the virtues of rebellion and nonconformity. Dare to be different, the car commercial taunted. This is the story of food, of how a company like Celestial Seasonings went from a bunch of hippies selling herbal products to a company that does $100 million in annual sales; or how two Jewish transplants to Vermont, with a jones for the Grateful Dead, created an ice cream brand found in every 7-Eleven and Walmart; or how even McDonald’s now sells salads full of once-exotic greens.
This merits cynicism, but also praise and awe. The journalist David Kamp has written that the transformation of the American diet might well be the counterculture’s greatest and most lasting triumph. Wonder Bread America seemed an irreversible way of life. Although it hasn’t been reversed, it has been dented. The ideal of farm-to-table cuisine—food that is minimally processed and grown locally—has fixed itself in the upper middle class. From there, it has started to spread to society at large, through the exhortations of Michelle Obama and the instruction of celebrity chefs.
Let’s not mistake the food movement for a Marxist one. There’s more than a hint of conspicuous consumption in the fetishization of heirloom tomatoes and the obeisance paid to slabs of grass-fed dry-aged beef. The rich have always used food to set themselves apart from the rest of us. Once hippies became yuppies, it was unavoidable that they would spend their disposable income on cuisine, which is what helped propel the rise of Williams-Sonoma, the debut of the Food Network, and the whole era of gastropornography.
Still, in the farmers’ markets and the Whole Foods, there remains something radical—a turn away from the cheap, mass-produced, and heavily marketed. Why have American consumers taken this unexpected turn? Okay, the food is often better, but sometimes it’s indistinguishable in taste from the stuff that can be bought at Safeway. What they’re really purchasing is the sensation of virtue and rectitude. Michael Pollan has written:
Though seldom articulated as such, the attempt to redefine, or escape, the traditional role of consumer has become an important aspiration of the food movement. In various ways it seeks to put the relationship between consumers and producers on a new, more neighborly footing, enriching the kinds of information exchanged in the transaction, and encouraging us to regard our food dollars as “votes” for a different kind of agriculture and, by implication, economy. The modern marketplace would have us decide what to buy strictly on the basis of price and self-interest; the food movement implicitly proposes that we enlarge our understanding of both those terms, suggesting that not just “good value” but ethical and political values should inform our buying decisions, and that we’ll get more satisfaction from our eating when they do.
These are aspirations worthy of transposition.
• • •
THERE ARE SIMILARITIES between the new corporate concentration in culture and the old corporate concentration in food. But we shouldn’t simply cast blame on capitalist villainy. Just as the American consumer ushered in the age of the Twinkie, the American consumer is complicit in the degradation of the culture. Over the last two decades, readers have come to regard words as disposable goods. They pay shamefully little, if anything, for much of the writing they consume. That’s a depressing truth, but it also contains the possibility for redemption: If readers helped create the conditions for monopolistic dominance, they also have the ability to reverse it.
Everything hinges on undoing the devil’s bargain of advertising. Media have always subsisted on it. For most of their history, media lived off two streams of revenue. Readers paid for newspapers and magazines, either in the form of subscriptions or as copies bought at the newsstand. Subscriptions rarely covered the cost of printing and delivery, but that didn’t matter. A publication’s circulation file was evidence of a committed, captive audience—and the attention of that audience could be sold for a handsome profit to advertisers.
Because circulation was never a profitable business, the Internet hardly required a large leap of imagination. Instead of selling journalism to readers at a loss, media would give it away for nothing. Media executives bet everything on a fantasy: Publishing free articles on the Internet would enable newspapers and magazines to increase their readership manifold; advertising riches would follow the audience growth. It was a scenario that entranced nearly the entire industry, except the few brave contrarians with the steel to draw paywalls over their sites.
It might have worked, were it not for Google and Facebook. Newspapers and magazines assumed that the Web would be like a giant newsstand—and readers would remain attached to the sterling reputations of their titles, their distinctive sensibility, and brand-name writers. The new megaportals changed all that. They became the entry point for the Internet—and when readers entered, they hardly paid attention to the names attached to the journalism they read.
With their enormous scale, Facebook and Google could undercut media, selling ad space for phenomenally little because they had nearly infinite windows of display. Since they specialized in collecting data on their users, they could guarantee advertisers a precisely micro-targeted audience. By deflating the cost of advertising, Facebook and Google overthrew the entire go-with-your-gut expense-account regime that had dominated advertising for nearly a century. Indeed, you could buy their ads online—on Google the process is an automated auction—without having to deal with brokers and commissions. Textbook economics could have predicted the consequence of this deflation: As the media critic Michael Wolff has written, “To overcome falling ad prices you had to redouble audience growth.”
Advertising has become an unwinnable battle. Facebook and Google will always beat media. Between 2006 and 2017, advertiser spending on newspapers dropped by nearly 75 percent, with most of that money redirected to Facebook and Google. Money shifted because the tech monopolists simply do a much better job of steadily holding the attention of audiences. Readers are committed to those platforms, returning to them all day long. Grabbing the attention of readers has become a very difficult project for media, often requiring trickery. Media increasingly rely on “drive-by traffic.” Readers on Facebook and Google are conned into clicking on a piece, based on a carefully engineered headline, a provocative photo, or trendy subject matter. The New York Times media reporter John Herrman mocked this mind-set: “Websites plausibly marketed these people as members of their audiences, rather than temporarily diverted members of a platform’s audience. Wherever they came from, they were counted in the Chartbeat. They saw at least 50 percent of at least one ad for at least one second, and so they existed.”
What’s worse, victory in the traffic wars is slippery. Just as soon as a media outlet achieves its ambitious goals, advertisers deem those goals inadequate. Wolff has noted that advertisers keep shifting the targets ever higher. In 2010, a site needed about ten million unique visitors per month to score a meaningful buy. By 2014, that number rose to fifty million “uniques.” There’s no plausible strategy for growing at that clip, especially not a strategy that preserves editorial identity and integrity.
You know who foresaw the terrible tyranny of Internet advertising? Larry Page and Sergey Brin. They resisted turning Google into an advertising engine, at least at first. While they were still students at Stanford, they wrote a paper arguing, “We expect that advertising-funded search engines will be inherently biased toward the advertisers and away from the needs of consumers.” It was such a worrying concern that they even doubted whether a trustworthy search engine could ever thrive in the marketplace. “We believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.” They discarded their own wisdom long ago.
While media chase a fake audience, they consciously neglect their devoted readers. Subscribers to print editions are considered vestiges of a bygone era, even though they remain reliable sources of revenue. You might never know it from the digital rush, but circulation departments turn an efficient profit for many media companies. Still, it is often assumed that these readers will eventually die, and that younger readers are habituated to never paying, so there is no point in growing their ranks.
That assumption needs to be reversed. The time has arrived to liberate media from their reliance on advertising. Media need to scale back their ambitions, to return to their niches, to reclaim the loyalty of core audiences—a move that will yield superior editorial and sustainable businesses, even if such retrenchment would crush owners’ (mostly delusional) fantasies of getting gobbled by conglomerates or launching IPOs. To rescue themselves, media will need to charge readers, and readers will need to pay.
• • •
IN 1946, George Orwell wrote a charming essay on a closely related subject, called “Books v. Cigarettes.” The piece begins with Orwell, who had worked in a used-book shop, taking inventory of his own library. Orwell wasn’t a quant. This is his only essay to include charts and tables. To be sure, the calculations weren’t the most challenging. Orwell found that he had spent twenty-five pounds annually on books. In the grand scheme of his expenses, this was a trifle. “The cost of reading, even if you buy books instead of borrowing them and take in a fairly large number of periodicals, does not amount to more than the combined cost of smoking and drinking.”
His point was that reading is one of the cheaper forms of recreation, yet it was widely considered an expensive hobby beyond the reach of the workingman. This was an assumption that the workingman internalized. The average British citizen spent one pound or less on books each year, which depressed Orwell to no end. He concluded his essay with this bit of sourness: “It is not a proud record for a country which is nearly 100 per cent literate and where the ordinary man spends more on cigarettes than an Indian peasant has for his whole livelihood. And if our book consumption remains as low as it has been, at least let us admit that it is because reading is a less exciting pastime than going to the dogs, the pictures or the pub, and not because books, whether bought or borrowed, are too expensive.”
Of course, Orwell’s subject was books. If we were to update his arguments, his primary concern would be journalism. Books, indeed, offer a measure of optimism. The American public paid good money to buy 652,673,000 hardcover books in 2015. So let’s stop rationalizing the inane economics of magazines and newspapers. It’s silly to assert that information wants to be free. That was a piece of nineties pablum that has survived far too long. Consumers have no inherent problem paying for words, so long as publishers place a price tag on them.
Orwell tried to provoke the public to spend on words by nagging his readers with wit, charm, and shame. He would surely deem his efforts a failure. But he wasn’t wrong. The culture industry can indeed persuade consumers to spend on worthy texts. At the same time he was mulling these questions, a man on the other side of the Atlantic was showing how it could be done.
• • •
SIGMUND FREUD’S NEPHEW EDWARD BERNAYS arrived in New York as an infant. Far from Vienna, he still managed to become versed in his uncle’s theories. As an adult, he found a novel application for them. Bernays took the theory of the subconscious and used it to create the profession of public relations. Early in his career, Bernays devised slogans for the Wilson administration to help float public support for World War I. After the peace, he took his techniques and turned them into a manifesto, as well as a business. He wrote the tract Propaganda, one of the most influential handbooks of the twentieth century. Bernays’s devotees included Joseph Goebbels. (To be fair, Bernays declined to work for Hitler and Franco, both of whom solicited his services.) His firm concocted slogans and advertising campaigns for the biggest corporations in America. Bernays convinced Americans that bacon and eggs are the quintessential wholesome breakfast foods. He used subliminal images of vaginas and venereal diseases to help promote disposable Dixie Cups as the only sanitary method of drinking. “Propaganda,” he wrote chillingly, “makes it possible for minority ideas to become effective more quickly.”
In the 1930s, the New York book publishers were filled with existential dread. The stock market crash and ensuing Depression wounded their business. They were bereft of ideas that might stimulate revival. In their desperation, Simon & Schuster, Harcourt Brace, and others turned to Bernays for salvation. Bernays developed a thoroughgoing critique of book publishing, which he accused of underpricing its product. But he also came up with an ingenious formula for transforming the industry: bookshelves. “Where there are bookshelves, there will be books,” he confidently asserted.
Bookshelves were alien to most American homes, a luxury reserved for Jay Gatsby and his kind. Bernays methodically went about introducing bookshelves to the middle class. He persuaded architects to include them in their plans and encouraged stories in magazines (House Beautiful, American Home, Woman’s Home Companion) that celebrated built-in shelving. The shelves were obviously an adornment, but also more than that. The presence of books in the household was implicitly meant to signify social advancement—books were hallmarks of the ascendant professional class, whose jobs demanded intellectual skills; they were consumer goods that indicated purchasing power. The cultural historian Ted Striphas has written that the shelving fad of the interwar years represented “the allure of propriety and abundance, which could be realized not only through the consumption, but, equally important, through the accumulation and display of printed books.”
The proliferation of bookshelves was hailed for injecting new life into publishing. An article in Publishers Weekly proclaimed, “We are profiting at the moment from the need for books in individual homes built during the past few years. . . . Now is the time to get behind it and keep going!” This phenomenon was a classic example of what the midcentury sociologist Erving Goffman described in his book The Presentation of Self in Everyday Life. He understood that we unfurl ourselves as if we are actors in a play. We choose props and sets to make our character more convincing. For the growing middle class, anxious about its place in the world, books created an impression of well-deserved elevation to the higher ranks of society.
The New Yorker serves as this sort of prop, read on the subway, displayed on the coffee table. Readers gaudily unfurl the magazine as a totem of their cosmopolitanism and literary bent. The New Yorker publishes its share of refined clickbait and self-help pieces (in the guise of social science). Still, the magazine has largely managed to wean itself off its dependence on advertising, shifting toward a financial reliance on its readers. Even as the New Yorker profited from advertising, it approached that income warily. (During his long reign as the magazine’s editor, the famously prudish William Shawn frequently turned away advertising that he considered distasteful, especially lingerie ads.) The New Yorker has guarded the value of its prime property, its print magazine, by resisting the impulse to give away all of its pieces for free on the Internet.
Of course, the New Yorker occupies a unique place in the culture, always has. But it is possible to build cultural cachet from scratch. Strangely, it’s the tech companies that have best pulled off that trick. Ads for the iPad show it as a method for reading the New York Times and the New Yorker, a way to pursue hobbies like astronomy and fine photography. Amazon’s marketing features travelers sitting in cosmopolitan locales, Kindle in hand. The companies have billed these devices as both status symbols and instruments of cultivation.
Here’s where the food movement provides an object lesson. The culture industries need to present themselves as the organic alternative, a symbol of social status and aspiration. Media must denounce their most recent phase and lead a rebellion against the processed, ephemeral, speed-based writing encouraged by the tech companies. Subscriptions are the route away from the aisles of clickbait. (The New York Times successfully marketed itself as a bulwark of democracy in the wake of Donald Trump’s election, acquiring 130,000 subscribers in the immediate aftermath of that debacle, implicitly contrasting itself with Facebook’s morass of conspiracy and falsehood.) Sure, it will always be possible to get lots of information online for nothing. But if enlightenment and virtue come at a price, that’s a reasonable one to pay.
• • •
AGRICULTURE AND CULTURE come from the Latin colere. The great critic Raymond Williams excavated the fossilized forerunner. “Colere had a range of meanings: inhabit, cultivate, protect, honour with worship,” he wrote. When the Latin passed into English, it referred specifically to husbandry. Culture meant tending to the natural growth of crops and animals.
On the eve of the Enlightenment, the term became a metaphor for humans, who required tending, too. Most specifically, it was the mind that required attention, protection, and cultivation. Thomas More: “to the culture and profit of their minds”; Francis Bacon: “the culture and manurance of minds.” “Culture” is a word that never settles into a stable meaning. Rather, we apply it promiscuously and infuse it with our own biases. Williams called culture “one of the two or three most complicated words in the English language.”
Through the word’s long, wending history, it retains traces of colere. Our faith in culture is diminishing, replaced by our mania for data, but we still worship at its shrines. We still believe that art, books, music, and film have the power to cultivate the self. This is the very thing that obsessed Louis Brandeis, his fixation on “developing the faculties.”
We know this is a noble sentiment, but also one tinged with snobbery. To describe oneself as “cultured” is an assertion of superiority. The sociologist Pierre Bourdieu made a career of pointing this out, if a bit too emphatically. Bourdieu, the son of a peasant, grew up speaking a moribund dialect of French. He soared through the French meritocracy to the most rarefied heights of the intelligentsia. Once admitted to the club, he railed against it. Bourdieu argued that a dominant class enforces rules about what is and is not acceptable. It defines good art, good food, good books—and creates an exclusionary vocabulary for describing them. “Taste classifies, and it classifies the classifier,” he famously wrote.
The world Bourdieu described was very French, a bit hard for an American to appreciate. Overt snobbery has largely foundered on the shoals of hamburger and apple pie. Another Frenchman, Alexis de Tocqueville, understood this. The nature of American society was to eschew elitism. In Tocqueville’s account, elites interacted with workingmen as equals, even if their bank accounts told a different story (and even if our faith in equality bred rampant mediocrity). Cultural elites have made it their business to elevate the masses—an ethos that culminated in the glorious rise of the middlebrow culture of midcentury America. In those anni mirabiles, Henry Luce’s publications hired serious writers (James Agee, Dwight Macdonald, John Hersey, Daniel Bell) and put serious intellectuals (Walter Lippmann, Reinhold Niebuhr, T. S. Eliot) on their covers, which were designed by serious artists (Fernand Léger, Diego Rivera, Rockwell Kent). NBC hired Arturo Toscanini to conduct its orchestra; Leonard Bernstein hosted a prime-time show on CBS to teach appreciation of symphonic music. The Book of the Month Club and the Readers’ Subscription delivered literature to American homes on a regular basis.
A sense of noblesse oblige permeated these efforts—and a sense of status anxiety made the American public receptive to them. Thanks to the GI Bill, millions of Americans attended college, often the first in their families to make the trek beyond high school. The prosperity of the postwar years swelled the middle class. To validate their arrival in a higher social station, Americans swaddled themselves in higher culture. They filled the shelves that Bernays inspired with encyclopedias, leather-bound editions of classics, and hardcover novels. Art house cinema proliferated, because there was a meaningful market for Godard and Antonioni. Midsized cities sprouted symphony orchestras.
Not everything that flourished in this era was worthy of praise. “Middlebrow” became a term of derision for good reason. There was a tension in this vision of culture. The elites who ran media and publishing, record labels and movie studios, believed they were great patrons. But, of course, they were running commercial institutions. At their worst, they peddled mass-market novels posing as great art. At their best, they nurtured ambitious art and challenging ideas and sold them to society.
Our greatest companies in publishing and journalism have mythologized their mission; many are still varnishing themselves in a patina of nobility. This high-mindedness isn’t hard to strip away. Those companies may pose as guardians of intellectual seriousness, but they also exist to turn a profit. They aren’t latter-day Medicis, even if that’s the sense of self that helps them through the day. The health of our culture, however, depends on the persistence of that mythology. It’s the myth that ties these companies to colere, the old root of culture, a faith that they must cultivate minds. Without this myth, culture is just another market-pleasing commodity.
That myth is still standing, but only just. We’re on the cusp of an age of algorithmically derived art and ideas. Machines are increasingly suggesting the most popular topics for human inquiry, and humans are increasingly obeying. Instead of experimentation and novelty, data is leading the way, propelling us toward formula. The myth of cultivation gives way to crass manipulation.
One common reaction to this change is resignation—fatalism in the face of technology’s inevitable march and the shifting habits of rising generations. Criticizing change can feel like an act of fist-shaking grumpiness and standing athwart history. Better to be mature, the thinking goes. Better to accede and make the most of circumstances, to steadily navigate the roiling waters. But writers and editors know, deep down, that the compromises come at too great a price; some readers have a sense that superior alternatives exist. There are moments when we all seem to agree on this point. The election of Donald Trump came with the shock of collective recognition that our media culture has decayed—and a sense that we need more committed protectors of truth than the feckless gatekeepers at Facebook and Google. Grasping the problem is not enough. We need to permit our analysis of the problem to guide us to sweeping solutions before the damage to our most important institutions and values becomes irreversible.