Chapter 1

Emory’s ducking-below-the-parapet strategy had much to recommend it. We could keep our heads down, shuffling through the world in camouflage like soldiers wearing dun in the desert, duly observing every new linguistic prohibition and suppressing perceptions of our species once prevailing, now retrograde, the better not to stand out. We could constrain our confidential heresies to small gatherings of the like-minded, rigorously vetted beforehand and convened rarely, with doors locked and phones off. By 2012, I had long since inculcated some of those “good habits” Emory had commended, never articulating outré ideas or off-color jokes in texts or emails, which made my correspondence dull. The sandstorm could peak and subside while we hunkered down. If in historical retrospect we wouldn’t be seen to have distinguished ourselves, we’d have plenty of company in that department, so the chances of a blanket amnesty were high. Most of all, we will have survived.

All very well save for the fact that curling up in a ball and waiting for the craze of intellectual egalitarianism to go away cut against the grain of my nature. Moreover, social hysterias do not stand still. If they are not yet losing steam, they are getting worse. And this one was getting worse. Radical movements keep ratcheting up their demands, because nothing enervates a cause more than success. Crusaders resent having their purpose stolen out from under them by the fulfillment of their quest; reaching the promised land leaves seekers bereft. There’s little to do in a utopian oasis but sip coconut water. So the journey must never be completed. The goal must remain out of reach. To preserve the perfect impossibility of getting there, the desired end point becomes ever more extreme.

Sure, many of the cultural casualties seemed mere bagatelles. My son was upset by the cancellation of the Darwin Awards. This annual catalog of the stupidest ways people had improved the gene pool by departing the human race was now deplored as the modern equivalent of the minstrel show. But then, his partiality to the goof website may have been nomenclatural. The spiking of a long-scheduled feature remake of The Three Stooges was neither surprising nor a grievous civilizational loss, the slapstick vaudeville act being an example par excellence of grotesque belittlement of the otherwise for cruel gladiatorial entertainment.

As its patronizing know-it-all protagonist shamelessly advertised his “exclusionary intelligence,” the British drama Sherlock met the same fate. A few bootleg DVDs might still be kicking around, but no one would admit to watching such a hidebound horror today, so it’s easy to forget how popular the series was when it first aired. Yet the debut episode fatally coincided with the near-universal ideological pivot in the summer of 2010 among the newly minted anti-intelligentsia intelligentsia. I gather association with the hateful stereotype he portrayed deep-sixed Benedict Cumberbatch’s acting career for the foreseeable.

Me, I reserved my personal grief for The Big Bang Theory. The uncommonly witty sitcom for grown-ups had been going strong since 2007 and, until 2012, had shown little sign of flagging. Yes, the writers tried desperately to make their scripts more “relevant” by having the objectionable fatheads in the cast make big mistakes—though the very concept of a “mistake” was becoming problematic—and by introducing a token alternative processor whose less readily recognizable intelligence was always showing up the defective thinking of characters who brandished PhDs. None of these brave efforts saved the series in the end, because the show was still witty, and wit itself had become suspect. When CBS replaced the program with Young Sheldon, in which the conceited physicist in the original is shown to have been a perfectly ordinary little boy no more capable than his classmates, no one watched it, but at least no one marched with placards in front of the network demanding it be canceled, either. Hungry for unimpeachably anodyne fare (ideas for which, I’m told, are surprisingly hard to come up with), the network was, last I checked, still filming all those scrupulously unexceptional primary school children into a twelfth season.

Of course, the repudiation of Sheldon Cooper and his stuck-up chums was just the beginning, and the cull was two-pronged. First, any portrayal of elevated intelligence even in eternally rerun classics had to be expunged for being an expression of cerebral supremacy. The scriptwriters of Family Guy had their haughty genius baby Stewie meet a swift crib death. Embarrassed by their participation in historical prejudice, Paramount claimed to have unearthed a last episode of Star Trek: The Next Generation in the late Gene Roddenberry’s papers that had never been filmed. In this much anticipated add-on finale, Data, who ostensibly stores all the known information in the universe, is fed the very last bytes that will perfectly complete his data set: the spelling of the word “Mississippi.” But he hasn’t room for even one more fact and his head explodes all over the starship. By contrast, the intellectual chauvinism in the original series of Star Trek and its spin-offs was so interwoven into every episode that, rather than excise all the scenes in which a certain pointy-eared wisenheimer Vulcan appears, the production company extirpated the whole shebang. (The commercial withdrawal of figurines, fancy-dress costumes, and other lucrative merchandise was a grave loss for the franchise, since Spock had been their biggest seller. Black-market box sets of the show itself now sell on the dark web for thousands.) Niles and Frasier Crane were irredeemably brain-vain snobs, along with the contemptuous ex-wife, Lilith: NBC proudly announced good riddance to all eleven seasons of Frasier’s eponymous show.

Second, portrayals of dunderheads obviously got the chop. Dumb and Dumber was one of the very first films to be prefaced with a warning about offensive representations of cognitive inferiority—which the uninitiated interpreted as one more gag; after the audience consistently laughed at the caution, censors offed the whole movie cold. Not only was Rain Man disappeared, but the Academy withdrew the 1989 Best Actor Oscar from Dustin Hoffman, going so far as to demand the return of the statuette. (The fact that his character of Raymond wasn’t meant to be a dummy but an autistic savant was far too fine a distinction by 2012.) They did the same thing to Tom Hanks, in defiance of a small but vocal campaign maintaining that his portrayal of Forrest Gump was politically redemptive. If Forrest wasn’t very smart, he was very wise: another differentiation too subtle by half for the times. Then there were the shows canned for embodying both eggheads and pinheads. The Simpsons was damned twice over, for Homer the doofus and his bookish daughter, Lisa. Gilligan’s Island played on the now unacceptable opposition of The Professor versus the airhead first mate. The Road Runner Show relied on the same cognitive polarity, so even sly ground cuckoos and less than wily coyotes weren’t safe.

I realize that most of you reading this—assuming that anyone is reading this—would have noted many of these vanishings as they occurred. Still, out of sight, out of mind, right? It’s worth remembering, then, how much of our pop-culture canon has been shoved down the historical garbage disposal. We’ve lost all those archetypal characters who were famously two fries short of a Happy Meal: Woody Harrelson’s namesake in Cheers, Chevy Chase’s Clark Griswold in National Lampoon’s Vacation, Steve Martin’s Navin in The Jerk, Rowan Atkinson’s Mr. Bean, even the starfish in SpongeBob SquarePants, for Pete’s sake. Betty White’s Rose in The Golden Girls, Matt LeBlanc’s Joey in Friends, Leslie Nielsen’s Dr. Rumack in Airplane! . . . Who still remembers “Don’t call me Shirley!” or “Looks like I picked the wrong week to quit drinking”? Even aw-shucks Barney Fife on The Andy Griffith Show was tossed on the trash heap for being a crude stereotype. I sometimes wonder what uproarious films we’ve all missed out on since actors like Ben Stiller, Adam Sandler, Jim Carrey, and Sacha Baron Cohen retired in disgrace.

It took me a while to notice as well that even hagiographic documentaries and biopics had dried up, because it’s easy to overlook what people are not doing. Like, they were not filming tributes to Leonardo da Vinci, Marie Curie, James Watson, Isaac Newton, or Alexander Graham Bell. So it wasn’t just that Ron Howard’s A Beautiful Mind went bye-bye; no one was scripting life stories of other tortured geniuses like Galileo or Alan Turing. The snide conceit ran that all these supposed icons were merely regular schmoes who’d tripped over whatever they were credited with creating or discovering by accident. Michelangelo’s fruit seller could have painted the Sistine Chapel, too; he just hadn’t felt like it.

If the many clones of The Calumny of IQ that colonized the bestseller list in 2012 seemed behind the curve, that was because it had been just long enough for lumbering commercial publishers to commission and release tomes like The Collapse of the Cognoscenti or How to Be an Anti-Smartist: A Practical Workbook. As children’s literature had got in on the action, too, for her seventh birthday Wade’s parents sent Lucy a copy of All My Friends Are Clever—which, thanks to all that passive cleverness with which the girl was now purportedly endowed, she could not read. Few of you will have forgotten some neurologist’s Getting Our Minds Right, because the MRI scans he included, demonstrating that everyone’s brains were essentially identical, went viral on Twitter and occupied twenty minutes of the “Canvas” segment on the PBS NewsHour. On the other hand, it’s doubtful you recall Walter Isaacson’s chunky biography Steve Jobs, because it triggered another huffy boycott and sank like a stone. Naturally, Fifty Shades of Grey kept right on selling, because it was stupid.

Meanwhile, it must also have been around 2012 when I finally concluded that an across-the-board media dumbing down (look, live with it) wasn’t all in my imagination. The diction that guests employed on talk shows was noticeably pared back. They preferred shorter words and shorter sentences. Whenever they allowed thoughts to dribble off on an ellipsis or to degenerate into outright incoherence, hosts nodded approvingly. News anchors were restricting their vocabulary to punchy Anglo-Saxon like “hit” and “dog,” while eschewing more esoteric words like “ratiocination” and, well, “eschewing.” Systematically eliminating complex sentences and dependent clauses, newspapers followed suit, so that stories about a shooting at a Colorado movie theater in July read like the “Look, look! Go, Sally, go!” primers that Darwin and Zanzibar were too old for by the time they were four: “James Holmes was watching a movie. The movie was The Dark Knight Rises. This is a Batman movie. Mr. Holmes shot twelve people dead. He also injured seventy people.” Oh, and doubtless somewhere in such an article the reporter would have inserted the standard disclaimer that “Mr. Holmes has the same mental ability as everyone else in Aurora.” With one glance at eyes that might have been propped open with two-inch toothpicks and hair dyed the flaming orange of Bozo the Clown (may the “gross caricature of alternative processing” rest in peace), anyone who hadn’t been brainwashed within an inch of his life could tell the pro forma media assurance was a screaming lie. If ever there was such a thing as a total muttonhead, James Holmes was it.

At some point that year, I picked up a copy of The New York Times only to discover that the crossword puzzle was no longer printed in the Arts section. I flapped through the whole edition to find the puzzle not moved but abolished. A search of the website turned up an explanatory apology from the paper’s ombudsman. Failure to complete even the easy-peasy Monday crossword had rained untold trauma on the readership since 1950, while the larger, more demanding Sunday version had left the preponderance of subscribers anguished that they might have “something wrong with them” or “something missing.” Sure enough, acrostics, anagrams, and sudoku puzzles had also vanished from print media altogether, presumably because they fostered a bigoted self-congratulation in puzzle solvers and a gloomy, psychically deleterious sense of inadequacy in the stumped.

Of greater moment than the ransacking of our television schedules: the Democratic Party’s apparatchiks had concurred by January that Barack Obama had become a liability. The president was aloof, snooty, and supercilious. Never having gotten the memo about suppressing that silver tongue, he still deliberately rubbed the popular nose in his own articulacy. Either he was failing to track the national mood or he just didn’t like the mood. Frantic advice from his press secretary notwithstanding, he continued to convey the impression that he thought he was smarter than the average bear. However challenging it may be to recall now, in many a previous era having a leader who was outstandingly astute, eloquent, and well informed would have seemed to any country’s considerable advantage. Yet by 2012, appearing as anything but one of the folks was electoral death, because the whole notion that one might want to look up to anyone in a position of authority had become preposterous. Worse, the president’s effortless cool and dry sense of humor had the same effect as those New York Times crosswords: he made voters feel that in comparison there was “something wrong with them” or that they had “something missing.”

I’m reasonably sure that for party functionaries to convince a sitting president’s own VP to challenge the incumbent for the nomination in the primaries was a historic first. Nonetheless, I’d be the first to agree that Joe Biden was an ideal fit for the times: he was impressively unimpressive. In contrast to Obama’s irksomely inspirational oratory, Biden’s speaking style was delectably leaden. His version of profundity was to make a prosaic point and then repeat it word for word. Whenever the vice president was at a loss, his compulsive insertion of “C’mon, man!” conspicuously failed to seem rousing and hip, and that election year any practice that underscored a shortcoming was the ticket. In other words, the more poorly Biden campaigned, the more voters he won over. One especially ingratiating appearance on The View, in which the VP neglected to utter a single complete sentence, while his salad of sports metaphors left the audience utterly at sea over metaphor-for-what, accumulated millions of hits on YouTube, secured the nation’s adoration in perpetuity, and guaranteed that Obama would lose the South Carolina primary by a shocking margin. The speech to the nation in which the president announced his withdrawal from the race only emphasized his transparent unelectability—because the short address was deft, elegant, and droll. The fact that Obama had class was merely one more reason to hate the guy, remember? Along with any other attribute associated with preeminence, class was out of fashion.