Façade Law

Grace Gold was her name, and she was 17 years old, a first-year student at Barnard College. On the evening of May 16, 1979, as she walked past 601 West 115th Street, a chunk of masonry broke off a seventh-floor lintel and hit her in the head. She died five minutes later.

The building had been flaking apart for some time, and it was not the only one. The combination of age and poor maintenance in a city gone broke meant that many old structures had crumbling concrete and rusting cast iron. Within a year, the City Council passed (and Mayor Ed Koch signed) Local Law 10, requiring that all New York buildings over six stories high have their façades inspected every five years. Now revised and called Local Law 11, the statute has become even more important as age creeps up on the city’s postwar apartment buildings, many of which have sheer brick curtain walls that can become unstable over time. Although the law is not ideal from an architectural-preservation standpoint—landlords have been known to jackhammer carved details off their buildings rather than repair them—Chicago, Philadelphia, and many other cities have followed with similar ordinances.

Fandom1

It all began, as so many stories beloved by geeks do, with an improbable name: Hugo Gernsback. Born Hugo Gernsbacher in Luxembourg in 1884 and emigrating to Manhattan in 1904, he, like so many pioneers of nerd culture, was a Jew who adopted a new moniker in order to fit in with the Gentiles. Gernsback was eminently a man of the Roaring Twenties, a self-made (and self-remade) man, always ready to invest in a new scheme, tuned in to the flash of consumer capitalism, and—perhaps most important—obsessed with technology and the march of progress. He sold mail-order electrical equipment, started one of the country’s first commercial radio stations, and wrote about the concept of television before the invention even existed. But his greatest achievements came in his role as publisher of an array of forward-thinking magazines about tech and the future. It was the latter endeavor that gave birth to the concept of fandom.

One of Gernsback’s publications was Amazing Stories, launched in 1926. It’s generally regarded as the first magazine dedicated to science fiction—or, as he initially called it, scientifiction, a term that never quite caught on. Inside, readers found reprints of stories by pioneers like Jules Verne and H. G. Wells alongside the earliest works of future legends like Ursula K. Le Guin and Isaac Asimov. Youngsters ate the stuff up and, equally valuable, began to form a community around it. Gernsback would print letter-writers’ addresses along with their names, so interested parties could write to one another and circulate their own stories. He was their interlocutor and idol, and when he lost control of that magazine in 1929, they flocked to his new one, Science Wonder Stories. Eventually, these enthusiasts coined a term for their little community: fandom.

Not only did they invent the terminology, they invented the praxis. On December 11, 1929, a tiny coterie of fandom members met up in a Harlem apartment to talk about the present and future of their favorite genre and the wider world of cutting-edge science. They called themselves the Scienceers, and they were the first geek fan club. Some of them started to put out a magazine called The Time Traveller in 1932, filled with adolescent spaceflight dreams. The first thing resembling a fan convention was held in Philadelphia in 1936, but it attracted only a handful of attendees. The first proper “con” was the World Science Fiction Convention, organized in 1939 in conjunction with the World’s Fair in Flushing Meadows, albeit held in midtown (at 110 East 59th Street, to be exact). About 200 people showed up.

All that would be enough to confirm New York’s hold on the title of fandom’s birthplace, but one more event makes it indisputable: In 1964, the first declared comic-book convention was held in Greenwich Village. (Game of Thrones author George R. R. Martin claims he was the first to sign up to attend.) These days, San Diego Comic-Con is the most prominent of the fan conventions, and the World Science Fiction Convention (now known as Worldcon) moves from city to city every year. But New York’s influence is set in stone: The comics awards in San Diego are named after a Brooklyn boy, cartoonist Will Eisner, and Worldcon’s awards for excellence in sci-fi are called the Hugos.

1 How to Navigate New York Comic Con

From Dimitrios Fragiskatos of Anyone Comics

Check the whole schedule ahead of time and make a plan. “If you’re a fan of someone who draws a line, they might be at their booth most of the time, but they could be signing at a lower-profile publisher or bookstore’s table at another point in the day. This could mean a shorter line and more time to spend with them.”

Carry as little as possible. “Going through security will be a pain, and wading through the crowds will be a pain. Minimize your discomfort by having less to look through and less for people to bump into. I’m not saying don’t bring books to get signed or dress up, but don’t bring your entire box for Frank Miller to sign—and cosplayers, don’t bring weapons!”

Every day has its benefits. “To me, Thursday is the best day. Every exhibitor is fresh and enthusiastic, no one has to cancel the rest of their appearances because they’re sick, and there are fewer attendees around. But Sunday is the best day to get good deals. If the item you want survives until Sunday, ask the exhibitor if they can come down on the price. It’ll be worth it for them to bring one less item back with them. Friday is great if you want to attend a ton of panels. Saturday is great if you’re basic.”

Eat elsewhere. “The convention food is overpriced. Step outside to get a cheaper meal. (Not at the McDonald’s; it’s crowded as all hell.) You’ll see cosplayers and creators doing the same, so it won’t feel like you left at all.”

Be part of the comics ecosystem! “You want to learn to make comics? Go to the how-to panel. If one of the artists has a style you dig, go to their table. Buy a commission; watch them draw the Ant-Man you asked for. They’ll tell you about the book they drew. Go to the comic-store tables and get that book. Take a picture with cool costumes along the way. Go visit Paul Rudd and get that drawing of Ant-Man signed. Actors love the comics their movies inspired—trust me!”

Fashion see SEVENTH AVENUE

Federalism

As a mode of government featuring both centralized and regional layers of authority, federalism aims to strike a rough parity between the two. Although the general idea has been with us since the time of Hellenistic Greece, the modern conception of federalism originated in the eighteenth-century equivalent of a New York City tabloid feud.

In 1787, the American republic was still operating under the Articles of Confederation, a “league of friendship” between largely independent sovereign states that gave the central government little power. From the perspective of New York’s Alexander Hamilton and his aristocratic allies, this arrangement had a lot of flaws. For example, when the merchant class tried to shift the burdens of repaying the Revolutionary War debt onto subsistence farmers (forcing many to sell their land to aristocrats at bargain-basement prices) and those farmers responded by violently rebelling, there were no federal troops to force them into submission. What’s more, some of the 13 states had enacted constitutions that awarded power to non-rich white men. (They did not go so far as to empower women or people of color.) Thus, Hamilton and his fellow Founding Fathers pushed for the enactment of a new, federalist constitution for the U.S. republic that would allow for a strong central national government. These “federalists” insisted it was possible to combine the best features of regional autonomy (e.g., the protection of quaint local customs such as enslaving human beings) with those of national rule (e.g., being able to violently suppress unruly peasants).

Not everyone agreed. After the Constitution was drafted and sent to the states for ratification, an anti-Federalist writing under the pseudonym Cato started bashing the proposal in the New York City press. Hamilton responded by teaming with John Jay and James Madison to write a series of pro-Constitution articles under their own Roman pseudonym, Publius, addressing their arguments directly to the people of New York. These “Federalist Papers” detailed the virtues of keeping a strong national government in tension with state and local authorities. Ultimately, Hamilton won his highfalutin flame war, and the U.S. Constitution was ratified. Over the ensuing centuries, it would become a model for democratic constitutions the world over, thereby exporting the “made in NYC” model of federalism to nations as disparate as Australia, Yugoslavia, and Venezuela.

Federal Reserve System

The most powerful financial institution on earth isn’t in midtown (JPMorgan Chase), and it’s not between One World Trade Center and the Hudson River (Goldman Sachs). It’s housed in a neo-Florentine limestone-and-sandstone palazzo between Nassau Street, Maiden Lane, Liberty Street, and William Street. (It’s also the fulcrum of Die Hard With a Vengeance.) The Federal Reserve Bank of New York is first among equals of the twelve regional banks that make up the federal central-banking system and, in the Fed’s early years, was essentially its power base, largely setting the monetary policy for the country.

The U.S. was late to monetary governance—two abortive attempts were made in 1791 and 1816 with the Philadelphia-based First and Second Bank of the United States. And between the Civil War and World War I, American finance was mainly in a state of anarchy, with thousands of banks cobbled together into a ramshackle network with no overarching entity like the German Reichsbank or the Bank of England. Booms, panics, and collapses happened nearly every decade as this free market swooped and dived.

That started to change when a group of bankers, economists, and politicians pushed for reform, and New York financiers like Paul Warburg of Kuhn, Loeb & Co. and J.P. Morgan partner Henry Davison took a leading role. After years of political wrangling, the Federal Reserve that emerged in 1914 was dominated by Benjamin Strong, who left his position as the president of Bankers Trust to become the New York Fed’s first governor, as the job was then titled. The early Fed held the reserves of its member banks, issued Federal Reserve notes (i.e., money), and even lent to banks.

By the 1920s, Fed banks realized that by buying and selling U.S. government debt, they could move interest rates up and down, foreshadowing the Fed’s current “open market operations,” its signature task (one carried out in New York). The forerunner of today’s Federal Open Market Committee (on which the New York Fed president always has a seat) was chaired by Strong. The Fed was designed with its power divided between a board of governors in Washington and the regional banks dispersed from Boston to San Francisco. But thanks to Strong’s geographical and social proximity to the titans of American finance, his connections to financiers in Europe, and his knowledge of central banking, he quickly came to dominate the institution.

While Federal Reserve power eventually settled in Washington, the New York Fed is still where the action is—at least compared with its counterparts in Chicago, Cleveland, or Kansas City. Presidents of the New York Fed have gone on to become Fed chair (Paul Volcker) and Treasury secretary (Tim Geithner). And in our age of bailouts and international crises, the New York Fed’s oversight of and financial connections with its local financial industry—which happens to include, well, Wall Street—give it unique power and influence. When the Fed does business with Wall Street, it does it in New York.

The buying and selling of bonds that determines the federal funds rate happens through the New York Fed, as do the more exotic bailout, rescue, and economic-support plans. It’s no coincidence that the LLCs set up to facilitate JPMorgan Chase’s emergency purchase of Bear Stearns and the bailout of AIG were named after Maiden Lane. And although precious metals no longer have the central place they once did in the global financial system, the New York Fed looks like a fortress for a reason: Its vault, built on bedrock some eighty feet below street level, holds 12 million pounds of gold bars.

Felt-Tip Pen see MAGIC MARKER

Feud, Literary2

Edgar Allan Poe was a drunk, an ephebophile, a hoaxer, and an itinerant literary vagrant, and he invented the art of literary shit talk as we know it in America. Born in Boston and raised and educated in Virginia, he devised the detective story while living in Philadelphia in 1841 and, soon after, moved to New York City, where his polemical, personal tract “The Literati of New York City” was serialized over six issues of Godey’s Lady’s Book in the summer and fall of 1846. Poe named names, described faces, called out frauds, and spared no one whose tastes he deemed too “Flemish” or style too “bizarre.” With a fondness for the deprecating double negative, Poe isn’t entirely ungenerous in his appraisals. His mission was to expose the gulf “between the popular ‘opinion’ of the merits of contemporary authors and that held and expressed of them in private literary society.” He delivered the raw truth about his peers: “The most ‘popular,’ the most ‘successful’ writers among us (for a brief period, at least) are, ninety-nine times out of a hundred, persons of mere address, perseverance, effrontery—in a word, busy-bodies, toadies, quacks.” One of those quacks was Henry Wadsworth Longfellow, but he was spared the full treatment because he lived in Cambridge, Massachusetts, where, due to his position at Harvard, he had “a whole legion of active quacks at his control.” The controversy over Poe’s breaking ranks lasted months.

It also started a tradition of calling out literary quackery. In 1935, Mary McCarthy and Margaret Marshall took aim at the reviewers of their day in a five-part series for The Nation called “Our Critics, Right or Wrong.” It effectively put McCarthy, then 23, on the map, and she would remain one of the country’s most cutting critics for more than five decades. In 1959, Elizabeth Hardwick wrote an essay for Harper’s called “The Decline of Book Reviewing,” which became the blueprint for The New York Review of Books. That same year, Norman Mailer took aim at his fellow novelists in his collection Advertisements for Myself. The essay “Evaluations: Quick and Expensive Comments on the Talent in the Room” rated J. D. Salinger as “no more than the greatest mind ever to stay in prep school” and James Baldwin as “too charming a writer to be major.”

For a brief decade or two, writers were a staple of television chat shows, not as pundits but as personalities, and often those personalities clashed on air. On the Dick Cavett Show, Mailer called Gore Vidal “a liar and a hypocrite” for comparing him to Charles Manson. “It hurts my sense of intellectual pollution,” Mailer said. “As an expert, you should know about that,” Vidal replied. (Although they were epic feuders, neither participant was exclusive: Vidal carried on a long-simmering battle with Truman Capote, and Mailer fought it out with Tom Wolfe, Michiko Kakutani, and virtually everyone else in New York.) In 1979, Cavett asked McCarthy which writers she thought were overrated, and she turned the subject to Lillian Hellman. “I said once in an interview that every word she writes is a lie, including and and the,” McCarthy said. Hellman sued her for more than $2 million, a libel case that ended only with Hellman’s death in 1984. (The feud was later portrayed in a play by Brian Richard Mori, in which Cavett starred as himself.) Social media, of course, is the ultimate in feud-enabling technology. If a day goes by without a literary dustup, it’s only because some outside entity has writers united in their disgust.

2 The Essential New York Reading List

Chosen by New York literary critic Molly Young

Fifth Avenue

see COOPERATIVE APARTMENT BUILDING; PENTHOUSE; ROBBER BARON; SOCIALITE; TRUMPISM

Flagel

Back in the ’90s, the Atkins diet craze hit carbs with the force of a high-glycemic food coma, and a certain dense breakfast staple (see also BAGEL) became Exhibit A. So New York’s enterprising bagel purveyors improvised: Unlike other enemies of a low-carb diet, like doughnuts and pasta, bagels could be made healthier without affecting their most important part—the outer crust—by either scooping out the innards or flattening a smaller amount of dough out to the regular diameter.

Purists ridiculed scooping as “utterly sacrilegious,” but flattening proved a more artful solution. Former Village Voice food critic Robert Sietsema has argued that Tasty Bagel, a legendary outpost in Bensonhurst, Brooklyn, boiled the inaugural flagel in the mid-’90s. But Tasty’s owner, Joe Geraldi, downplays its role, saying customers simply kept special-requesting “just a flat bagel.” Meanwhile, Bagel Boss—a chain founded in 1975 by Mel Rosner, who styled himself a “bagel innovator, not an imitator” and purportedly invented the chocolate-chip variation—claims it has been serving a flagel since 1999. In 2010, the company officially trademarked the name.

In reality, bagels had been deflating for decades, their bakers inspired by old-world flatbreads, among them the Middle Eastern laffa, Turkish simit, Lebanese ka’ak, and especially the elongated Jerusalem bagel. Rosner got his start making bialys, the flat oniony rolls that many non–New Yorkers might mistake for flagels, although their crust is quite different.

Flat breadstuffs ultimately popped up all over town—at Russ & Daughters, Pick A Bagel, David’s Bagels, Zaro’s, H&H Midtown—lending them staying power after all the fad dieters got back to carbs. Thinner than a hero roll, and therefore more merciful on an eater’s jaw, flagels grew popular during the rise of the all-day bagel sandwich in the early aughts. As with the bagel it modernized, the flagel has undergone changes: when the bagel-sandwich craze got caught up in the industrywide “supersizing” wave, lots of flagels followed suit. Today, many are the same weight as traditional bagels, only flatter.

Flashmob

The origins of the flashmob—a brief gathering of hundreds of people at a random site, arranged purely as a befuddling prank—can be dated with precision. On May 27, 2003, the writer Bill Wasik (“bored and therefore disposed towards acts of social-scientific inquiry,” he wrote some years later) created and then forwarded an email to about sixty people, telling them to converge on a Claire’s accessories store near Astor Place the following Tuesday at 7:24 p.m. and disperse at 7:31. The point was not to do anything in particular; the point was to show up for the sake of showing up, creating a harmless sight gag or physical punch line.

That first one didn’t go as planned, because the cops heard about it in advance and dispersed the flashmobbers. But another, two weeks later, indeed came together (two hundred people in the Macy’s rug department for no evident reason), and a lightly anarchic, if somewhat pointy-headed, fad was born. That July, an organizer in Boston called for hundreds of people to enter a department store, each one claiming to need a greeting card to send to “my friend Bill in New York.” More events in other cities followed. The ones Wasik himself arranged ran ten minutes or less, all the better to leave barely a trace as the (always benign) mob dissipated.

In the outsize amount of media coverage surrounding these events, Wasik cited as a forebear Stanley Milgram, the social psychologist known for his experiments in obedience that had asked people to administer electric shocks to others, as well as Allen Funt’s reality-TV progenitor Candid Camera. Certain styles of performance art, notably the 1960s “happenings,” were similarly part of the trend’s DNA. Yet the flashmob was uniquely attuned to its time and place, particularly to the aughts’ hipster ethos of ironic detachment, wherein fine distinctions among small things mattered, yet large issues could be dispatched with a “yeah, whatever.” The flashmob participant took the event seriously enough to show up, while taking very little else, including the flashmob itself, very seriously at all. And like everything hipster-adjacent, it drew the attention of the corporate world. By 2005, the flashmob had been co-opted by the Ford Motor Company, which began arranging “flash concerts” to market its automobiles. Barely a year after Wasik’s first experiment, flash mob popped up in the Concise Oxford English Dictionary.

FM

The electrical engineer Edwin Armstrong is hardly a household name, but this native New Yorker changed the soundscape of America (see also RADIO BROADCASTING) not once but three times over. In 1912, he developed what’s known as the regenerative circuit, which (broadly speaking) allowed a radio to receive weak signals without a lot of extra electronics. He followed that up in 1918 with the superheterodyne circuit, which eliminated a lot of the squeals and chirps that plagued early radio tuners, thus accelerating radio’s shift from a hobbyist attic medium to a mainstream living-room fixture. And in 1933, from his lab at Columbia University, came his biggest breakthrough of all: frequency modulation, an entirely new way of encoding information into a broadcast signal that was far more resistant to interference. Whereas AM signals were perpetually full of static, FM was not.
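
In standard textbook form (not drawn from this entry; the symbols are the usual conventions, with x(t) the audio signal, f_c and A_c the carrier’s frequency and amplitude, and m and k_f modulation constants), the contrast Armstrong exploited can be sketched as

\[
s_{\text{AM}}(t) = A_c\,\bigl[1 + m\,x(t)\bigr]\cos(2\pi f_c t)
\qquad
s_{\text{FM}}(t) = A_c\cos\!\Bigl(2\pi f_c t + 2\pi k_f \int_0^{t} x(\tau)\,d\tau\Bigr)
\]

In AM, the audio rides on the carrier’s amplitude, so lightning crackle and electrical noise land directly in the recovered sound; in FM, the audio varies only the carrier’s instantaneous frequency, and a receiver that clips away amplitude fluctuations discards most of that static.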

Armstrong spent the next decade and a half trying to hang on to his invention. Although he’d grown wealthy licensing his earlier designs to the giant Radio Corporation of America, he and the company could not come to an agreement over FM, and RCA eventually offered its own knockoff circuits based on his idea. Armstrong sued and spent years in court and many thousands of dollars. In 1954, he cracked under the strain and leapt from his thirteenth-floor window on East 52nd Street. Within the year, the respective legal teams settled and Armstrong’s widow received approximately $1 million.

Although the early FM receivers were imperfect—the tuners tended to drift off the signal as their electronics warmed up, requiring repeated adjustment—solid-state technology resolved that problem in the 1950s (see also TRANSISTOR). The new broadcast band gradually came to be the main destination for classical music, jazz, and eventually popular music. In 1978, the band Steely Dan even recorded an ode to it, titled “FM,” with the refrain “No static at all!” Today, the AM radio dial is nearly all news and talk; FM is where the music is.

Folkie3

For many listeners, “folk music” began in the late 1950s at sleepaway camp, a relatively low-cost, vaguely socialist, sanity-saving respite from the urban parent-child continuum. After sunset, a “counselor,” usually a young man with a cantorial voice or a lank-haired woman strumming a Sears Silvertone guitar, led campers in renditions of “The Midnight Special,” “Swing Low, Sweet Chariot,” and, of course, “This Land Is Your Land.”

It never crossed most of these kids’ minds that many of their “folk songs” had originally been sung by enslaved African-Americans and sharecroppers and Celtic-Appalachian dirt farmers. (Some did associate them, vaguely, with leftism and commie rabble-rousers like Woody Guthrie, who had THIS MACHINE KILLS FASCISTS written on his guitar.) A mere three years later, these same adolescents would be sitting in bohemian Greenwich Village coffee houses like Le Figaro and Caffe Reggio, boldly ordering double espressos. Attired in woolly red-and-black Pendleton jackets and corduroy caps, they cast their budding intellectual-political lot in with the revolution. No mindless rock and roll there. They were folkies.

From roughly 1957 through the early 1970s, the intersection of Bleecker and Macdougal was the undisputed nexus of what came to be called the American folk-music revival. San Francisco, Boston, and Chicago had their protest singers, but the Village was the hothouse, the mecca. Some, like Bob Dylan, Joan Baez (who graced the cover of Time magazine in 1962), and Peter, Paul and Mary (whose eponymous debut album topped the Billboard charts for six straight weeks that same year) became voices of the civil-rights movement, appearing before the multitudes at the 1963 March on Washington opening for Martin Luther King Jr.’s “I Have a Dream” speech.

Bob Dylan and Joan Baez, 1964.

PHOTOGRAPH BY BARRY FEINSTEIN

For the rest, it was a scene. The 14-year-old folkie could emerge from the West 4th Street station, a six-foot scarf wrapped around his or her neck, with every chance of seeing Dave Van Ronk, Phil Ochs, Eric Andersen, Janis Ian, Patrick Sky, Mark Spoelstra, Happy and Artie Traum, Carolyn Hester, John Sebastian, Richie Havens, Richard and Mimi Fariña, Odetta, Buffy Sainte-Marie, Guy Carawan, Gil Turner, Peter La Farge, Tom Paxton, or Jean Ritchie, not to mention Doc Watson and an endless supply of hat passers who lived in local walk-ups. With the bomb ready to drop, someone was always protesting something at the Gaslight, Gerde’s Folk City, the Bitter End, Cafe Wha?, the Night Owl, the Cafe Bizarre, or any of the clubs whose leases, everyone said, belonged to Mafia landlords. The focal point of the scene was the Folklore Center, operated by Izzy Young at 110 Macdougal Street, just steps from the Kettle of Fish (at No. 114), a Kerouac-era bar adopted by harder-drinking folkies. There one could buy guitar picks, issues of Broadside and Sing Out!, or a copy of Lead Belly’s Last Sessions or Harry Smith’s epic Anthology of American Folk Music, a six-LP collection of Depression-era 78s. Every “real” Villager knew that here, in these scratchy interfaces with the vanished past, the true hidden American songbook resided, not in that commercial schlock turned out in midtown (see also BRILL BUILDING).

Van Ronk called Young’s place a “clubhouse”: “If you had no fixed address, you would have your mail sent care of the Folklore Center… it became a catalyst for all sorts of things.” Indeed, one cold day in 1961, in its tiny back room, Van Ronk came to share a microphone with the 19-year-old Bob Dylan, who had recently arrived in the Village with a suitcase full of apocrypha about being an orphan and running away from the circus. “Want to do that ‘Fixin’ to Die’?” Van Ronk asked, referring to a wrenching blues number written by Bukka White while he’d been imprisoned in Mississippi in the 1930s. Then the two young white singers from relatively well-off backgrounds bashed into a tune of ultimate powerlessness, harmonizing on lines like “I’m looking funny in my eyes, an’ I believe I’m fixin’ to die / I know I was born to die, but I hate to leave my children cryin’.” In 1963, Columbia released The Freewheelin’ Bob Dylan, featuring “Blowin’ in the Wind,” “Masters of War,” and “A Hard Rain’s A-Gonna Fall.” The album’s cover showed a windswept Dylan and his girlfriend, Suze Rotolo, running through the dirty snow on Jones Street, summing up the scene in a single frame.

In 1965, Dylan, then one of the world’s biggest pop stars, recorded the single “Positively 4th Street,” the final kiss-off to the place that had made him. He’d already gone electric; now he was abandoning protest, calling some of his most famous works “finger-pointing songs.” Its well-known opening zinger, “You’ve got a lotta nerve to say you are my friend,” left a lot of people wondering if he meant them. More likely it was the scene itself, with all those purist rules Dylan could no longer abide. A few years later, he’d be living in Malibu.

The Village folkie scene didn’t die just then (the war in Vietnam continued to rage, and there were many versions of Leonard Cohen’s “Suzanne” left to be sung). It was just that the neo-hobos who’d come to town with guitars on their back, people like Jimi Hendrix, who led the house band at Cafe Wha? in 1966 under the name Jimmy James and the Blue Flames, were now making the kind of music you could dance to.

3 The Essential Greenwich Village Folkie Playlist

Chosen by Hap Pardo, director of musical operations at Cafe Wha?

Fountain Pen

Hard to imagine a time when the only self-contained writing implement was a pencil or a stick of chalk or charcoal. Yet barely 150 years have elapsed since scribes were liberated from the mess of the open inkwell and its dip pen, made at first from a quill and subsequently with a steel nib. A few unproduced or variously flawed products aside—including a potentially usable 1847 model built and patented by Walter Hunt, better known for another pointy invention (see also SAFETY PIN)—the great cursive leap can be attributed to a Brooklyn resident named Lewis Waterman. He’d been selling some of those flawed early pens, and in 1884 he patented the key improvement: a set of fine channels cut into the feed beneath the nib that drew ink slowly and evenly, through capillary action, from an internal reservoir to the paper. Even though fountain pens have long since moved from everyday writing tool to baroque enthusiast niche product, they all still work that way.

Heather Holliday, Coney Island sword-swallower.

Freak Show4

The first presentations of abnormal people and animals date to sixteenth-century Europe, where unusual bodies were seen as comical, rather than frightening or grotesque, and freak shows were lighthearted affairs that accompanied church feasts and holidays. But New York turned freak shows into something else entirely: more extreme, profit driven, fantastical, and cruel. And for that, we can thank P. T. Barnum.

The home of freak shows as we know them was Scudder’s American Museum on the corner of Broadway and Ann Street in lower Manhattan. Barnum bought Scudder’s, renamed it after himself, and began exhibiting people there in 1842. By that time, such displays had been gaining popularity in “scientific” museums around the country. They were often framed as educational, but Barnum saw something bigger: commercial potential.

He put a wide range of people on display, going well beyond the nonnormative bodies that freak shows had once featured. And crucially, he wasn’t afraid to fabricate outlandish—and often racist and ableist—stories to pull in gawkers. Barnum displayed individuals with ambiguous gender traits, entire Native American and Chinese “families,” and the wildly tattooed. In his telling, unusual characteristics became outsize spectacles: “the largest Mountain of Human Flesh ever seen in the form of a woman,” the “mammoth infant,” and the “living skeleton.” One of Barnum’s earliest draws was Joice Heth, a formerly enslaved woman who Barnum claimed was 161 years old and the onetime nurse of George Washington. Heth fascinated the public and helped establish Barnum’s reputation, while he made up a series of tall tales to keep popular interest fixed on her.

Barnum actually managed to make freak shows highbrow. Through a constantly rotating series of human “exhibits” and by paying scientific “experts” to back up his claims, he turned his museum into the ritziest place in New York, one where visiting dignitaries wanted to be seen (it was across the street from Astor House and a few blocks from Delmonico’s, which didn’t hurt, either). But he also wasn’t above marketing his shows to the penny press aimed at working-class readers. His publicity strategy is credited with the spread of freak shows across the country over the next hundred years, but by the mid–twentieth century, the public’s understanding of disability had started to shift and the events fell out of favor. Even now, though, Barnum’s legacy hasn’t entirely disappeared. The Coney Island Circus Sideshow advertises “Freaks, wonders, and human curiosities!” daily throughout the summer season.

4 How to Swallow a Sword

From sword-swallower, strongman, and Coney Island USA Sideshow School professor Adam Realman

“How do you swallow a sword? Very carefully! It requires mind and body control. If you flip that switch in your head that says, Yes! I will accomplish what I set out to do, then guess what: You actually will be able to swallow a sword. That’s the mental. The physical is the series of gag points that your body has—the epiglottis, the esophagus, the sphincter, the stomach. When the sword passes down each one, you’re going to feel like retching. You need to learn to control those.

“You start with a training implement, something softer than a sword, more flexible, so that, should you be able to get it down and you retch or move, it will bend along with your body and not puncture your esophagus. I’ve taught people who within minutes have managed to get the training implement down. Conversely, I’ve taught people who just can’t ever, because they don’t believe in themselves.”

Free Verse

Like many revolutionary writers, Walt Whitman was at first self-published, printing 795 copies of the first edition of Leaves of Grass at his own expense and putting them on sale on July 4, 1855, when he was 36. The book’s explicit sexual content made it controversial. A fellow poet said Whitman should have burned the volume. The Saturday Press called for its author to commit suicide. Ralph Waldo Emerson admired the book but later urged Whitman to tone it down. Whitman was fired from a clerkship in the Department of the Interior because the secretary thought he wrote smut. But the book’s content was less radical than its form. Taking his sense of rhythm from the King James Bible but adhering to no regular meter, Whitman was writing free verse.

Free verse is defined by an irregularity in the recurrence of a poem’s stresses and rhymes. The irregularity is persistent and can be seen on the page as well as heard by the ear when the poem is spoken aloud. Detractors sometimes dismiss it simply as “rhythmical prose.” There were precursors to Whitman’s experiments, notably the liberal use of blank (i.e., unrhymed) verse by John Milton in Paradise Lost and the long, religious poem “Jubilate Agno,” which Christopher Smart wrote in a madhouse between 1759 and 1763 (and of which only thirty-two pages survive, first published in 1939). Though Whitman was largely derided, even shunned in his lifetime, his influence immediately registered in America and Europe. Baudelaire, Rilke, Hopkins, and Eliot became practitioners and advocates of this seemingly nonformal form.

Like many innovations that do away with traditional constraints, free verse is often said to be democratizing, and it did become the dominant mode of poetry in the twentieth century. From William Carlos Williams’s short line through Allen Ginsberg’s homages to Whitman to the “American Hybrid” style in vogue at the turn of the new century, it has become not so much a way for poets to escape the rules as to invent, and instantly reinvent, their own.

Friends see SITCOM

Frozen Custard

When it’s made the classic American way, ice cream calls for milk and cream, sugar, and some churning to add air (see also SUPERPREMIUM ICE CREAM). The French add egg yolks; the Italians, in their gelato, use more milk than cream and stir it less, for a dense smoothness. Early in the twentieth century, however, four Pennsylvania brothers, the Kohr family, introduced an extremely delicious variant. The oldest, Archie, usually gets the credit for tinkering with the ice-cream maker they’d purchased and creating their frozen custard—basically an ice-cream mixture with whole eggs and not so much churned-in air, served slightly less cold than the standard scooped product—though other Kohrs claim they were involved to various degrees. In 1919, they hauled their machine to Coney Island and set up shop.

Coney Island, just then, was itself facing change as it turned from a fancy seaside resort for aristocrats to one frequented by the masses. (The subway reached Coney Island that same year, jump-starting this social leveling.) The nickel cone was the perfect accompaniment to a middle-class day at the beach, and the Kohrs’ frozen-custard business soon expanded up and down the East Coast, reaching 48 stores by 1960. Competitors sprang up too: In Coney Island, the preponderance of these were owned and operated by Greek immigrants. Most of them served a standard complement of flavors (vanilla, chocolate, pistachio) plus a signature alternative (maple walnut, say, or banana).

With Coney Island’s physical decay after the 1960s—not to mention the increasing presence of “soft serve,” that ersatz, eggless, nearly tasteless stuff offered at Dairy Queen and McDonald’s—frozen custard became harder to find in New York. Its life continued mostly at the Jersey Shore and a few other niche markets, including (curiously) Wisconsin, where it built a big following at roadside drive-in shops. But that long slide was reversed in 2004, when the upscale restaurateur Danny Meyer opened his first Shake Shack hamburger kiosk in Madison Square Park. Meyer’s pastry chef Nicole Kaplan developed a frozen-custard recipe for the stand’s cones and milkshakes, one that used higher-quality ingredients than the old Coney Island sellers would ever have imagined. Her custard was a huge hit, and Shake Shack now serves it in more than 200 stores in 15 countries, with more to come.

Frozen Hot Chocolate

Stephen Bruce, Calvin Holt, and Patch Caradine didn’t set out to found a frozen-dessert empire. They were three struggling actors and dancers who opened a café together in 1954, figuring they’d sling some coffees to pay the bills while they waited for their stage careers to take off. All their initial offerings were hot drinks, so someone—nobody quite remembers who—suggested they add a funny oxymoron to the menu: frozen hot chocolate. They came up with a recipe for a rich, slushy drink topped with whipped cream and served in a goblet, and they billed it as a secret formula.

It was not an instant hit. Early visitors to their Upper East Side café, Serendipity 3, were intrigued, if not immediately sold. But within a few years, Marilyn Monroe—then married to Arthur Miller and living a few blocks away—was coming in to sip the drink while learning lines. Andy Warhol was paying for his sugar fix with drawings instead of cash. And Jackie Kennedy found she liked frozen hot chocolate so much that she asked for the secret recipe to serve at a White House dinner. (The owners flatly refused.) These days, its fan base is somewhat less elite: Six couples have chosen to be married in bathtubs filled with the concoction. Serendipity 3 estimates it has served somewhere around 25 million such drinks, including several to Cher, who has come by and gotten them to go.

“Fuhgeddaboudit”

In 1985, the Washington Post writer (and future Donnie Brasco screenwriter) Paul Attanasio interviewed Martin Scorsese, who, in the course of their conversation, answered two questions with rhetorical utterances of exasperated resignation: “Whaddaya gonna do?” and “Forgetaboutit.” The latter appears to be the first published use of a commonplace expression in the New York Italian-American vernacular that had long ago spread to the broader city population. Two years later, Tom Wolfe (see also NEW JOURNALISM) used it in his novel The Bonfire of the Vanities with the innovative twist of the phonetic spelling “Fuhgedaboudit.”

By the early 1990s, it was showing up in the Times, and in 2004, the wisecracking, boosterish Brooklyn borough president Marty Markowitz used it on the LEAVING BROOKLYN sign that drivers can see as they roll onto the Verrazzano-Narrows Bridge. In 2016 it made it into the Oxford English Dictionary with three D’s and one T, plus a usage note clarifying its origins “in representations of regional speech (associated especially with New York and New Jersey).” Notably, it is a word that can be deployed as its own opposite: You can say “fuhgeddaboudit” to mean “It’s no trouble at all, sir,” and, in other contexts, “That’s out of the question, buddy.”