Chapter 10
Attack of the Killer Tomatoes

There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.

—Donald Rumsfeld

It’s easy to look back in disbelief at history’s lost and dated food beliefs—the idea that cinnamon came from giant bird nests (and, if you mixed it with lamprey blood and inedible crust, made for a delectable pie); that honey fell from the sky (and, if you added it to breakfast cereal, became a gateway drug to chronic masturbation and, by proxy, baldness, habitual depression, morbid predispositions, fetid breath, and permanent darkness over one’s wretched soul); that corporations like McDonald’s and Starbucks actually care about consumers’ health or happiness.

Yet despite our litany of progress—our discriminating palates, our trove of food blogs, and our supernatural pantries—things have hardly changed. Our grandchildren, and certainly their grandchildren, will no doubt look back at us with the same wonder and bewilderment we feel when looking back at Kellogg’s Battle Creek Sanitarium or John Smith’s colonial attempts to fish with frying pans.

In 1893, a solid three hundred years after tomatoes were first cultivated in Europe, it took the US Supreme Court to decide whether tomatoes were a fruit or a vegetable. At the time, imported vegetables were subject to a 10 percent tariff to protect American farmers, owing to the Tariff Act of 1883, but in 1887 a tomato importer named John Nix sued the collector of the port of New York to get his money back, arguing that tomatoes were fruits and therefore exempt. And this argument was contested for six years in escalating court battles before making its way to the nation’s highest court, where Supreme Court justices read from various dictionaries and heard testimony from expert witnesses before ultimately ruling that tomatoes were vegetables because they “are, like potatoes, carrots, parsnips, turnips, beets, cauliflower, cabbage, celery, and lettuce, usually served at dinner . . . and not, like fruits generally, as dessert.”

This happened not long after people finally decided tomatoes weren’t poisonous, a belief that had lasted for hundreds of years, owing largely to their botanical relationship to mandrakes and deadly nightshade, which are in the same family and not only are poisonous but were also said to be used in “witches’ brew” and to summon werewolves. In fact, the tomato’s scientific name, Solanum lycopersicum, literally means “wolf’s peach,” from the Greek lykos (“wolf”), which also gave us lycanthrope (“werewolf”), and persicon (“peach”)—while its old German name is Wolfspfirsich. And as recently as the 1860s, widespread rumors warned of tomato crops infested with poisonous worms capable of spitting their fatal venom several feet, causing terrible agony and instant death—but fortunately, the worms turned out to be harmless caterpillars. Another, more credible complaint was that the tomato’s innards were too acidic, causing them to leach toxic lead or copper from dishes and cookware. Meanwhile, others warned simply of their taste, calling them “sour trash” or “odious and repulsive smelling berries.”

Potatoes, which come from the same family, suffered a similar reputation. In addition to their associations with witchcraft and devil worship, they were once thought to cause syphilis and leprosy, largely because of the way they looked, bearing a resemblance to the gnarled hands of lepers and, um, other afflicted body parts. Eighteenth-century Russians called them “the Devil’s apples” and burned them at the stake, while others warned that eating potatoes at night caused mothers to bear children with abnormally large heads or that pinning someone’s name to a potato cursed them to certain death. Meanwhile, wealthy people used them as decoration, growing potato plants in ornamental flower gardens and wearing potato flowers on their lapels or in their hair.

Ultimately, it wasn’t until widespread famine and crop failures forced people’s hands that Europeans begrudgingly offered the potato a place at their tables—even then, many resisted. Peasants in Austria “were threatened with forty lashes if they refused to embrace it,” while Prussia’s king Friedrich Wilhelm I threatened to cut off the ears and nose of dissidents who refused to plant them. In France, a scientist named Antoine-Augustin Parmentier took a softer approach; after struggling to convert skeptics by way of reason and science, he appealed to their sense of envy by serving potatoes to famous people and hiring armed guards to surround potato fields outside Paris, and voilà, now we have pommes frites.

Now, of course, tomatoes and potatoes are the most consumed vegetables in the United States by far, with per capita annual consumption weighing in at about thirty-one pounds of tomatoes and forty-nine pounds of potatoes in 2019, led largely by French fries and tomato sauce. In comparison, the consumption of onions, the third most popular vegetable, is only about nine pounds per capita.

And these weren’t the only ingredients people feared. As recently as the nineteenth century in England, there was a myth that raw fruits were poisonous, with “death by fruit” commonly listed as a cause of death on Victorian death certificates, a belief likely stemming from a 1569 ban on the sale of uncooked fruit to prevent the spread of the plague (which actually had merit, given that it was common practice at the time for butchers to throw leftover blood and entrails into rivers, where they’d often wash up on shores, and that this same polluted water was often used to wash fruits and vegetables).

And the fog has hardly cleared since.

People were afraid to eat Patagonian toothfish, a type of oily cod that was traditionally thrown back by fishermen, until they were rebranded with a sexier name in 1994: Chilean sea bass. This despite the fact that they’re neither technically a bass nor, a lot of the time, Chilean; many come from waters off the coasts of Africa and Australia. Now, of course, they sell for $29.99 a pound at Whole Foods, owing both to their desirability and the fact that this desirability has led to overfishing, going from a global capture of just 579 metric tons in 1979, when they were known mostly to Antarctic scientists, to a peak of more than 44,000 tons in 1995.

The same thing happened with rock salmon (formerly “spiny dogfish”), blue cod (formerly “oilfish”), Torbay sole (formerly “witch”), and orange roughy (formerly “slimehead”). The uni (sea urchin) on your sushi platter used to be called “whore’s eggs” by fishermen, owing to their tendency to accumulate where they weren’t wanted, fouling their equipment; before that, in ancient Greece, sea urchins were metaphors for women’s pubic hair. As David A. Fahrenthold writes in the Washington Post, “Today’s seafood is often yesterday’s trash fish and monsters.”

And the same state of confusion extends to almost everything else we put into our mouths, from pasta to multivitamins.

Certainly consumers are more familiar with spaghetti than yesterday’s trash fish, but that doesn’t necessarily make them savvy or any less gullible. In 1957, for example, the BBC aired a news segment on “spaghetti plantations” as an April Fool’s Day joke, showing footage of cooked spaghetti strands hanging from trees (to which they were affixed with tape) as farmers plucked them for harvest and placed them into baskets to dry—and people actually believed it. So much so, in fact, that the network was overrun with calls from viewers asking where they could buy their own spaghetti trees. Even members of the show’s production crew, who’d been kept in the dark, fell for it. Again, this was in 1957, twelve years after the development of the atom bomb.

In the 1980s, A&W tried to one-up the McDonald’s Quarter Pounder by releasing a third-pound hamburger that was also less expensive and rated higher in consumer taste tests. It failed, however, because Americans are bad at fractions and thought a third was smaller than a quarter.

And in 2016, lawmakers in West Virginia were sent to the hospital after drinking raw milk in a celebratory toast for striking down a ban on raw milk that had clearly been put there for a reason. (Or reasons, among them E. coli, listeria, salmonella, and Guillain-Barré syndrome, which can result in paralysis, kidney failure, stroke, and death.) To be fair, Scott Cadle, the Republican delegate who’d distributed the milk, denied that the incident had anything to do with the milk, telling reporters, “It didn’t have nothing to do with that milk” and “It ain’t because of the raw milk.” And it’s impossible to know for sure, as he flushed the remainder of the milk down the toilet before samples could be tested, which is, apparently, something he normally does with perfectly good milk.

Meanwhile, our top nutritionists still can’t decide whether or not eggs are good for us.

In 1980, the US Department of Agriculture’s “Dietary Guidelines for Americans” consisted of a twenty-page pamphlet stapled in the center, offering such sage advice as “Maintain ideal weight,” “Avoid too much fat, saturated fat, and cholesterol,” “Avoid too much sugar,” and “Avoid too much sodium.”

By 2005, its guidelines had become slightly more specific, recommending that Americans consume less than 300 milligrams per day of cholesterol. Its 2015 guidelines, however, weighing in at a massive 122 pages, removed that limitation, prompting the American Egg Board to boast, “The U.S. has joined many other countries and expert groups like the American Heart Association and the American College of Cardiology that do not have an upper limit for cholesterol intake in their dietary guidelines.”

Except that’s not really true, because the actual guidelines explain that the body “makes more than enough” cholesterol on its own and that “people do not need to obtain cholesterol through foods” before ultimately recommending that “individuals should eat as little dietary cholesterol as possible while consuming a healthy eating pattern.” But that doesn’t mean we should eat none, apparently, because their Healthy U.S.-Style Eating Pattern outlined in Appendix 3, Table A3-1 recommends 2 to 3 cup-equivalents of dairy per day and 13 to 43 ounce-equivalents of meat, poultry, eggs, and seafood per week depending on which of the twelve caloric subgroups you belong to, which you can find in Appendix 2, Table A2-1 by cross-referencing your age, sex, and physical activity level. And their Healthy Mediterranean-Style Eating Pattern outlined in Appendix 4, Table A4-1 recommends 2 to 2½ cup-equivalents of dairy per day and 13 to 50 ounce-equivalents of meat, poultry, eggs, and seafood per week. (Note, by the way, that cup- and ounce-equivalents don’t always correlate with actual cups and ounces. One large egg, for example, counts as 1 ounce-equivalent of eggs, yet, per the USDA’s own guidelines, an egg has to weigh a minimum of 2 ounces, when averaged by the dozen, in order to be called large, so technically, 2 ounces of eggs is equal to 1 ounce-equivalent of eggs. Similarly, 4 ounces of pork is equal to 4 ounce-equivalents, but 4 ounces of walnuts is equivalent to 8 ounce-equivalents.) And if you get out your decoder glasses and scrap paper and do the math, the maximum cholesterol intake suggested with these healthy eating patterns is—surprise—still about 300 milligrams, so nothing has changed other than the level of obfuscation.
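If you want to see just how slippery those “equivalents” are, here is a minimal sketch (in Python, purely for illustration; the conversion factors come straight from the examples above, and the day’s menu is a made-up placeholder, not a USDA recommendation) of what converting actual ounces into ounce-equivalents looks like:

```python
# Illustrative only: ounce-equivalents per actual ounce, based on the
# examples above (2 oz of egg = 1 oz-eq; 4 oz of pork = 4 oz-eq;
# 4 oz of walnuts = 8 oz-eq). The real USDA tables cover far more foods.
OZ_EQ_PER_OZ = {
    "eggs": 0.5,     # 2 ounces of egg count as 1 ounce-equivalent
    "pork": 1.0,     # 4 ounces of pork count as 4 ounce-equivalents
    "walnuts": 2.0,  # 4 ounces of walnuts count as 8 ounce-equivalents
}

def ounce_equivalents(food: str, actual_ounces: float) -> float:
    """Convert actual ounces of a food into USDA-style ounce-equivalents."""
    return actual_ounces * OZ_EQ_PER_OZ[food]

# A hypothetical day: two large eggs (about 4 oz), a 4 oz pork chop,
# and 1 oz of walnuts.
day = [("eggs", 4.0), ("pork", 4.0), ("walnuts", 1.0)]
total = sum(ounce_equivalents(food, oz) for food, oz in day)
print(f"{total} ounce-equivalents")  # 2.0 + 4.0 + 2.0 = 8.0
```

Roughly nine actual ounces of food comes out to eight ounce-equivalents on paper, which is exactly the kind of mismatch that makes a weekly allowance of 13 to 43 ounce-equivalents nearly impossible to eyeball.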

Mind you, this isn’t the USDA’s fault, as they’re simultaneously charged with protecting the economic interests of American farmers and meat and dairy producers and protecting the nutritional interests of Americans, a Sisyphean task. So on the one hand, they’re supposed to encourage us to buy more meat and dairy products—and on the other, to eat less of them. As a result, their messaging is often inescapably convoluted and schizophrenic, reading a lot like fortune cookies or Bill Clinton’s 1998 grand jury testimony: “It depends on what the meaning of ‘is’ is.”

In addition, the USDA has to split food oversight with the US Food and Drug Administration (FDA) according to a matrix of blurred and invisible lines. The USDA is responsible for overseeing nutritional guidance, pepperoni pizza, meat sauces with more than 3 percent red meat, open-faced sandwiches, and catfish, for example, while the FDA is responsible for nutritional labeling, mushroom pizza, meat-flavored sauce with less than 3 percent red meat, closed-face sandwiches, and fish other than catfish. The division of eggs is even more confusing; the USDA is responsible for the grading of shell eggs, egg-breaking and pasteurizing operations, and products that meet the USDA’s definition of “egg products,” such as dried, frozen, or liquid eggs, while the FDA is responsible for the labeling of shell eggs, egg-washing and -sorting operations, and egg products that do not meet the USDA’s definition of “egg products,” such as freeze-dried egg products, imitation egg products, cake mixes, French toast, egg sandwiches (if they’re closed and don’t also contain a certain quantity of meat), and ethnic egg delicacies like balut.
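For a sense of how arbitrary those lines feel in practice, here is a minimal sketch (in Python, purely illustrative and grossly simplified; it encodes only the handful of rules mentioned above, and the real division of oversight involves far more categories and exceptions) of the pizza-and-sandwich logic:

```python
# Grossly simplified illustration of the USDA/FDA split described above.
# It encodes only the rules mentioned in the text; the actual division of
# oversight has many more categories and exceptions.

def overseeing_agency(food: str, red_meat_pct: float = 0.0, open_faced: bool = False) -> str:
    """Guess which agency oversees a food under the simplified rules above."""
    if food == "pizza":
        # Pepperoni (meat) pizza falls to the USDA; mushroom pizza to the FDA.
        return "USDA" if red_meat_pct > 0 else "FDA"
    if food == "meat sauce":
        # More than 3 percent red meat: USDA. Less than that: FDA.
        return "USDA" if red_meat_pct > 3 else "FDA"
    if food == "sandwich":
        # Open-faced sandwiches: USDA. Closed-face sandwiches: FDA.
        return "USDA" if open_faced else "FDA"
    if food == "catfish":
        return "USDA"
    if food == "fish":
        return "FDA"  # fish other than catfish
    return "unclear"  # which is rather the point

print(overseeing_agency("sandwich", open_faced=True))     # USDA
print(overseeing_agency("meat sauce", red_meat_pct=2.0))  # FDA
```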

Plus, in addition to overseeing what amounts to roughly 78 percent of the US food supply, the FDA oversees more than 20,000 prescription drug products, 6,500 separate categories of medical devices, 90,000 tobacco products, and consumer products ranging from perfume, pet food, and deodorant to temporary tattoos, tampons, and microwave ovens. Also laser pointers. So they simultaneously have people regulating the prevention of maggots in consumer foods and the use of medicinal maggots in wound therapy.

As a result of this confusion, they don’t have even a fraction of the bandwidth in terms of tools, funding, manpower, or daylight to do what they’re asked to—so the majority of food facilities under their jurisdiction (56 percent) go more than five years without inspection, and a much larger percentage (about 99 percent) of the imported foods they’re responsible for goes completely uninspected. All this while, according to the CDC, “48 million people get sick, 128,000 are hospitalized, and 3,000 die from foodborne diseases each year in the United States.”

So no agency is really in charge of nutrition, transparency, or labeling, and we’re basically on the honor system. The inmates are running the asylum.

And even if we set aside the political wavering over nutrition and the issues with bandwidth, jurisdiction, and food labeling that journalist Barbara Presley Noble once called “so opaque or confusing that only consumers with the hermeneutic abilities of a Talmudic scholar can peel back the encoded layers of meaning,” we just find more and more layers of discrepancy and ambiguity.

Most experts seem to agree that olive oil is healthy, citing things like monounsaturated fats, antioxidants, and an ability to lower “bad” cholesterol, but that’s only if your olive oil is actually olive oil—and experts say there’s a good chance it isn’t. As Larry Olmsted writes, analysts estimate that between two-thirds and 90 percent of olive oil sold in the United States isn’t what it’s claimed to be and that “virtually every investigation, whether by universities, journalists, law enforcement, or government agencies has found an industry rife with fakery.” So not much has changed since 1820, when Frederick Accum warned that commercial olive oil was often rancid or tainted with lead. In 1959, an estimated ten thousand Moroccans suffered partial paralysis after consuming olive oil that merchants had mixed with surplus industrial lubricants intended for jet engines, and in 1981, more than twenty thousand people were poisoned and hundreds died after consuming Spanish “olive oil” that turned out to be machine oil.

The bright side, if there is one, is that, although olive oil counterfeiting remains rampant, a lot of today’s counterfeiting has to do with faking its graded virginity; i.e., passing off virgin as extra virgin or diluting it with oils that are less expensive but still edible, like canola, sunflower, or soybean. But you should still consider yourself lucky if your organic extra-virgin olive oil is even made from olives, let alone those rated for human consumption and not for use as lamp oil.

Similarly, a lot of experts say that fish is healthy, owing to its omega-3 content, but just as there are a lot of fish in the sea (and rivers and lakes and commercial fish farms), there’s a lot of methylmercury, polychlorinated biphenyls, parasites, agricultural pesticides, microplastics, and toxic algal blooms. Then there’s the fact that high levels of omega-3 have also been linked to prostate cancer.

So choosing healthy seafood isn’t as simple as eating oysters only in months with the letter r in them, an adage that goes back at least to the 1500s, from the advice to eat only oysters “that growes upon great ships bottomes, or in places not muddy; in those moneths that haue the letter R. in their names,” which was mostly a precaution against eating raw seafood in the summer prior to the advent of refrigeration—or, as Anthony Bourdain advised in Kitchen Confidential, never ordering fish on a Monday unless you’re eating at Le Bernardin, because most seafood vendors don’t deliver on weekends.

As nutrition professor and James Beard Award–winning author Marion Nestle writes, “To make an intelligent choice of fish at a supermarket, you have to know more than you could possibly imagine about nutrition, fish toxicology, and the life cycle and ecology of fish—what kind of fish it is, what it eats, where it was caught, and whether it was farmed or wild.”

Yet even if we did know all this, it wouldn’t make much of a difference, because fish fraud is also extremely rampant.

In 2008, two high school girls in Manhattan collected fish samples from restaurants and grocery stores for a school science project, preserving them in alcohol and sending them to a university lab for genetic fingerprinting, and found that half of the local restaurants and 60 percent of the grocery stores were selling mislabeled fish—including at least one endangered species.

Others who’ve run similar tests have come to similar conclusions. In 2012, researchers who collected 142 fish samples from New York restaurants and grocery stores found that 94 percent of the tuna, 79 percent of the snapper, and 20 percent of the salmon they ordered turned out to be other fish. In fact, seventeen of the eighteen fish sold as white tuna turned out to be escolar, also known as oilfish or “ex-lax fish,” a species that’s banned in Japan and Italy (and that the FDA advises against importing) because it contains toxins and indigestible wax esters that can cause diarrhea, abdominal cramps, nausea, headache, and vomiting. Some fish sold as red snapper and halibut turned out to be tilefish, which is on the FDA’s do-not-eat list for “women who are or might become pregnant, nursing mothers and young children,” owing to its high mercury content.

And this isn’t just limited to New York. In 2007, samples of fish labeled monkfish in Chicago turned out to be illegal and potentially deadly puffer fish, sending some customers to the hospital. And in 2016, an Inside Edition investigation of twenty-eight restaurants across the country found that in 35 percent of the lobster dishes sampled, the lobster had been replaced with cheaper seafood. In the most egregious cases, one Florida restaurant’s lobster rolls were made from a frozen mixture of lobster, whiting, and pollock (the last two being common ingredients in frozen fish sticks), and one restaurant in New York’s Little Italy sold “lobster” ravioli that was filled only with cheese.

Meanwhile, a lot of restaurants, including Red Lobster, have gotten into hot water for replacing lobster ($24 a pound) with langostino ($4 a pound), which is about two inches long, is more closely related to the hermit crab than to the lobster, and is also known as pelagic crab or squat lobster.

“As a seafood expert, Red Lobster understands that the seasonality and availability of lobster can fluctuate, so our Lobster Bisque can contain meat from Maine lobster, langostino lobster, or, in some cases, a combination of both,” explained a company spokesperson. “INSIDE EDITION’s test was a matter of what we call ‘the luck of the ladle’ and both types of lobster provide the bisque with a rich, sweet taste that our guests love.”

So once again, not much has changed in the last thousand or so years. In 1499, Henry VII had to issue a statute banning the sale of painted fish because fishmongers were painting and varnishing the gills of spoiled fish or brushing them with blood to make them look fresh. Other tactics at the time included blowing air into fish or stuffing them with fresh fish guts “as to make skinny, flabby fish look plump and fat”; fattening limp and watery lobsters by stuffing fresh haddock and wooden skewers through cracks in their tails; or using skewers to join pieces of broken lobsters and plugging the holes with wood.

Before that, in 1272, Edward I banned fishmongers from watering the fish on their slab more than once, a practice that preserved their appearance while adding costly water weight and accelerating spoilage. Corrupt vendors who were caught watering their fish were either fined or, after having their fish smelled by a jury of peers, sometimes put into stocks with their unscrupulously treated fish burned beneath them.

Today, of course, it’s not just fish that are watered down to add weight and volume but meats, vegetables (both “fresh” and canned), honey, and fruit juice. In 2013, Consumer Reports found that, on average, nearly half of the advertised weight of the canned foods they examined came from the packing liquids (e.g., the tuna water, not the actual tuna). Meanwhile, consumer advocates in the United Kingdom have reported frozen chicken breasts containing as much as 40 percent added water.

Even the vitamins in our food aren’t to be trusted. In addition to the vitamins and dietary supplements sold in grocery stores (taken by more than half of American adults to make up for a lack of nutrients in their diet), a lot of the foods in grocery stores contain added vitamins. Tropicana, for example, makes orange juice with added calcium and vitamin D, healthy heart orange juice with omega-3 (because what goes better with orange juice than tilapia, sardines, and anchovies?), vitamin C and zinc orange juice (“to help support a healthy immune system”), pineapple mango juice with probiotics, and apple cherry juice with fiber. Dannon makes probiotic yogurt and kids’ cotton candy–flavored smoothies with added vitamin D, and a lot of breakfast cereals are fortified with vitamins and minerals.

Nestlé even has a helpful chart explaining how the vitamins in their cereals help release energy; contribute to healthy skin; help the nervous and immune systems work properly; reduce tiredness; and contribute to healthy blood, bones, and teeth—and in regard to Nesquik cereal, Nestlé writes, “We believe in kids’ creativity. That’s why NESQUIK Cereal helps nourish their mind with Vitamins B3, B5, B6 and Iron in those delicious chocolaty balls.”

But some studies—conducted by people who do not sell vitamins or cereals for a living—suggest that the vitamins in those delicious chocolaty balls might actually be harmful.

For example, one study published in the Journal of Clinical Oncology found that men who supplemented their diet with high doses of vitamin B for ten years nearly doubled their risk of developing lung cancer. Another study found that women who supplemented their diet with vitamin B were 10 percent more likely to die during the study.

Other studies have found that large doses of vitamin B can cause nerve damage and liver disease and that too much calcium and vitamin D may increase the risk of heart disease.

Meanwhile, a 2014 analysis by the Environmental Working Group, “a non-profit, non-partisan organization dedicated to protecting human health and the environment,” warned that the percentages of fortified vitamins listed on boxes of children’s breakfast cereals were based on “woefully outdated” adult guidelines from 1968—and that a single serving of some cereals contained levels of vitamin A, zinc, or niacin that exceeded the tolerable upper intake levels for children set by the Institute of Medicine.

As for some of the supplements sold in supermarkets, DNA testing has shown that many contain none of the ingredients they claim to. A 2015 investigation by the attorney general of the state of New York found that only 21 percent of the store-brand herbal supplements tested contained DNA from the plants listed on their labels, including just 4 percent of store-brand Walmart supplements. Meanwhile, 35 percent of the supplements tested contained fillers and contaminants not identified on their labels, including things like rice, beans, pine, and powdered houseplants. Other studies have found lead and arsenic in prenatal vitamins.

So either vitamins are good for us or they kill us or they’re not even vitamins.

And the same is true for foods like olive oil, red snapper, and monkfish. (Fortunately, we’re pretty certain they can’t give us syphilis or be used to summon werewolves, so it’s not as though we’ve learned nothing about food in the last few hundred years.)

Now, all of this may seem incredibly depressing; surely, it’s not fun to realize that our favorite type of sushi might actually be oilfish and our gummy vitamins might eventually kill us. But if we’ve learned anything from history, it’s that this adversity is nothing new. Certainly, it’s tempting to picture our ancestors in perfect harmony with nature, but nature has always been trying to kill us—and every generation before us has faced culinary dangers of its own, whether from toxic roots, scarcity (e.g., crop failures, wartime supply issues), tainted meat and water, failed freezing experiments, or simply failing to pack proper fishing gear and having to resort to frying pans.

And, ultimately, it’s these struggles that pushed them to adapt and persevere, thus paving the way for apple pie (and edible crust), vanilla ice cream and Rocky Road, and extra-chunky tomato sauce—not to mention smaller jaws, bigger brains, holiday traditions, and modern civilization. . . .