HOG HEAVEN
Food, like sex, is deeply personal. Unlike sex, however, it is part of public ritual and responsibility. We eat publicly but make love privately. The ways in which we copulate are, by and large, without public consequence, yet how and what we eat is an important force in shaping and reshaping culture, as it shapes and reshapes our individual bodies and experiences. Food tells a people’s collective story in the same way that the molecules we eat assemble the body. The air we breathe and the water we drink (with luck and care, in the best of circumstances) are more or less the same, but food is endlessly varied. Its particularities tell much about our place in the world, just as it carries information about the changing nature of the world itself. Food is the primary incursion of the physical world into the individual.
Given what we have seen of agriculture—how it has pushed past the boundary of the planet’s arable land in so many directions—is it any surprise to find that it has overrun our bodies as well? How could it be otherwise? As we have seen, industrial agriculture grows commodities, not food. The fact that commodities do not meet our bodies’ needs is irrelevant.
Throughout much of agricultural history, but especially during the past forty years, agriculture has evolved along a course independent of human needs, as a force of its own, with a will of its own. Modern agriculture does not exist to serve human demands. The writer Robert Pirsig points out that to a hog in a pen it must appear that he has enslaved the farmer. Why else would the guy show up twice a day with a bucket full of feed? The hog believes this until the day he dies.
Coincident with the rise of industrialism, people started to see food less as the connection between one’s body and the natural world and more as a barrier between humans and the imagined savagery of the natural world. This had more to do with industrialism’s demonization of nature than with a new view of food itself, but this shift is not just academic. It changed what we eat.
To accommodate this shift, food would have to be processed—refined, purified, distilled, transfigured—into something other than its natural form. We can perhaps mark the beginning of this process with the work of Sylvester Graham, a Presbyterian minister who stumped the United States’ eastern seaboard in the first half of the nineteenth century, proselytizing for what we would today call a fad food. Graham’s enemy was nothing more than indigestion, and he blamed this then-feared affliction on improper diet (reasonable enough) and sexuality. Graham advised people to avoid highly seasoned foods on the grounds that spices stimulated carnal appetites. Sexual activities, according to Graham, provoked the inflammation that ruined one’s physical and spiritual health. His prescription? Vegetarianism, a link between puritanism and herbivory that stretches to our time. I mention him here, however, by way of introducing a particular food he invented and touted as the key to health, the Graham Cracker. Indeed, he was on to something here, in that his cracker, different from the commercial variety that survives today, represented a return to coarse, high-fiber flours that even by Graham’s time had been replaced by refined flours. He reasoned that the refined flours were too dense a package of nutrition, and he was right.
However, the future of processed foods from Graham on would follow exactly the opposite course, and all that would survive of his legacy was the idea that a food might be invented that would, in its waferlike purity, contain salvation. In this sense the line from Graham to modern processed foods is direct and unbroken. In 1863, James C. Jackson invented a breakfast food—made from Graham’s flour—that he called “granula.” The product was pretty awful and had to be reconstituted in milk overnight to be palatable to all but the most self-flagellating of puritans, but it set wheels in motion. A Seventh-Day Adventist and acolyte of Graham’s, Dr. John Harvey Kellogg, followed through with his own cereal, something he called “granola,” which he served in his sanitarium at Battle Creek, Michigan. When Jackson sued him for patent infringement, Kellogg moved on to wheat flakes, then to corn and rice cereals. Right behind Kellogg came the advertising genius Charles W. Post with his own line of processed grain cereals.
The spirit of all of this invention gets satirized in T. Coraghessan Boyle’s novel The Road to Wellville, and Boyle had ample material. The Battle Creek, Michigan, phalanx of the movement linked up with some contemporaneous German theorists in an unholy alliance of Prussian and Puritan sensibility that spawned sanitariums and vegetarian clubs promising to purify the bodies (and capture some of the income stream) of industrialism’s ascendant bourgeoisie. The movement completed industrialism’s triumph over all things natural by conquering food, which was to be reduced to something unrecognizable either by machine or by an individual’s brute force. British prime minister William E. Gladstone spearheaded the movement to chew each bite thirty-two times, once for each tooth, as a sort of mathematical symmetry (as if English food were not already sufficiently removed from the natural). An American businessman, Horace Fletcher, gave his name to the extreme oral processing of food, “fletcherization.” Under this doctrine, thirty-two chomps was regarded as the barest minimum of mastication. Fletcher once testified that a particularly stubborn shallot he encountered required 720 jaw strokes to be brought to heel.
The point of all of this was elimination, that is, to get that nasty bit of nature through the body as rapidly as possible. The Victorians were mightily concerned with the motion of the bowels. Kellogg considered three daily sit-downs to be a mark of good health. His parallel concern reveals itself in his preoccupation with the digestive systems of young boys, for whom he prescribed enemas and mineral oil, not only to promote elimination but to prevent masturbation. Fletcher, meanwhile, boasted that his oral rigor gave him odor-free stools, a claim he offered to substantiate by mailing samples to anyone insisting on empirical evidence.
Any of us who has been much exposed to modern food faddists will find more than campy nineteenth-century fun in this bit of history. As Boyle accurately portrays, the faddists cloaked themselves in pseudoscience, attempting to participate in the main business of legitimate science in that era: perfecting nature. In the late nineteenth and early twentieth century, diet became science. Home economists, extension agents, and chemists set about cleaning up the mess that was nature. These scientists quickly identified unnatural diets along ethnic lines, citing the cuisines of Italian, Slav, and Jewish immigrants to the United States as particularly egregious. Further, they scientifically determined that an efficient and healthy diet consisted of the very dishes—boiled, baked, and bland—that could be found right on their very own tables in WASP New England. The proselytizers devised entire cookbooks of scientific cuisine.
The food writer Jeffrey Pilcher reports that immigrants remained completely resistant to this evangelism, preferring their pastas, red sauces, horseradish, and chilies. The ascendant white middle class, however, provided ready converts in its housewives, who were the real targets of this movement anyway. Newly recruited to the science of food, they were attuned to claims of “new and improved.” In this way, taste was commercialized and industrialized.
Betty Fussell characterizes this period as the time of our learning to speak “the language of commodities,” the language of corn in particular. Fussell’s own family was, like Graham’s, Presbyterian, and in The Story of Corn she captures the fervor with which they adopted this new ethic:

Infused with the spirit of the brand-name age, my family shouted hallelujah in daily worship of the processed box, bottle, and can. Presbyters who believed that wheat thins and grape juice could be transubstantiated into His Body and Blood could believe anything, anything but the equation of the “natural” with the “good.” Fallen nature required redemption, and Christ was the soul’s processor. We praised the manufacturers who, in imitation of Him, worked miracles of transformation. Instead of turning water into wine, they turned corn into starch, sugar, and oil.

The proto-processors focused their efforts on commodities—that is to say, grains—even before people came to think of them as commodities. No one took, say, carrots or tangerines or broccoli to a mill and attempted alchemical transformation. Chemically, those items are too complex to lend themselves to reconstitution. They are food. Yet from the very beginnings of agriculture, grain, even fresh off the stalk, was not yet food. It wanted fermenting or grinding or baking. Hunter-gatherers could be nomads because they could pluck nourishment straight from limb or bone. Grain-based nutrition required sedentism as much to process grain as to grow it. It never was immediately food but, rather, a raw material.
Not all grains, however, are equal in this regard. Wheat and rice—rice especially—are relatively close to being useful in their natural state. Rice requires only hulling and cooking; wheat only hulling, grinding, and cooking. To a degree, this is also true of some forms of corn, as traditional Central and South American agriculture knew and still knows. Yet these same people processed corn, even during agriculture’s ancient incarnations.
Central to Aztec cuisine was the nixtamalization of corn, a trick that modern food processing still does not fully exploit. We know nixtamalized corn as the hominy grits of traditional Southern cooking, but its history is far older. The Aztecs knew enough to process corn by cooking and steeping it in a solution of lime or wood ashes. These steps removed the tough outer covering of the corn kernel, but they also freed the kernel’s bound niacin, a B vitamin, and made its tryptophan, an amino acid the body converts to niacin, more available. When corn became a staple of the European peasantry, economic downturns were marked by outbreaks of the diseases pellagra and kwashiorkor, the results of niacin and protein deficiency, respectively. This occurred because, in hard times, the peasants lived on corn alone. In the New World, the poor among the Aztecs had spent centuries living on this diet, yet remained healthier, largely because of nixtamalization, but also because their cuisine always mixed maize and beans, thus providing complementary forms of protein.
The Europeans did not adopt the Aztec process because they thought its sole purpose was to remove the skin of the corn, a task their powerful mills could handle just fine. Still, from the beginning, people processed corn, and that, coupled with the surpluses of corn that would accrue, steered the crop toward commodification as a raw material for factories.
The primary ingredient of grain (and, for that matter, of potatoes, now the world’s fourth-most-important crop) is starch, a complex carbohydrate. It is fuel. Separating that starch from fiber, germ, and hull is a relatively easy matter. The ancient Chinese and Egyptians could extract starch from rice. American colonists imported wheat starch from Europe to powder their wigs. In 1840, an Englishman named Orlando Jones invented a process that, like nixtamalization, used an alkaline solution to extract the starch. Colgate & Company first applied the process to wheat in the United States, but an employee saw the potential for corn and set up his own cornstarch factory in Oswego, New York. The fine, white powder spread through the culinary world about as rapidly as another fine, white powder would spread through the late-twentieth-century drug world. It was the white stuff, a hyper-refined, pure-as-snow additive for a refined, middle-class diet. Marketers even then began rewriting recipe books with cornstarch as a main character.
Even in this embryonic form, the corn-processing industry bred oligarchy. The first of these oligarchs was a marketer, Gene Staley, who established his Corn Products Company in 1906 and helped to spawn today’s corn and soybean conglomerates. But an additional development was needed for conglomeration. Starch is a complex carbohydrate; sugar is a simple one. That is, one could derive sugar by breaking down starch, obviating the need for sugarcane and the dependence on trade with tropical agriculture that it implied. In theory, but not yet in practice.
Corn syrup came into use as a substitute for molasses in the nineteenth century, after the invention of a relatively simple process for converting cornstarch to dextrose. By the time of the Civil War, the manufacture of corn sugar was a common process, but production really didn’t take off until chemists took the next step and learned to convert dextrose to the much sweeter fructose. The initials ubiquitous today on processed-food labels, HFCS, stand for “high fructose corn syrup,” which was first commercialized in 1967 by the Clinton Corn Processing Company of Clinton, Iowa. Then ADM picked up the ball and ran with it.
The commercialization of corn syrup completed the commodification of corn. Each kernel was now a raw material to be disassembled and fed to separate output streams. The yellow skin and other parts go into vitamin supplements (necessary now because our food is processed), but especially into animal feeds. The rest of the kernel is separated from the germ (the actual seed) and processed into cornstarch or sugar. The germ is squeezed for its oil. Oil, starch, and sugar became the triumvirate of the Corn Products Refining Company, the brainchild of a marketer who would use these three to rewrite the design of American cuisine, first by branding it. The company gave us Mazola Oil, Karo Syrup, and Kingsford’s Cornstarch. Company flacks wrote cookbooks based on these products and sold cooks on the advantages of products “untouched by human hands” in the new antiseptic factories. The starch, syrup, and oil became the basis for Bisquick, Aunt Jemima Pancake Mix, and a slew of other “convenience” products. Because Karo was a syrup, as opposed to granular sugar, and came in a clear as well as a dark form, it lent itself to all sorts of unique company-invented concoctions, such as taffy, fudge, and “divinity.”
This collection of processes was in place by the time America developed its fascination with, to use a slogan of the time, “Better living through chemistry.” (Some would argue that the fascination developed as a result of these technologies.) The manufactured, the refined, the convenient all became symbols of status. Until these products came along, ready-prepared foods were the purview of the rich, the people glimpsed in movies who had maids and cooks. With processed foods, any middle-class housewife could put Aunt Jemima to work in her kitchen.
The phenomenon of eating Jell-O, for instance, is difficult to explain. Jell-O is a tasteless blob of reconstituted cow’s hooves artificially colored, sweetened, and flavored, served in its most revered form with lumps of corn syrup called marshmallows. Even more difficult to explain is that in the Midwest, Jell-O became a status dish, the sort of offering a beaming farm wife would bring to a church social. When I was a child in the Midwest, a row of fake-copper Jell-O molds hung in most kitchens like a collection of family crests. All difficult to understand, until a midwesterner older than I explained to me the origin of this status. To make Jell-O, one needed a refrigerator, something not at all common in the generation before mine. Taking Jell-O to the church social was a way of publicly announcing that your family could afford a refrigerator.
The food processors were not offering nutrition; they were offering the illusion of wealth, stability, and order, and consumers became willing accomplices in the plot. (It is worth remembering that the corporate marketers were simply occupying a niche, taking advantage of cheap surplus commodities to turn a profit. Agriculture itself created the niche. The marketers would have been equally happy selling most anything else, as later conglomerations of these same corporate entities would demonstrate. For instance, Philip Morris, the tobacco company, is now the owner of many of the food companies that pioneered this era of processed foods in the immediate postwar years.) Conning cooks into using up America’s corn crop, however, was an uphill fight. The effort to dispose of agriculture’s surplus in the human body predates the golden age of processed foods.
With the 1920s came the flapper era, and thin women became fashionable, a fact that alarmed the U.S. government. The Bureau of Home Economics of the United States Department of Agriculture, by then already a fully equipped propaganda machine for the “scientific diet,” began urging women via home economists to eat more wheat—attempting to persuade them, in food historian Harvey Levenstein’s words, “to start chomping their way through the wheat surplus.” Indeed, this effort characterizes the USDA’s course throughout the twentieth century. The agency formally had two charges: expanding markets for farm products and attending to nutrition. These roles were at odds with each other, because increasingly in the United States “farm products” meant surplus commodities—wheat and corn—and consumption of large amounts of these subverts nutrition. Repeatedly, the USDA settled this conflict by ignoring nutrition.
Women were not so amenable to the USDA’s suggestions, however, even when hard times followed the flapper years. Middle-class women, at least, still ducked their duty of chomping through the wheat surplus. Oddly (or not so oddly, if one begins to grasp a history of contradictory relationships between affluence and food), the Great Depression, with its highly publicized bread lines, was also an era of dieting, comparable to our own. With women refusing to cooperate, government turned to the poor, and in not very subtle ways.
Franklin Roosevelt’s initial efforts to feed the nation’s hungry, then called “relief,” hit a brick wall with conservatives, among them the nation’s remaining farmers, who were themselves by then the recipients of multimillion-dollar federal payments designed to support farm prices. A spokesman for Indiana farmers, for instance, framed their opposition to relief for the hungry in the early 1930s by claiming the poor would spend the money on “cigarettes, malt, and other non-necessitous things.” About the same time, the government bought and slaughtered six million young pigs and dumped them in the Mississippi River. It bought milk and poured it on roadways. The farm lobby that benefited from these purchases opposed giving the food to the hungry on the grounds that it would undermine markets.
When this began to produce an outcry from a public by then familiar with the sight of gaunt children and bread lines, Roosevelt established the Federal Surplus Relief Corporation in late 1933. Farmers, believing it would gut their markets, fought the new program and eventually gutted it. It reemerged, but only after the word “relief” had disappeared. The new agency became the Federal Surplus Commodities Corporation, a name that left no doubt of its primary mission, and was placed firmly in the control of the USDA. The government had overcome farmers’ opposition to feeding the poor by giving the program a new goal: disposing of surplus grain. The program evolved rather early into a system of food coupons in service to its conflicting goals. Participants got two kinds of coupons: the first, geared to wheat and other surplus commodities, had to be used up before the second, good for foods such as vegetables and fruit that were not surplus commodities, became valid. Disposing of surplus farm products would remain the system’s paramount goal until the Nixon administration and the federal food stamp program gave the poor the power to buy a range of food products, not just surplus commodities.
Once, with all the certainty a college freshman’s knowledge confers, I offered in conversation with my maternal grandfather the mildest sort of criticism of Franklin Roosevelt. I got a firestorm response from a man who had weathered the Depression feeding his five children sometimes as an autoworker, sometimes as a trapper, and sometimes on relief. People who knew the real poverty of the Depression saw FDR as a saint. I remember clearly what those surplus commodities packages looked like into the sixties, how the boxes would come home from the distribution centers and be repackaged or put away in a cupboard, out of sight.
His wife, my grandmother, was fat and diabetic. Late in his life, my grandfather also became diabetic. Even then it was not unusual for poor people to be obese from a cheap, starchy diet. In 1969, the year I graduated from high school, The New York Times featured photos of fat people applying for food stamps, a paradox most Americans couldn’t swallow then and still can’t.
Our ability to dispose of surplus, and farming’s ability to extend its footprint into our bodies, expanded and matured, aided directly and indirectly by the government. The foreign front opened with a law called PL 480, which was passed in 1954 and still exists. It allows for the dumping of grain in the developing world, ostensibly as a relief effort; but as has been demonstrated time and again, this cheap grain acts as a burden on development, not only bankrupting local farmers but also providing a resource for parasitical governments. By the time our government was pushed into beginning reforms of these ham-handed methods, the business of surplus disposal had been greatly refined. The poor still bore the brunt of the effort, but a range of marketing tools allowed some expansion into the general population, a necessary move as the stock of poor began to decrease in the United States. Part of what allowed this expansion was an evolution of those fad diets of the thirties.
The first half of the twentieth century featured a rise of progressivism and faith in science. We and our leaders believed that humankind was perfectible, and science gave us the tools for perfecting it. That perfectibility, of course, extended to our bodies, so opportunity beckoned for the science of nutrition. Food faddists like Kellogg, Graham, and Fletcher begat a new and more respectable generation, many of them the products of new home-economics departments at land grant colleges. No longer was eating taken for granted, nor was enjoyment a primary goal. Eating became serious business.
In the period before World War II, Americans began paying attention to vitamins, especially the B vitamins, after Dr. Russell Wilder of the Mayo Clinic completed some cursory experiments that suggested benefits. B vitamins took on the status of a miracle drug, hailed as, literally, a “pep pill.” Similarly, the term “balanced diet” came into popular usage, and the USDA issued a series of pronouncements, continuing to this day, prescribing proper intake levels of carbohydrates, protein, and fat (in ratios weighted, then and now, toward whatever happens to be in surplus). In the wake of these developments, America went into World War II every bit as food obsessed as the most committed health-food eaters of today. From the beginning, nutrition was consciously and conspicuously a part of the war effort, including a high-profile federal program for “nutrition defense.” Nutritionists saw their role as an advancement of a similar effort from World War I, when the official slogan had been: “Food Will Win the War.” A better slogan for the new enlightened age, the experts decided, was “Vitamins Will Win the War.” Almost immediately, food producers tapped into the effort. The meat industry, for instance, mobilized both pollsters and advertising to convince people that meat should be the centerpiece of a well-balanced meal.
The federal Nutrition Division devised a seal of approval, a drawing of Uncle Sam in profile downing a forkful of food with the slogan “US Needs US Strong, Eat Nutritional Food.” Producers and processors flooded the agency with requests to display the logo on their packaging, and most were successful, although only after some thoroughly bureaucratic embellishments. The Doughnut Corporation of America, for instance, was denied use of the logo if it used the term “Enriched Donuts” but allowed to use the logo when it called its product “Enriched Flour Donuts.” With this sleight of hand, doughnuts became healthy enough to go to war, just as ketchup would become a vegetable during the Reagan administration. The net result of all this maneuvering was that commercial interests quickly became bound up in the “science” of nutrition, and with them came advertising, food fads, hype, and profound changes in American eating habits.
Dietary change was most conspicuous among people in the military, the impetus for all this talk of nutrition in the first place. Early in the war, draft boards found themselves rejecting 40 percent of their candidates, 33 percent of those for health reasons such as tooth decay. Justified or not, diet caught the blame for this, and the military responded with perhaps the best diet a soldier ever faced. It was abundant and hypercharged with nutritionists’ favored foods of the day, such as milk and meat. More important, though, this new diet was being uniformly administered, making it arguably the biggest melting pot experiment of the twentieth century. The millions of men and women in the services, most still in their formative years, came from a variety of ethnic and economic backgrounds, yet probably ate better than they ever had before, and it was a diet formally backed by the government as the soundest science could devise. The habit stuck; the abundant, high-protein diet would become the postwar standard.
Meanwhile, on the home front, similar changes were afoot, understood best against the backdrop of the Great Depression. There were constant debates throughout the Depression about the degree of malnutrition in the country, but evidence suggests the fears were overstated. There was hunger, but most of it was confined to the poorest economic groups, which had always been hungry. The majority of people had enough food, though it came from lower on the food pyramid than during flush times. As the war heated up the economy, however, rising income was quickly converted to better food. Meat consumption rose sharply in the immediate prewar years.
Scratch any person’s memory of the war years, and you are sure to hear stories about rationing—of tires and gasoline, but especially of food. Having lived through rationing sticks in the memory as a sort of badge of honor, and the two foods most often mentioned in this context are sugar and meat. Yet these memories usually fail in some details. The usual beef ration was about two and a half pounds per person per week, which by current standards is a glut. Poultry and fish weren’t rationed, so no one had to scrimp on protein. The average British person at the time got a pound of meat a week; most of the rest of Europe went without.
The more curious development was the war’s effect on Americans’ attitudes toward sugar. Although a variety of foods were rationed on and off throughout the war, most foods were not in short supply; the rationing largely stemmed from rumors, black markets, and the resultant panic buying and hoarding. The declaration of war alone brought, almost overnight, a nationwide run on sugar, and only sugar. People rushed to stores and bought hundred-pound sacks of the stuff, and so sugar became the first commodity rationed, at a half-pound per person per week. From a nutritional standpoint, this is about a half-pound more than a person needs, and certainly enough to cause, for instance, the rampant tooth decay recorded as the reason for many draftees’ rejections. Honey, molasses, and syrups were still available. Yet this bit of rationing looms in surviving memories as deprivation.
All of this points to a pivotal shift in America’s food habits. Justified or not, we came out of the war craving sugar and meat, but also with the rapidly and almost universally expanding incomes to command those foods. More subtly but no less importantly, we came out with a strengthened belief that eating this and eating that could enhance not only our well-being but also our status. We accepted the notion that food could be improved. We came out of the war with a consensus as to what constituted the good life, especially with regard to food, and we came out with the tools to get it.
Immediately after the war, Europe and Asia, including our allies, faced a good old-fashioned famine, a simple lack of bread. Indeed, there was starvation. Our government knew this and prevailed upon the farmers it had been subsidizing through the years to channel surplus grain to Europe. Grain growers and Americans in general responded by not so politely telling the government to go to hell. There was more profit to be made in selling wheat to livestock growers to fatten the beef that the Greatest Generation had come to see as its birthright, just as there was intense political pressure to keep the meat supply high.
 
 
Somewhere during the mid-century ascendancy of the consuming generation, a nasty realization must have occurred to those in the food business—a realization slowly percolating since Karo Syrup’s advent. Mass consumption is wholly dependent on mass advertising and, in turn, on branding. How does one brand a pear as Acme’s Peerless Pears when Acme’s pears are pretty much the same as Polly’s New and Improved Peerless Pears being offered by the competition? Evolution, breeding, and farmers make a pear what it is, and not much can be added by a label.
(There is, of course, an exception to this, but it turns out to be the elusive rule-proving exception. United Fruit Company did indeed brand bananas with the label Chiquita, and did use advertising to change habits of consumption. The skim milk and banana diet invented and hyped by United Fruit was the nation’s most popular fad diet of 1934, a poll showed. United Fruit, however, had to go to the trouble of cornering the market, which involved dotting Central America with its company-owned and -operated banana plantations, and taking over a government or two in the process, not the sort of thing Acme or Polly could accomplish with pears. Pears are grown in the temperate zone, not in banana republics.)
Beyond the issue of differentiating and branding, there was a second inherent problem. Demand for food is what economists call “inelastic,” meaning that as incomes increase, people don’t spend much more of that extra income on food. As incomes increased in the early war years, people did indeed spend more on food, buying more meat, fruits, and other quality items, but this, too, had a limit. Once the consuming public worked its way up the pyramid to meat, there was nowhere else to go. In any event, the extra money spent was not going to food processors.
The solution the food processors devised to both branding and inelasticity was value-added manufacture. That is, instead of food, they began selling services, in the guise of processed food. People might not be able to buy much more food, but they could buy more convenience, and layered just beneath the pitch for convenience was the message that they could buy status. Harvey Levenstein summarizes the era’s promise of upper-class leisure for middle-class women:

An executive of the American Can Company told the assemblage [the 1962 Grocery Manufacturers Association convention] that “the package revolution” had helped give the American family not more time for women to work but “more time for cultural and community activities.” Charles Mortimer, head of General Foods, boasted that “built-in chef service” had now been added to “built-in maid service” implying that housewives could now lead the lives of the leisured upper class. Even in 1969, when it had become the norm for married women to work, the chairman of the board of Corn Products Company saw the “social revolution” convenience foods had brought only in terms of the full-time housewife. “We—that is the food industry—have given her the gift of time,” he said, “which she may reinvest in bridge, canasta, garden club, and other perhaps more soul-satisfying pursuits.”

The revolution the executive was commenting on here was a done deal in 1962, toward the end of an era Levenstein calls “The Golden Age of Food Processing,” and what another food analyst calls “the Velveeta cocoon.” This chemically assisted better living greatly depended on groundwork already laid, some of it quite deeply. The problem of differentiation, the Peerless Pear dilemma, could be partly solved by packaging, and the groundwork for that trick is as old as human evolution. Specifically, manufacturers played heavily with the colors of packaging. After one study (this process was nothing if not scientific) found that the only cars that could be distinguished from atop the Empire State Building were painted two colors, two-tone packages became the norm. Additional studies found that women responded to red, and men to blue, so package ink reflected this knowledge. In all of this, though, even if they were unaware of it, processors were tapping into that deep link between color and survival that was the basis of hunter-gatherer expertise. Using red to signal good food was about as subtle as selling Playboy with bare breasts.
Just as important, though, the processors relied on government-laid groundwork. Their biggest asset was the surplus of commodities, especially corn and wheat, which were simply raw energy, a blank culinary slate or platform. Properly reduced, then fortified with flavorings, preservatives, sweeteners, and packaging, these commodities could become anything manufacturers wished them to be. The public was well primed; the government had spent two generations convincing them of the value of manipulated food, telling them that flour must be enriched with vitamins, that chemical magic could build strong bodies twelve different ways and win wars. Our nation was already populated with, if not food faddists, food fetishists ripe for the picking. And the ups and downs of Depression and wartime rationing and postwar boom, of want followed by plenty, had produced a generation that would eat nearly anything if enough sugar were added. Not that the manufacturers were limited to sugar; by 1958, an inventory found 709 synthetic chemicals commonly used in food processing.
The strategy worked. The value added to food by manufacturing increased 45 percent between 1939 and 1954. Almost all of the increased spending on food during the fifties went toward manufacturing costs.
 
 
The writer Susan Allport tells of meeting a Hadza man, a hunter-gatherer, while she was traveling in Africa. “I was immediately struck by his direct manner, his easy good humor, and his black teeth,” she says. Allport was looking at modern evidence of one quirk of hunter-gatherers: if they can locate bee trees, honey becomes an obsession. Allport asked the man about his favorite foods, which he listed as meat and honey. He reported drinking three mugs of honey a day.
Honey and other sugars are the wild cards in diet. Hunter-gatherers hear a sort of internal chant directing them to find more food, because within a normal range of forage, it takes all the food one can scare up to satisfy one’s hunger. All bets are off with sugar, the refined essence of food, pure energy that answers that internal chant. Evolution does not equip us to deal with abundance; we have to learn to do that ourselves. The food writer Claude Fischler notes that “Societies of abundance are tormented by the necessity to regulate feeding…. They are at one and the same time impassioned over cuisine and obsessed with dieting.”
 
 
Marketers’ energies during the postwar years were directed toward the middle class for the same reason that Willie Sutton chose to rob banks. Yet if one has the tools to make commodities into whatever one wishes, then those tools can work just as well when incomes fall as when they rise, especially when sugar and corn are cheap. An extreme manifestation of this notion emerged in the sixties when food scientists began hatching grand schemes to create simple, chemically aided food for the Third World masses. Scientists touted a product they named Incaparina, a stir-and-serve powder made of cottonseed, corn, and sorghum that when mixed with sugar provided nutrition close to that of a glass of milk. UNICEF came up with Saridele, a soybean extract for infants. Chemists devised a fishmeal flour made of “trash fish” and set up a plant in Chile. The American delegate to the UN passed out chocolate-chip cookies made with the stuff. International Telephone and Telegraph devised Astrofood, a highly sweetened cupcake that made its way into school lunch programs in the United States in 1967.
All of these concoctions sound much like the livestock feed being formulated at the time. Indeed, the attitude has been that if we can concoct some form of basic gruel (preferably one that chews up surpluses) to maintain a stock of poor people sufficient to provide cheap labor and a stock of hogs and cattle sufficient to meet the culinary needs of the better folks (and somehow show a profit in the bargain), then the problem of world nutrition will have been solved. At least this approach might solve the problem of foreign hunger. The problem of feeding the domestic poor would require a bit more finesse.
As the golden age of food processing peaked, the problem of what to do about the poor suddenly became more acute, albeit not well understood. The poor, or at least the poorer, at that point were not the only folks standing in food-stamp lines. There were also people with jobs. In 1973, the real wage of American workers peaked. It has declined steadily from then until now. Ironically, this was an era in which the upper classes emerged from the Velveeta cocoon: cookbook sales soared, restaurants and diets proliferated, and tastes went global. A better-educated and rebellious bunch of baby boomers had begun insisting on something more than steak, potatoes, and Jell-O. The well-off and well-educated demanded better food, and they got it.
Everybody else got fast food and sugar.
The growth of the fast-food industry probably needs no documentation, especially not in a book. (Most Americans are far more likely to encounter the former than the latter.) Most people have seen the outskirts and now centers of every village and town spired with golden arches and the attendant cookie-cutter architecture. Ninety-six percent of America’s schoolchildren can recognize Ronald McDonald; only Santa Claus surpasses him in name recognition. Internationally, the clown scores 80 percent among children. A study in the United States, the United Kingdom, Germany, Australia, India, and Japan found that 54 percent of respondents could identify the Christian cross; 88 percent, the Golden Arches.
In simple numbers, Americans spent $6 billion on fast food in 1970 and more than $110 billion in 2000. This latter figure exceeds what Americans spend annually on higher education, personal computers, or cars. It’s also more than they spend on movies, books, magazines, newspapers, videos, and recorded music combined.
As we saw with the development of fish-and-chips in England, the advent of fast food does not represent a sudden break with the past, especially in light of the social overlay; fast food or something like it—largely filled with whatever happens to be in surplus—has long been the food of the working class. Potatoes are the mainstay of most fast-food operations, as they were when fish-and-chips took over the feeding of working-class Britain. The highly touted beef of fast food is actually almost waste beef; hamburger is made from the very poorest cuts left over after the steaks and such have gone to upscale buyers. (Even McDonald’s fries relied on waste beef—suet—until health concerns and public awareness stymied the practice. Then McDonald’s switched to vegetable oil—but added a synthetic flavor designed to reproduce the aura of beef fat.)
If Ray Kroc had a genius, it was not for invention but for spotting existing trends and tailoring his business, especially his marketing, to feed on them. Food processors had already spent a generation marketing the status and convenience of their products before Kroc came along; one can hear clear echoes of the maid-in-a-box strategy in McDonald’s philosophy. “Working-class families could finally afford to feed their kids restaurant food,” says John F. Love, a company historian.
McDonald’s and all the rest of the chains had a clear reason for being interested in the “working class,” and not just as a market. The central strategy of the chains is to dumb down every operation, to process food and freeze it at centralized factories so that it can be thawed and served at restaurants by unskilled labor. As a consequence, there are no unionized McDonald’s, and the average wage of fast-food workers is now the lowest of any sector in the United States, including migrant farm workers.
That control of labor extends to the farm. The chains so dominated their markets that they could control the means of production. Farmers were drawn into “forward” contracts in which processors supplied seed and then dictated how it would be grown, when it would be harvested, and, of course, how much it would cost. The result has been a strong vertical integration and corporatization of farms. Using its buying power, the fast-food industry prompted a wave of potato-price crashes that reworked the social landscape of potato farming in areas like southern Idaho’s Snake River plain. During the past twenty-five years, the number of potato farmers in Idaho has been cut in half while the total area devoted to growing potatoes has increased, as has the number of corporate-owned farms. In his book Fast Food Nation, Eric Schlosser reports that “The patterns of land ownership in the American West more and more resemble those of rural England. ‘We’ve come full circle,’ says Paul Patterson [an agricultural economist]. ‘You increasingly find two classes of people in rural Idaho: the people who run the farms and the people who own them.’”
Similar market penetration into the beef industry has reshaped its social structure from farm to fork. The most telling of the many statistics one could cite is that the suicide rate among American farmers and ranchers is three times the national average.
The fast-food industry also worked a field already plowed by the earlier generation, advertising—particularly television. TV not only transmitted the new culture, it became the culture, advertising fast-food products and serving up prepackaged entertainment to an audience that watched as passively as it ate.
In its essence, though, the fast-food trend is sugar. Just as it tapped changes in the labor market, fast food’s rise tracked developments in food processing, especially, as we have seen, Archer Daniels Midland’s decision in the seventies to go whole hog into the business of converting corn to sugar. Cheap corn made cheap sugar, which became the common denominator of fast food, appearing in everything from “special sauces” to hamburger rolls. As the trend has evolved, and as consumers have been trained, fast-food firms have become less subtle, putting only the merest façade of food on their sugar. Now sugar needs only the addition of water and a bit of coloring and flavoring to become marketable: soda is the high-profit item of the fast-food business, a fact that has also benefited convenience stores. Even more widespread than fast-food joints, they have stripped away the veneer of food altogether, going straight for the high-profit “big gulp.” A thirty-two-ounce soda and a tank of gas is America distilled to its seminal fluids.
Not that fast-food sellers are standing on the sidelines. They are capitalizing heavily on a hole opened by the soda manufacturers, specifically Coca-Cola, Pepsi, and Cadbury-Schweppes (which makes Dr Pepper). These makers figured out that the expansion of their business (their goal is 25 percent per year) simply could not be maintained by a market of adults only, so they began a campaign to reach children, principally through television, but also, and more insidiously, in the schools. By contracting with school districts, soda manufacturers get schools to host soda machines in exchange for a cut of the profit, thereby giving them a financial stake in increased sugar consumption by the kids they are supposed to educate and protect. It works.
Meanwhile, at fast-food places, larger and larger sodas have become a marketing gimmick and profit booster. In the 1950s, a customer got an eight-ounce soda with his fast-food burger. Today, a child’s meal includes a twelve-ounce soda, and a large soda is thirty-two ounces. Soft drink consumption has quadrupled in the United States over the past fifty years. Just since 1978 the intake of a typical teenage boy has tripled to more than twenty ounces per day. One-fifth of the nation’s one- and two-year-olds drink soda routinely, some of it from baby bottles.
 
 
I began planning this book after a single jolting experience. I had been traveling in Peru and Chile for a couple of weeks, researching efforts to feed the poor in the developing world. It’s odd how a few weeks abroad can subtly remove the salient points of America from one’s view. Never did I expect to encounter culture shock on reentering my own country, but I had gotten off an all-night flight from Lima at Los Angeles and begun working my way toward a domestic connection, when I found myself overwhelmed with just such a shock. It was based on a single observation: that Americans are fat.
Obesity is pandemic in the United States, and is now officially recognized as a leading threat to public health. There is a tendency to frame this problem in economic terms, especially when one has spent time sitting in the mud huts of people who live on less than a dollar a day. Corpulence is indeed an economic artifact, but in exactly the opposite way that one might have expected: obesity in the United States is not a mark of wealth; it is more reliably a mark of poverty.
Doctors who have grappled with the problem of obesity cite a variety of causes; some even disagree on the problem’s distribution among social classes. Clearly, there is more at work than simply surplus corn and sugar. There is a complex psychology to weight, so much so that in 1978 the magazine Psychology Today reported that food had replaced sex as Americans’ leading source of guilt. We are, after all, not evolved to deal with surplus and have deep-seated drives to consume as much as possible, to lay in stores for hard times. This sets up a classic love-hate relationship, a paradox, and we can see evidence of conflict all around us. Young upper-class and middle-class women suffer in record numbers from eating disorders such as anorexia and bulimia, while obesity is rampant among other groups.
Yet those same doctors and researchers struggling with this problem tend to cite a common denominator: sugar consumption. On average, Americans today get about 16 percent of their total calories from added sweeteners, mostly corn syrup. Current dietary recommendations say we should limit that to 6 to 10 percent. The biggest source of added sweeteners is soft drinks.
A second common denominator, especially among obese children—and in that age group the problem is truly epidemic—is sedentism, or more to the point, inactivity, because fat kids spend more time than other kids watching TV.
We have seen a decline of the pernicious and ham-handed methods of the 1930s, when the poor could not get food unless they agreed to eat their way through the surplus stock of corn and wheat, because our methods of ensuring this same outcome have become more sophisticated. The methods are new, but the goal itself is as old as agriculture.