Harvey A. Levenstein
The First Colonists: A Subsistence Economy
Europeans visiting the United States for the first time are often struck by the relatively large number of obese people one sees there, and they remark that Americans seem overfed. Although being revolted by girth is a fairly recent phenomenon, the idea of America as a land of food abundance is not. Throughout the nation’s history, those who have praised or criticized its diet have agreed on that. Americans have proudly defined themselves as “People of Plenty,” and this has primarily meant plenty of food. On the other hand, America has also been a home for countless schemes to limit partaking in this abundance—ranging from the zaniest of food fads to the most sober scientific theories.
Although the first settlers, in the early seventeenth century, suffered some difficult years, most of the inhabitants of Britain’s American colonies rapidly established themselves as much better fed than their counterparts in the motherland. The ink was hardly dry on the French naturalist Buffon’s theory of how everything born in the New World was a stunted version of that of the Old when the colonists’ abundant diet was helping to refute it. When the colonies rebelled against British rule in the 1770s, the American soldiers were, on average, much taller than both the British soldiers facing them and the French troops who came to their aid. Indeed, by the time of the American Revolution, white Americans had just about achieved their modern height—due mainly to the nutritional advantages of the land of abundance. Even their poorly treated black slaves were taller than European peasants and laborers. They were also much taller than the slaves of African descent in the Caribbean and South America. But the citizens of “the First New Nation” did not need comparative height data to prove that they were better fed than Europeans. The idea that their land provided them with enormous amounts of food was already enshrined in their folklore. It was natural, said the Philadelphia physician John Bell in 1793, that Americans, living amidst “superabundance,” should be “great eaters.”
For most Americans, abundance meant lots of meat—mainly pork and, to a lesser extent, beef—accompanied by breads made from corn (maize), rye, and, increasingly, wheat. Fruits and vegetables were plentiful in season (although in much of the country those seasons were not very long), and the immense stocks of fish and seafood along the coasts provided coastal communities with additional sources of protein. For much of the winter and spring most people in the middle and northern sections subsisted on a diet of preserved meat and bread, supplemented mainly by beans and root vegetables that they stored in cold cellars. Although they sometimes complained of this dietary monotony, they still took pains to impress foreigners with their abundant supplies of meat. In 1793 an impressed French traveler estimated that Americans ate seven to eight times as much meat as bread. The dearth of fresh vegetables was not regarded as much of a loss, for, like their British forbears, Americans were suspicious of fresh vegetables and preferred to cook them (and fruits, too) until they were almost mush. Indeed, despite the growing proportion of immigrants who did not originate in the British Isles, British traditions would prevail for almost the next two hundred years. An impressive number of new foods and tastes would be assimilated into the national diet, but mainly on British-American terms.
Modernization and Early Food Reforms
By the 1830s, impressive improvements in transportation—roads, canals, and steamboats—and agriculture were transforming the subsistence economy in which most of the country’s farms operated into a cash one. A much wider variety of foodstuffs was now available for much longer periods of the year, particularly to the middle class of the towns and cities, which were swelling in the wake of this transformation. Perhaps, then, it was a feeling that they were being engulfed in agricultural plenty that made this new middle class so responsive to the first of what would be many attempts to regulate and restrict the national diet.
The food reform movement that arose in the 1830s and 1840s now seems quintessentially American, for it linked calls to avoid foods that science had deemed deleterious with strivings for moral purity. Its most famous advocate was the Protestant preacher Sylvester Graham, whose scientific ideas were derived from the “vitalist” theories then circulating in France. He based his initial crusade—against alcohol—on the “vitalist” idea that the nervous system contained a force upon which all life depended. Alcohol, he said, overstimulated the nervous system and sapped this vital force, leaving the body prey to disease, debility, and death. He soon expanded the indictment to include other forms of nervous stimulation—particularly sexual activity and the consumption of meat and spices—adding a scientific dimension to traditional moral-religious prescriptions for sexual prudery and vegetarianism.
Graham’s ideas also fit in well with the Romantic intellectual currents of the day, for he was suspicious of any food that had been altered from its natural condition. The cultivation of wheat, never too successful along the eastern seaboard, had boomed in the trans-Appalachian West, and Americans had been rapidly forsaking breads made of corn, rye, and whole wheat for white loaves made from bolted flour. This “denaturing” of wheat became one of Graham’s main targets. Echoing the Protestant marriage ceremony, he thundered, “Let no man put asunder what God has joined together.” His followers set up the nation’s first health food store to sell unbolted “Graham flour.” (It was some decades later that commercially produced graham crackers appeared.)
This kind of appeal to science, nature, and God attracted many of the people swept up in other crusades of that great era of reform, such as the antislavery, temperance, and women’s suffrage movements. Prominent writers and philosophers such as Henry David Thoreau tried vegetarianism and ate Graham flour. Fourierist phalansteries and other utopian communities adopted many of Graham’s dietary ideas. The fiery preacher Charles Finney, who sparked the greatest Protestant revivalist crusade ever witnessed by the nation, was a Grahamite. Even Joseph Smith, polygamous founder of the Mormon Church, tried Grahamite vegetarianism.
Yet Smith and most of the other reformers eventually abandoned vegetarianism and returned to the pleasures of flesh-eating. The expansion of the railroad and cattle-raising into the West had encouraged hearty beef-eating, allowing the fulfillment of many an Anglo-Saxon dream. “We are essentially a hungry beef-eating people, who live by eating,” a proud newspaper proclaimed in mid-century.
A Second Wave of Food Reforms
As the railroad networks expanded and domestic agriculture diversified, the tentacles of a tropical agricultural empire began to stretch out at home and then abroad. As a result, the plates of the wealthy, in particular, overflowed with an enormous bounty. Millions of immigrants from impoverished parts of Europe flowed in, providing a labor force for the railroads and other burgeoning industries. They also provided the servants to help prepare and serve the bountiful upper- and upper-middle-class meals. Often the upper-class versions of these meals, presided over by French chefs, were every bit as lavish as those of their Belle Époque counterparts in Paris.
By the 1890s, however, the seeds of a middle-class reaction against this kind of excess were finding fertile ground in the kinds of concerns that had given rise to food reform in the 1830s. Like the first wave of food reform, the turn-of-the-century one was grounded in new scientific ideas that were thought to advance both the health and the moral state of the entire nation. The first objective of the reformers was not the obvious one—the upper-class men who waddled about with enormous bellies, constantly complaining of dyspepsia, constipation, and other digestive ills. Nor, initially, was there much concern over the middle class and better-off farmers, who ate enormous quantities of heavy foods, particularly fried meats and baked pies. Rather, the food reformers first tried to change the diets of the working class, which was largely composed of immigrants who had come to America hoping to partake in the fabled riches of its tables.
The discovery that foods’ energy could be measured in calories and that they were composed of proteins, carbohydrates, and fats, each of which seemed to have a unique physiological function, had revolutionized scientific thinking about food. A loose group of American nutritional scientists, social reformers, and home economists were struck by the apparent relevance of these new ideas—the so-called New Nutrition—to the horrific social problems created by industrialization and urbanization in America: ragged workers’ families living in overcrowded, underheated, and unsanitary housing, drunkenness, child labor, and prostitution. They thought that if the working class could be taught that the proteins in beans, for example, were just as nutritious as those in beefsteak, they would be able to spend less on food and have more to spend on shelter, heating fuel, and clothing. Living standards would thus rise and they would turn a deaf ear to the radicals fostering anarchism, socialism, trade unionism, and other disruptive principles.
Unfortunately for the reformers, these immigrant workers, such as the millions from Italy’s Mezzogiorno, had come to America hoping to eat beef, not beans. The reformers were generally ignored and derided by those they wished to help. (This was probably all to the good. Scientists, ignorant of the existence of vitamins, declared most fresh fruits and vegetables to be mainly water, so the New Nutritionists advised workers that they were wasteful extravagances. They were particularly frustrated by the apparent profligacy of Italo-Americans on this score.) However, by the early 1900s the New Nutritionists were beginning to have an impact on the middle class, who were again becoming responsive to calls for dietary change. This was due in part to the efforts of a number of other critics of the American diet, including a direct heir of Graham’s ideas, Dr. John Harvey Kellogg, co-inventor (with his brother, William) of corn flakes and director of the famous “sanitarium” at Battle Creek, Michigan. This vegetarian health resort had been founded by Seventh-Day Adventists, a Protestant sect that had made many of Graham’s food ideas part of its religious credo. The genial Kellogg, whose medical credentials were unimpressive and scientific ones nonexistent, had managed to turn the virtually moribund institution into a trendy health spa catering to a clientele of non-Adventists who were convinced that its cures reflected the cutting edge of nutritional science.
Many of Kellogg’s ideas were simply Graham’s dressed up in modern garb. At their heart lay the same warnings against eating those foods which overstimulated the nervous system—particularly meat, spices, alcohol—and succumbing to the temptations of masturbation. But he was also fixated on the terminus of the alimentary canal and was particularly enthusiastic about new theories concerning the dangers of “auto-intoxication,” which blamed many illnesses on eating foods that encouraged bacteria to proliferate in the colon. By artfully blending new scientific theories such as this with older calls for moral uplift and an array of special vegetarian diets—high blood pressure, for example, was treated by putting patients on a diet of nothing but 9 to 13 pounds (4 to 6 kilos) of grapes per day—Kellogg managed to attract an impressive list of rich and famous people to the booming “San” as well as a lot of media attention.
One of the key people helping Kellogg straddle the gap between quackery and mainstream nutritional science was Horace Fletcher, a wealthy American businessman who had retired to an impressive nineteenth-century palazzo on Venice’s Grand Canal. Like most faddists, Fletcher claimed to have discovered a miracle cure after indulgence in a rich diet had placed him at death’s door. In his case, a drastic reduction in food intake and “thorough mastication” of his food had done the trick. Never one for half measures, Fletcher took the latter idea to an unheard-of extreme, advocating that each mouthful of food be chewed at least one hundred times. Prominent nutritional scientists, hoping that he might subsidize their research, pretended to subscribe to his theory that food was actually ingested by an unknown mechanism at the back of the mouth. However, they were able to take him more seriously in two regards. First, the fact that his feces were tiny and odorless did seem to demonstrate that “Fletcherizing” prevented “auto-intoxication.” (Indeed, to make this point, he often sent samples to leading scientists through the mail.) Ultimately more important, however, were his demonstrations that current estimates of human protein needs were much too high. People should eat only when hungry and then eat only enough to sate the hunger, he said.
The fifty-three-year-old Fletcher became famous for performing physical tasks beyond the capabilities of most twenty-one-year-old athletes on one-half to two-thirds of their protein intake. He advised the chief of staff of the U.S. Army on reducing the army ration and made prominent converts such as the novelist Henry James and his philosopher brother William. While most of them ultimately found “Fletcherizing” simply too tedious to maintain, his admonitions to eat less found increasing support among nutritional scientists and home economists, many of whom would shortly begin calling themselves dietitians. The downward revisions also found a receptive audience in the upper middle class, who were finding it impossible to imitate the culinary and entertaining styles of the class above them. The growing shortage of competent domestic servants made it particularly difficult to entertain in the sophisticated French manner. It was even difficult to fulfill expectations for family meals, which by the late nineteenth century had been invested with much of the somber moral significance of Protestant church services. The transient immigrant girls upon whom most middle-class women relied for help in the kitchen were of little use in preparing and serving these formal dinners. The women were therefore receptive to the reformers, who told them they should abjure the upper classes’ fancy seasonings, exotic ingredients, complex preparations, groaning tables, and multicourse dinners. That these calls for restraining the pleasures of the flesh also struck a familiar Protestant chord did their prospects no harm.
These self-denying ideas received a big push during World War I, when the government organized a massive effort, led by Herbert Hoover, to persuade Americans to voluntarily cut back on consumption of certain foodstuffs—beef and wheat, in particular—so that they could be shipped to the U.S. and Allied troops in Europe. A major propaganda campaign was mounted to teach the New Nutritionists’ rules of substitution and persuade Americans that eating less would not harm their health and might even improve it. Again, the working class remained generally unmoved by this information. They used fattened wartime paychecks to buy more beef. However, the campaign did have an impact on the middle class. Indeed, for many years thereafter they recalled how their diets had been “Hooverized.”
“Vitamania,” 1917–50
Even before Hoover had assumed direction of wartime food conservation in early 1917, the first vitamins had been discovered. Too little was known of them to have an impact then, but by the time he was elected to the presidency in 1928, the new nutritional paradigm revolving around vitamins—which its exponents called the Newer Nutrition—had come to the fore. Although it ultimately helped transform diets throughout the world, this new vitamin-centered way of thinking made itself felt particularly early in the United States. In part this was because the prewar and wartime food reformers had already primed middle- and upper-class Americans to accept the primacy of health concerns over gastronomic ones in making their food choices. Indeed, hardly anyone seemed to notice when the national law prohibiting the manufacture and sale of alcoholic beverages from 1920 to 1933 virtually destroyed what remained of haute cuisine in America by forcing most fine restaurants, whose survival depended on profits from alcoholic beverages, out of business.
“Vitamania” was also fostered by industrial developments. By the mid-1920s, food production in the United States was being transformed into a series of highly organized industries dependent on large capital investments, mechanized production, sophisticated distribution networks, and—crucial from the point of view of dietary ideas—large expenditures for promotion and advertising. Vitamins, which are invisible, weightless, and tasteless, proved to be a food advertiser’s dream. Citrus fruit growers, grape juice producers, flour millers, pickle producers—almost anyone could and did make extravagant claims. Fleischmann’s Yeast Company, owned by Standard Brands (one of the two enormous new food conglomerates created by Wall Street–financed mergers in the 1920s), spent enormous amounts spreading the message that eating four of its slimy yeast cakes a day would provide enough vitamin B to “rid the body of poisonous wastes,” raise energy levels, and cure indigestion, constipation, acne, pimples, “fallen stomach,” and “underfed blood.” The dairy industries were among the most successful at spreading vitamin consciousness. In the late 1920s, a process was invented for irradiating canned milk to increase its vitamin D content. Its use was soon expanded to butter and fresh milk. The powerful milk producers’ cooperatives and giant dairy companies, guarding the process from margarine producers, were then able to change the image of milk from that of a children’s food into “the perfect food” for adults as well, one which contained virtually every nutrient necessary for good health. During World War I, coffee was the army’s favorite beverage; during World War II, fresh milk was the G.I.’s overwhelming choice.
Vitamania also helped revive Graham’s idea that white flour was “denutrified.” Critics charged that milling techniques introduced by large millers in the late nineteenth century, which made white flour much cheaper and even whiter than in Graham’s time, were linked to national deficiencies in important vitamins. In 1940 and 1941, as America hastily rearmed in anticipation of being drawn into war, alarm spread over apparent deficiencies in vitamin B1 (thiamine) in the national diet. In 1939 and 1940 physicians at the famed Mayo Clinic in Minnesota had put some teenagers on a diet very low in this vitamin and concluded that it made them surly and uncooperative. It was soon labeled “the morale vitamin,” and warnings rang out that a nation whose staple was now white bread, from which most of this vitamin had been removed, would be weakened and vulnerable in the face of an enemy invasion. The government thus had millers restore thiamine to white flour and add two other nutrients, iron and riboflavin, for which deficiencies were thought to be widespread as well. The general public, however, seemed persuaded that taking vitamins would give them “pep” and energy, a notion mainly inspired by the fortuitous fact that vita means “life” and connotes vigor, verve, and vitality.
The organized medical profession was hardly enthusiastic about all of this, for its members saw in vitamania a recurrence of the age-old threat that people would seek nonmedical help when stricken with illness. This danger to their monopoly on healing loomed particularly large in the late 1930s, by which time most of the vitamins had been synthesized and were being produced commercially, in the form of pills or tonics, and sold without prescriptions. Their recurring efforts to restrict vitamin sales were supported by many food producers. Unable to compete with the pharmaceutical giants selling vitamins in more convenient form, they dropped their vitamin-centered campaigns to focus on their products’ other attributes, such as the “quick energy” that came from their sugar content. Most joined with the American Medical Association in promoting the doctrine that became the official government line until the 1970s: that the American food supply was unsurpassed in nutritional quality. The public was repeatedly assured that vitamin supplements were unnecessary, since simply eating a “balanced” diet supplied more than enough nutrients to ensure good health.
The Era of the Baby-Boomers
Although a number of munitions manufacturers claimed that giving vitamin pills to workers raised their productivity, the entry of the United States into the war diverted attention away from vitamania and other health concerns. Food shortages and rationing became the national obsession, and for good reason: restricting how much of their favorite foods Americans could buy flew directly in the face of their perception of their country as the land of abundance. Although the ration was quite princely, even compared to the peacetime standards of other belligerents, Americans had difficulty accepting the idea that the shortages they faced were real. There were recurring rumors that supplies of food—particularly beef—were more than adequate but that foodstuffs were being wasted, destroyed, or otherwise kept off the market by an incompetent government, crooked middlemen, or various other conspiracies. The wartime resentment persisted into the postwar years, when the supply and price of beef became, at one stage, the most important issue in American domestic politics.
After 1948, when a semblance of order was restored to the markets and the feeling of being at the receiving end of the cornucopia returned, concerns over the healthfulness of the national diet remained in the background. The era of the baby boom, from 1946 to about 1963, was a time for family-building, when Americans set up millions of new households and were concerned primarily with managing them. In food, this meant that concerns of health or gastronomy took a back seat to “convenience,” or what food processors like to call “built-in service.” Producers and processors developed a host of new methods for growing, raising, preserving, precooking, and packaging foods. From 1949 to 1959, chemists alone came up with over four hundred new additives to help food survive the new processes. Any concerns over the effects of these methods on the nutritional quality of the foods were largely swept away by pride in American inventiveness. Government-sponsored exhibitions designed to impress foreigners with the achievements of American capitalism gave processed foods, kitchen appliances, and supermarket shopping center stage. In Moscow in 1959, Vice-President Richard Nixon waited until he was in the gleaming, fully equipped kitchen of one of those exhibits before engaging Chairman Nikita Khrushchev in their famous debate over the relative merits of the two systems. When Khrushchev visited the United States, he was taken through a supermarket.
Not everyone was impressed. A French woman, contemplating the monstrous turkey in the American display of a bountiful Thanksgiving dinner at the 1957 Dijon Food Fair, remarked with some disgust, “Who has an oven big enough to cook something like that?” Another wondered why the Americans had an exhibit at a gastronomic fair at all: everyone knew that everything they ate came from cans. Even in the United States it was sometimes said that “convenience” was taking a toll on taste. Yet gastronomical considerations had long since been relegated to the backseat when it came to food choices. Moreover, food tastes had virtually ceased to be an important marker of social distinction. Everyone seemed equally partial to beefsteak and potatoes, fried chicken, hamburgers grilled on the barbecue, and casseroles made with canned fish or meat, canned creamed soup, and topped with potato chips or crumbled crackers. Brightly colored Jell-O molds—canned fruits, vegetables, meats, or fish congealed in artificially colored and artificially flavored gelatin—graced the tables of all classes. There was hardly a dissenting voice as government, educators, the media, and the food industries regularly assured Americans that they were “The Best-Fed People on Earth.” Indeed, after making his first visit to Europe in 1948, where he dined in Italy, France, and England, the nation’s most prominent restaurant critic, Duncan Hines, assured Americans that their cuisine remained the world’s finest. Only the English, whose roast beef he particularly admired, approached American standards.
A Time for Self-Criticism
This mood of self-satisfaction did not survive the late 1960s, when all aspects of American society were subjected to intense criticism. The public was suddenly made aware that millions of people were living in poverty. The fact that they had trouble paying for adequate diets seemed particularly scandalous in a nation producing an overabundance of farm products, storing mountains of surplus foods, and paying its farmers not to produce food. As in the depression of the 1930s, when socialists raised the cry that there were “bread lines knee deep in wheat,” the charge that there was hunger in the land of abundance was a politically potent one. Programs to provide the poor first with the surpluses and then with food stamps had little trouble gaining public support.
But the issues that affected the middle classes the most had to do with their own health, not that of the poor. Indeed, even in the late 1950s the U.S. Congress responded to concerns that some food additives might be carcinogenic. Processors were allowed to continue using 704 of the chemicals then in use but were required to obtain government approval for any new ones by proving they did not cause cancer in mice. During the next decade, there was increasing alarm over the effects of the pesticides and chemical fertilizers that were now used extensively in farming and the antibiotics and other chemicals employed in meat production. Prewar concerns over the nutrient-depleting effects of processing arose again, stimulated in part by botched government attempts to stem another wave of vitamania by placing restrictions on vitamin pill sales. By the end of the decade, large corporations were being regularly denounced for being part of the “military-industrial complex” that had mired the nation in the Vietnam War. These critical eyes inevitably turned to the big businesses that now dominated practically every aspect of food production, processing, and marketing.
The best-known result of the new critical spirit was the vogue for “organic” and unprocessed “natural” foods. In a certain sense, this was a revival of the kind of mixture of health, morality, and romanticism that prevailed in the 1830s and 1840s. The drug-besotted hippies of the counterculture, with whom the vogue was first associated, were hardly aware of or interested in these precedents, but this kind of thinking also struck a sympathetic chord among the more conventional upper and middle classes. The health concerns were stoked by consumer activists such as Ralph Nader, an ascetic lawyer who turned his attention from the safety of automobiles to that of food. Young veterans of the “New Left,” whose campaigns against racism and imperialism foundered in the early 1970s, redirected their moral critique of capitalism toward its effects on food and the environment. The giant corporations, it was alleged, used their immense advertising resources to brainwash Americans into eating their overprocessed, unhealthy, and environmentally hazardous products.
The food producers were thrown on the defensive, but not for long. Within months they were reformulating their products and emblazoning their packages with words such as “Natural,” “Nature’s Own,” “Fresh,” “Farm,” and “Mountain Valley.” Pet owners could even buy “natural” dog foods. Although the changes were often merely cosmetic and failed to satisfy most critics, they did help preserve sales. A much greater threat to producers’ and processors’ profits was posed by the disturbing turn that nutritional science seemed to be taking—away from the Newer Nutrition and toward what could be called the Negative Nutrition.
Negative Nutrition
After World War II, the government had embodied the lessons of the Newer Nutrition in charts portraying four different food groups—fruits and vegetables, dairy products, meat and fish, and grains—and telling Americans that each day they should eat some foods from each of these groups. Food producers supported this enthusiastically. Since every one of their products had a place in at least one of the groups, they could all be called essential for good nutrition. They helped make the “Basic Four” the core of nutrition education in the schools and the media.
The Negative Nutrition diverged markedly from this emphasis on eating enough nutrients. Instead, it warned against eating certain kinds of foods. A prime target, of course, was cholesterol, which, since the late 1950s, had been coming under increasing suspicion as a cause of heart disease and other deadly ailments. Sugar also became a favorite target. The white crystals were portrayed as an addictive drug, used by food processors to “hook” children on their products, and a leading cause of cancer, heart disease, diabetes, skin ailments, schizophrenia, hyperactivity, and assorted other physical and mental disorders. Clearly, the Negative Nutrition also echoed many Graham themes of the past—particularly when in the hands of moralists such as the sucrophobes, whose tales often echoed Protestant parables about those who have “fallen from grace.” Graham and his followers would certainly have felt comfortable with the attacks on beef, the condemnations of the effects of industrialization on food, and the charges that greedy forces were foisting an unhealthy diet on the people. Nor would Fletcher, Kellogg, and their fellow advocates of restraint have felt alien in this atmosphere in which people such as the anthropologist Margaret Mead could blame “overnutrition” for the nation’s ill health. Had they been aware of modern epidemiological evidence against it, they would almost certainly have supported that other important aspect of the Negative Nutrition—the attack on obesity. After all, their reduced estimates of protein requirements were important steps in what became a century-long buildup of claims that obesity caused a number of death-dealing ailments.
As in the previous eras, the food reformers were supported by important segments of the scientific community. Funding for health science research had been soaring since the late 1950s, and in the late 1960s increasing sums were allocated to studying the links between diet and health. Whereas the government, whose food research efforts were responsive mainly to agricultural interests, was slow to venture into the field, huge new nonprofit charities such as the American Heart Association and the American Cancer Society had no such compunctions. Not only did they subsidize research into the deleterious effects of various foods on particular ailments, they also took the lead in disseminating any adverse results. They began spending millions to warn the public of the supposedly terrible consequences of eating foods containing too much sodium, sugar, and animal fats as well as the perils of being overweight.
With such impressive backing, it is no wonder that the Negative Nutrition struck a responsive chord among the middle class and that within four years of being endorsed by the U.S. Senate in 1977 it became the core of national nutrition policy. In 1981 even the U.S. Department of Agriculture, which had fought a rearguard action against it on behalf of the powerful beef and dairy producers, endorsed its general goals. Of course, by then many of the processors who had initially been most threatened by the new paradigm had themselves adeptly picked up its banners. A wave of “low-fat,” “low-calorie,” “no-fat,” “cholesterol-free,” and “sodium-free” versions of their products now engulfed supermarkets. When the Reagan administration freed them in the early 1980s from previous restrictions on advertising health claims, the misleading labels and advertisements for their “heartwise” and “healthwise” products became the public’s prime sources of information about the Negative Nutrition. Indeed, so great did the cacophony of half-truths and misinformation become that the succeeding Bush and Clinton administrations were forced to restrict it. However, new government guidelines, issued in the mid-1990s, seemed to provide additional scope for distortion. For example, the dairy and meat lobbies managed to twist the government’s advice that no more than 30 percent of calories be derived from fats into the notion that 30 percent was the lower, rather than the upper, limit for fat intake.
The end result reflected many of the continuing paradoxes of American abundance. The middle and upper classes seemed swept up by lipophobia and took to dieting and exercising in an almost maniacal manner. Many of their daughters were overcome by eating disorders. Yet there is no indication of any decline in the average weight of Americans over the past twenty or thirty years. Indeed, quite the reverse seems to be the case. Although they have cut back on full-fat dairy products and their beloved beef, consumption of other fats, particularly in the form of crisp snacks (often eaten to banish the hunger pangs caused by dieting) seems to have more than counterbalanced this. The same goes for sugar, sodium, and other objects of Negative Nutrition scorn. To complicate matters further, there was a parallel, often contradictory, trend toward self-indulgence. The foreign travel boom of the 1960s and 1970s helped elevate gastronomic standards by developing a more adventurous and discerning clientele for diverse kinds of cooking. It also helped to once again make food tastes a sign of social distinction, just as they had been in the late nineteenth century. The narcissism of the middle-class “Me Generation” of the 1970s, which sought “self-fulfillment” in all things, including the pleasures of the flesh, provided a boost to this, as did the yuppie phenomenon of the 1980s, which defined people in terms of their consumption habits. Beneath it all, though, the Protestant strain of moralism persisted, along with its concomitant—guilt. In 1971 the magazine Psychology Today anticipated the next twenty-five years well when it noted that “food has replaced sex as an object of guilt.” (Although Sylvester Graham would have preferred that food join sex in this matter, he would not have been displeased.)
But the targets of the guilt seemed to be constantly moving, multilayered ones. The Negative Nutrition paradigm had not really displaced the two previous ones—the New and Newer Nutritions; it had simply been superimposed on them. Americans were simultaneously trying to eat more of the foods that were supposed to cure illness, less of those deemed to cause it, more that promoted general good health, and less of practically everything that contributed to weight gain. To make matters worse, they were repeatedly fed new and often contradictory proclamations regarding what these were. Some observers began wondering how long this frantic search for ways to select from the abundance of available food choices could last. They suggested that the result might be a state of “gastroanomie” in which people have lost confidence in all the experts and all the paradigms. But this seems unlikely in a culture that appears doomed to celebrate its food abundance while simultaneously avoiding enjoying it too much.