SOMETIME DURING the late 1980s—no one can pinpoint the exact date—Ron Magruder, the president of the thriving Olive Garden chain of Italian restaurants, received a telephone call from a dissatisfied customer. The call had been patched all the way up to Magruder because it was so ... different. The caller, named Larry, wasn't complaining about the food or the service or the prices. Instead, Larry was upset that he could no longer fit into any of the chairs in his local Olive Garden.
"I had to wait more than an hour and half to get a table," Larry told Magruder. "But then I found that there wasn't a single booth or chair where I could sit comfortably."
Magruder, a heavyset man easily moved to enthusiasm, was sympathetic to Larry's plaint. And as president, he could do something about it. He had his staff contact the company that manufactured the chairs for the chain and order a thousand large-size chairs. He then had these distributed, three each, to every Olive Garden restaurant in the nation. It was, as Magruder later told the eminent restaurant business journalist Charles Bernstein, a perfect example of his management philosophy: "We're going to go the extra mile for any customer, no matter what the situation."
Tales like these are the warp and woof of contemporary American management culture, limning as they do the ageless high wisdom that the customer is always right. But the essentials of Larry's tale—the easing of painful, if traditional, boundaries like a restaurant chair, and the acceptance of excess—also go to the core of the popular culture that gave birth to the modern American obesity epidemic. Indeed, if fast-food companies of the 1980s seemed to see the American eater as an endlessly expanding vessel for their product, Americans of the same period rejected the entire notion of limits themselves. They seemed to believe that the old wisdom could be inverted: Gain could come without pain. In 1980 even the hidebound U.S. Department of Agriculture began promoting its new diet guidelines as The Hassle-Free Food Guide.
Nowhere did this new boundary-free culture of American food consumption thrive better than in the traditional American family, which by the '80s was undergoing rapid change. The catalyst came in two forms: individual freedom (born of the liberation movements of the '60s and '70s) and entrepreneurial adventurism (born of the economic tumult of the late '70s and early '80s). Women, freed from the stereotypical roles and duties of the '50s housewife, now made up a substantial percentage of the workforce. Taking their rightful place alongside their male counterparts in every profession from law to medicine to construction to engineering, they set forth to transform the American corporation and helped fuel a long overdue renaissance in management culture. Men, freed from the traditional notions of being the family's sole breadwinner and disinclined to give any one employer too much loyalty, went in search of professional and personal fulfillment. Garages burst with strange new contraptions called PCs, and Mom soon joined Pop in founding strange and almost magical new businesses. Freedom was good—and profitable.
The familial price of this freedom was told in time—mainly the lack of it when it came to the kids. And when it came to eating together, that time became ever dearer. The mom of a generation previous had had the time to cook a complete meal, insist that everyone show up to eat it, and then wrestle each child's food issues into an acceptable family standard. The new parent had no time for such unpleasantness. After all, what was more important: to enjoy one's limited time with one's children, dining out at McD's, or to use that time to replicate the parent's own less than idealized childhood table? Most parents were pragmatists. It was easier and more practical simply to eat out—or to order in.
The numbers show that that is exactly what the American family did. In 1970 what the USDA calls the "food away from home" portion of the average American's food dollar was 25 percent; by 1985 it had jumped to 35 percent and by 1996 Americans were spending more than 40 percent of every food dollar on meals obtained away from home. The trend was clear and unambiguous. In 1977 the proportion of meals consumed away from home was 16 percent; by 1987 that figure rose to 24 percent; by 1995 to 29 percent. Snacking too moved out of the home and into the streets, with 17 percent of snacks being consumed away from home in 1977, 20 percent in 1987, and 22 percent in 1995.
Calorically speaking, the shift was even less ambiguous. In 1977, Americans got only 18 percent of their calories away from home; in a decade that figure had grown to 27 percent, and in less than another decade (by 1994) to 34 percent. Fat consumption away from the traditional table soared, from 19 percent of total calories in 1977 to 28 percent in 1987 to 38 percent in 1995. Where fast-food places accounted for just 3 percent of total caloric intake in 1977, that share rose to 12 percent two decades later.
And thanks to the revolution in food processing, commodity prices, and fast-food marketing, what was in that food also changed rapidly. Here the Butzian revolution had fused with the triumph of the value meal and new-style sugar and fat technologies. Yummy sizzling meat—it was everywhere! Coca-Cola—it was almost free! In this regard the single most telling statistic came from the USDA. "We calculate that if food away from home had the same average nutritional densities as food at home ... Americans would have consumed 197 fewer calories per day." Put another way, that's an extra pound's worth of energy every twenty days.
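A rough back-of-the-envelope check, assuming the conventional (and itself approximate) rule of thumb of about 3,500 excess kilocalories per pound of stored body fat, a figure the USDA statement does not itself supply:

\[
197\ \tfrac{\text{kcal}}{\text{day}} \times 20\ \text{days} \approx 3{,}900\ \text{kcal} \;>\; 3{,}500\ \text{kcal} \approx 1\ \text{pound of body fat}
\]

By that yardstick, "a pound every twenty days" is, if anything, a conservative rounding; at 197 extra calories a day, the 3,500-kilocalorie mark arrives closer to every eighteen days.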
That food on the run was getting more caloric was a reflection of another, less understood phenomenon, that of "nutrient control." Nutrient control means simply that—the degree to which one exercises some control over what goes into one's food. Fast food and convenience food by their very nature preclude such control; to put it the way a French intellectual might, a Big Mac is a caloric fait accompli. So is a Swanson's TV dinner or any boil-in-the-bag fettuccine Alfredo. To be convenient—to be stable and have a long shelf life, or to retain good "mouthfeel" after an hour under the fast-food heat lamp—food had to contain larger and more condensed amounts of fats and sugars. Such was one source of those extra 197 calories.
But Americans of the 1980s kept eating more for another reason as well. Increasingly, as the away-from-home numbers show, they ate in a kind of gastronomic time warp, justifying their larger portions because they were "eating out" or because it was "a treat." But now the treat had become a daily treat. Eating out was just, well, eating. As three of the USDA's more pointed scholars put it: "Where that may have been a reasonable attitude twenty years ago, when eating out was more infrequent, [today] that belief becomes increasingly inappropriate." Americans had ceded "nutrient control"—and self-control.
Of course, ceding control—avoiding hassles and conflicts with one's children—was the whole point, wasn't it?
Such was the overwhelming message of a wide range of 1980s child-care books, most of which centered on the important but ultimately squishy notions of "autonomy" and "empowerment." Both notions derived from a reaction to the conformist society of the previous generation—the same society that had stereotyped and oppressed woman and made man into little more than a "productive unit." Such books inevitably emphasized the overriding importance of a child's personal choices as a way to instill self-confidence and responsibility. Unfortunately, when it came to food, their authors tended to view the child as a kind of infant-sage, his nutritional whims a "natural" guide to how parents should feed him.
One of the more wide-ranging of these books—one that eventually sold more than 3 million copies and made its authors virtual nutritionist stars—was Fit for Life. Published in 1985 and written by Harvey and Marilyn Diamond, two holistic nutritionists from California, Fit was originally pitched as a dietary guidebook ("You can eat more kinds of food than you ever ate without counting calories!"). But in the ever conflict-avoiding 1980s, Fit for Life eventually became a kind of all-purpose advice book for regaining one's "vital principles." On the subject of children and nutrition, its authors were insistent: Food should never become a dinner-table battleground. "Pressure causes tension," the Diamonds wrote. "Where food is concerned, tension is always to be avoided." Here the operative notion—largely unproven—is that a child restrained from overeating will either rebel by secretly gorging when away from the table or, worse, will suffer such a loss of self-esteem that a lifetime of disastrous eating behaviors will ensue.
The authors of 1985's Are You Hungry? A Completely New Approach to Raising Children Free of Food and Weight Problems took the sentiment to its next logical step. With the intent of helping children develop their own sense of self-control, New School for Social Research authors Jane R. Hirschmann and Lela Zaphiropoulos put forth three basic guidelines to parents: "First, they [children] should eat when they are physically hungry and only when they are hungry. Second, they themselves should have the responsibility for determining the foods they eat. And finally they should stop eating when they feel full."
Reading deeper, however, there was also another issue: the comfort of the parent. "To questions like 'Why can't I eat my dessert first?' or 'Why can't I eat all my Halloween candy?' you can answer 'No reason at all. You can.' And this answer doesn't lead to ill health or loss of family discipline," the pair promised. In fact "good parenting requires this answer because it leads to 'self-demand' feeding ... Life can be much easier with self-demand feeding because it allows you to give up unnecessary control and the concomitant struggles over food."
It would be tempting to lay the entire blame for such intellectual indulgence at the feet of the ever demonized politically correct, but there it does not belong, or at least not entirely, for the fact is that, despite our "spare the rod, spoil the child" big talk, Americans have been historically predisposed in exactly the opposite direction, particularly in matters concerning children and food. Part of this derives from the very nature of the American family. As the sociologist Edward Shorter has noted, in contrast to its European counterpart, the American family was "born modern." From early on it was nuclear, seeking as it did to withdraw itself from the meddling of the traditional extended family. At its center was not a child in the European tradition—essentially just one more actor in an extended community—but rather a child as the very reason for being, for feeling and acting independently. As a result, the American child commanded disproportionate "respect"—he wasn't to be hurried too quickly into the pain of adulthood. Rather, he was to be mollified with the tremendous bounty of the new nation. And the nation's greatest bounty was food, glorious food.
That is, more food. For well into the postwar years, when true undernutrition among the middle class became a rarity, undernutrition remained the central concern of most parents. This is not to say that Americans have never attempted to deal with fat children; the pages of turn-of-the-century newspapers were filled with advertisements promising to help one's "chunky" offspring "slim down." But the thrust of those efforts—from early-twentieth-century medicaments to twenty-first-century fat camps—was almost always social and aesthetic: Plump Janey was being alienated at school. Fat Joey was being harassed by the slimmer boys. Nowhere in those efforts was childhood overeating paired solely with health concerns. Always in the background were the taunts and the teases. The notion that overfeeding might be overridingly a health problem and a health problem alone never entered the American psyche.
A counterpoint to the culture of the overfed American could be found in late-nineteenth-century France, which like the United States of the period was undergoing rising rates of urbanization and declining rates of childhood mortality. The French family too was coming to a new understanding of the child. With wet-nursing on the decline, the French mother was increasingly in charge of her own little enfant cher. There was thus more natural sentiment toward the occupant of the cradle. This was a new development, for a new reason. Only a hundred years or so previous there would have been a good chance that little Mathilde, away from her mother's breast, would not make it past her first birthday; true maternal attachments could wait until she was five or six. By the late nineteenth century, however, with better medical practices and pasteurized milk widely available, the chances were good that not only would she make it out of the cradle but that she would be a part of Mama's life for the rest of her years. Chère Mathilde became chère chère Mathilde.
One of the first unanticipated products of this new generation of more indulgent French mothers was l'enfant obèse. By the 1930s French medical journals were full of case histories of fat children. But unlike his or her American counterpart, the French fat child was not considered to be so socially vulnerable. Rather, his or her condition was to be dealt with—directly and forthrightly—as a medical issue. Fortunately, there was already a public health network through which to treat the problem. This was known as the puericulture system.
Puericulture had begun as an informal system of health education in the late nineteenth century, principally to teach new mothers how to prevent and treat tuberculosis, then on the rise. By the early twentieth century, puericulture was adapted to teach better mothering techniques to a new generation of mothers. When the first results of parental overindulgence showed up in the form of two-hundred-pound teenagers (as they did), the advocates of puericulture retooled again. Their prescription: Adults had to take control of a child's diet. Period. If they did not, the child would certainly become a sickling.
Soon, French mothers were being taught a new puericulture dogma. The essentials were these: Plump children were not necessarily a point of pride; mealtimes should be as nearly set in stone as possible; snacks, except on rare occasions, were to be forbidden; second helpings were out of the question, save, perhaps, on a holiday; children should eat separately from adults, so as "to avoid arousing his desires" with richer adult fare. And the child was never to be left to his or her own personal choice. Augusta Moll-Weiss, the mother of puericulture and the founder of the influential Paris School for Mothers, put it thus: "It is unimportant how much freedom is left in this choice; the essential thing is that the quality and quantity of the diet correspond to the exertion of the young human being." Lastly, all meals should be supervised by an adult. "The basic message was surprisingly persistent," writes the cultural historian Peter N. Stearns, the principal American chronicler of puericulture. "Too much food was bad. Children must learn to discipline their appetites and eating habits, sitting for meals regularly, chewing carefully, expecting adult supervision."
For the French, struggle and tension at the table were simply part of the process of setting reasonable boundaries for children.
About this the Diamonds and the Hirschmanns and their many present-day imitators have had nothing to say. Yet this very lack of pragmatic boundary-setting may well be wreaking nutritional havoc on children.
Consider perhaps the central dogma in the child-as-food-sage theology—that a child "knows" when he or she is full. Such is the belief, repeated emphatically to this day, of many of the nation's leading nutritional authorities, both academic and popular. This despite new research showing that children, just like adults, increasingly do not know when they are full. In a recent study by the Penn State nutrition scholar Barbara Rolls, researchers examined the eating habits of two groups of children, one of three-year-olds, another of five-year-olds. Both groups reported equal levels of energy expenditure and hunger. The children were then presented with a series of plates of macaroni and cheese. The first plate was a normal serving built around age-appropriate baseline nutritional needs; the second plate was slightly larger; the third was what we might now call "supersized." The results were both revealing and worrisome. The younger children consistently ate the same baseline amount, leaving more and more food on the plate as the servings grew in size. The five-year-olds acted as if they were from another planet, devouring whatever was put on their plates. Something had happened. As was the case with their adult counterparts in another of Rolls's studies (cited in chapter 2), the mere presence of larger portions had induced increased eating. Far from trusting their own (proverbial and literal) guts, children, the author concluded, should instead get "clear information on appropriate portion sizes."
Theorizing aside, the continuing disinclination to restrain a child's eating flies in the face of overwhelming evidence that, of all age groups, children seem to be the ones who respond best to clear dietary advice. In four randomized studies of obese six- to twelve-year-olds, those offered frequent, simple behavioral advice—in other words those who were lovingly "hassled"—were substantially less overweight ten years later than those who did not get the advice. And thirty of those children were no longer obese at all.
The case for early intervention has been further buttressed by new studies on another age-old medical injunction: Never put a child on a diet. For decades, the concern was that such undernutrition could lead to stunted growth. But the authors of a study of 1062 children under age three have concluded differently. Writing in the journal Pediatrics, they state that "a supervised, low-saturated-fat and low-cholesterol diet has no influence on growth during the first three years of life." And overweight children who were put on such a diet ended up with better, more moderate eating habits, to boot.
In other words, it's good to tell Johnny when enough is enough.
Another way to find out where food intake minus mitigation leads is simply to look at the food world that children were "allowed" to create, a world that can be summarized by one word: snacking.
In the 1980s, snacking was flat-out encouraged. The first to do so were the decade's ever more economically busy parents, who simply wanted to make sure that their kids ate something. Fair enough. But snacking was also indirectly encouraged by new understandings in nutritional science, which suggested that many people, and particularly children, needed to eat more than three meals a day. Although such insights have a strong basis in fact, their real-world utility was often twisted by the media and food companies. Suddenly it was "unnatural" to eat three times a day. Progressive people ate "when their bodies told them to." Snacking was not only not bad; it was good to eat all day long. Such was the message of the diet craze known as "grazing," a quasiregimen endlessly fawned over and packaged by the mainstream media.
Food companies, of course, were happy to join in the party. There would be "Snack Good," "Snack Healthy," and, by the early 1990s, "SnackWell." And with sugar and fat prices lower than ever, it was easy for new, less bridled players to share the fun and profit. The number and variety of high-calorie snack foods and sweets soared; where all through the 1960s and 1970s the number of yearly new candy and snack products remained stable—at about 250 a year—that number jumped to about 1000 by the mid-1980s and to about 2000 by the late 1980s. The rate of new, high-calorie bakery foods also jumped substantially. A revealing graphic of this trend, charted against the rise in obesity rates, was published by the American Journal of Clinical Nutrition in 1999; the two lines rise in remarkable tandem.
The increased variety in snacks and sweets enabled by the Butzian revolution in agriculture conjured a new and ever fattening pattern of eating. Just as the presence of supersized portions had stimulated Americans to eat more at mealtime, the sheer presence of a large variety of new high-calorie snacks was deeply reshaping the overall habits of the American eater. Studying the eating patterns of adults, and using the most advanced monitoring and tracking systems available, researchers at the USDA Human Nutrition Research Center at Tufts University were able to document an amazing phenomenon: The higher the variety of snack foods present in their subjects' diets, the higher the number of calories from those foods they would consume, and the higher would be the subjects' consequent body fatness. This was stunning. Historically, the drive to eat a variety of food had been a positive element in human evolution, helping early humans to increase and balance fuel intake, and, consequently, improve their metabolic, physical, and mental abilities. The drive for novelty had been healthful. Now the same drive had become unhealthful. "Today," the Tufts researchers noted, "a drive to overeat when variety is plentiful is disadvantageous for weight regulation because dietary variety is greater than ever before and comes primarily from energy-dense commercial foods rather than from the energy-poor but micronutrient-rich vegetables and fruit for which the variety principle originally evolved." In short, variety had become the enemy.
You could see the phenomenon everywhere you went. One of the more insidious of the new snacks appeared in California, where the Snak Club company began selling huge (as much as five portions) but inexpensive ($0.99) bags of unbranded candy. The bags were routinely placed near checkout stands, where a telling ad campaign forthrightly proclaimed that the bag of candy just within Junior's reach was "a meal in itself." Ten years later the label was changed to "a treat in itself."
And snack kids did. In the '80s, in every single age group, between-meal chomping was louder than ever. Moreover, the troubling tendency to snack several times every day—in essence making snacking part of a de facto meal pattern—was perpetuating itself into adolescence and young adulthood. To find out how much so, the pre-eminent nutrition scholar Barry Popkin and his associates at the University of North Carolina at Chapel Hill studied the dietary patterns of 8493 nineteen- to twenty-nine-year-olds over the period 1977–1996. The results showed that not only had snacking prevalence soared, but so had the number of snacks per day and the number of calories per snacking occasion.
The demographics of increased snacking also revealed a new and disturbing trend: The most avid snackers were the poor. In the same period the snacking rate per day among low-income households went from 67 percent to 82 percent. Snacking by whites increased the least while snacking by Hispanics and African Americans increased the most. The greatest increases were in the poor-to-middle-class South. And like meals in fast-food joints, the caloric density of snacks was growing. As Popkin concluded, "This large increase in total energy and energy density of snacks among young adults in the U.S. may be contributing to our obesity epidemic."
Beyond the immediate contribution of more calories to the diet, the very nature of modern snacking may be pushing children toward obesity. New studies show that, far from the romanticized "eat when you feel like it" philosophy, eating more often in itself may make one fat, regardless of the calorie count. In a recent summary paper in the British medical journal Lancet, the scholars Gary Frost and Anne Dornhorst explained: "Not only did hunter-gatherers eat a diet low in fat and derived mainly from slowly absorbed carbohydrates, but also by eating less frequently they spent long periods of the day post-absorptively [fasting]. Today's grazing culture results in a disproportionate amount of time being spent post-prandially, which favors glycogen synthesis and fat deposition."
In other words, a perpetually snacking child—whether he knows best or not—is literally a walking, talking, fat-making machine. One that knows no limits.
If the parents of the early '80s had, in essence, let the calories in, they would soon be aided in doing so by a most unlikely accomplice: the public school system.
Until the mid-'70s, public high schools were still bastions of traditional postwar culture, places where the boundaries, however frayed, still held. In postwar America, a teacher's ability to act under the legal cover of in loco parentis was rarely questioned. Hence, at least on campus, teachers wielded broad cultural influence. This was because a teacher was, for the most part, assumed to be acting in the best interests of the child. The arch of his eyebrow or the pursing of her lip meant something. School was their empire.
A second standard-bearer of campus life concerned food. Nutritionally, the cafeteria of the '70s still reigned as the center of activity for those cool enough to have parents who didn't—or couldn't, or wouldn't—pack a lunch for them. There were Coke machines, but they were few and they dispensed a mere six to eight ounces at a time, and were peripheral to campus life, the places where amateur smokers cadged a quick one between classes.
Such, at least, were the lingering images of public schools held by many '80s parents, who were (sometimes consciously and often not) hoping that the duties they no longer had time for at home might somehow be fulfilled at school.
By the time Me Generation parents began handing their children over to the schools, though, the empire had changed. The broad, boundary-imposing authority of the teacher crumbled under cultural, legal, and economic attack. The old, wide-ranging interpretation of in loco parentis had been eroded by court case after court case. Many of these turned on the issue of free speech—something Me Generation parents held particularly dear. (And perhaps even dearer since many of the high school speech cases involved the "symbolic free speech value" of wearing one's hair long.) Other legal findings limited the ability of teachers to discipline students—corporally or otherwise. The net effect of such schoolroom jurisprudence—and of the constant hectoring and second-guessing from society in general—was to make the teacher hunker down and back off. As Thomas R. McDaniel wrote in his 1983 essay "The Teacher's Ten Commandments," the best thing a truly concerned teacher could do was simple: "Sign up for a course in school law."
The final blow to the old empire came in the form of budgetary cutbacks. Ironically, many of these were supported by—if not originated by—the very same generation that was now hoping for the old system to come through just one more time. Their support for California's Proposition 13 was a case in point.
Fueled by inflation and rising property taxes, the 1978 ballot measure capped property taxes at 1 percent of a property's assessed value and sharply limited future increases. Its principal proponent, a cigar-chomping Orange County businessman named Howard Jarvis, was a longtime anti-tax activist with a penchant for public speaking. As a small businessman and property owner himself, Jarvis easily connected to the growing legions of "Invisible Americans"—the same folk, many of them traditional Democrats, who had grown tired of government inefficiency and overtaxation and who would, two years later, elect Ronald Reagan president. Persuasively Jarvis argued their case: If property taxes weren't capped, the very people who had helped build the Golden State would no longer be able to live in their own modest postwar tract homes. The measure's opponents—they were, in truth, few—took a different tack. Proposition 13, they claimed, would bring an end to the Golden State itself; it would destroy quality education, not to mention the vast network of public services that so many Californians had come to take for granted.
In all of this ran a variant of the generational temper tantrum that Earl Butz had encountered only a few years earlier. The folk wanted what they wanted when they wanted it. Proposition 13 passed in a 2-1 landslide. As did its imitators in twenty other states.
Although budget surpluses initially softened many of the budget cuts feared by the measures' opponents, Proposition 13 and its copycats did lead to many important cuts in the schools. Physical education, for one, was gutted (see chapter 4). There were closings and reduced hours at the many public libraries upon which so many schools depended. Perhaps most important, education was no longer considered the great untouchable in discussions of public spending.
In California, where famously well-funded schools had long enjoyed primus inter pares status, school cafeterias felt the first pinch, and the way they reacted to it foreshadowed how school lunch programs nationwide would deal with similar cuts.
In 1981 the California Department of Education ended its successful Food Service Equipment Program. For decades the program had augmented local school budgets by providing millions of dollars for the maintenance and upgrading of school cafeterias. For the Los Angeles Unified School District (LAUSD), then experiencing unprecedented growth, the cut "was a huge blow," says Laura Chinnock, now the assistant director of the district's mammoth food services department. "What that did was to force us to make changes in the existing infrastructure instead of expanding. So now we had to feed, say, two thousand kids through the old service windows that were built to service half that. Well, now double that—and keep in mind that the minimum legal amount of time for a child to eat lunch is twenty minutes—and you'll see why now some big schools have kids lining up at ten-thirty in the morning for lunch."
Try as they might, the period's food service directors could not make a cafeteria that once cooked for five hundred cook for five thousand. As Gene White, one of the state's most respected school nutritionists and a longtime policy hand, says, "If the school cafeteria couldn't cook the meals, the natural alternative was to get rid of a lot of the traditional cafeteria's functions. In the '80s, that meant what you might call outsourcing—cooking the meals someplace else and bringing them in to be reheated, or actually contracting with an outside source to deliver pre-plated meals." However one looked at it, the public school had lost control of many of the ingredients that went into that food. "Those pre-plated meals must meet some standard, but the overall quality is much like a TV dinner," says White. "I'll leave you to decide what that means."
Yet even outsourcing would fail to cure the cafeteria's chronic blues. Food service budgets simply failed to keep pace with growing school populations. Part of the problem was political. Not only did schools now have to compete for money with all other public services—the legacy of Jarvis—they increasingly had to do so without what was once their most politically influential supporter: middle-class parents, who were now defecting in droves to private schools. There was a cultural problem as well. With fast-food joints proliferating faster than ever, students were more and more likely to bypass the cafeteria completely and, when no one was looking, simply bolt from campus to McD's, rules to the contrary or no. Food service departments around the nation bled. Slowly but surely many came to the inevitable conclusion: Food service departments would have to become more entrepreneurial.
Fast-food makers had also come to a similar conclusion, for different reasons, but with very similar ends. For a decade firms like Taco Bell and Pizza Hut had tried—with occasional success—to develop institutional feeding programs. One way to do that was to sell frozen versions of their most popular products to large institutions. But frozen entrees never quite captured the imagination, let alone the taste buds, of increasingly sophisticated pizza chompers. Worse, to make the effort really work, fast-food makers would have to spend a great deal of money reformulating their products to meet USDA limits on fats and sugars in school lunch foods. And that conjured even more corporate dyspepsia: What would happen to their overall brand image if those reformulated products didn't taste as good as those plied by the same company's franchisees just down the street? Would that drive down regular sales as well? There had to be a way—but where was it?
The answer came in the early 1990s, when a group of enterprising Pizza Hut salespeople asked: Why not—instead of trying to qualify Pizza Hut pizzas under the school lunch program—find a way to sell the pizzas outside of the federally regulated cafeterias, say, out on the lawn, or on the playground, or even over by the old vending machine areas? The executives took the idea to several large school districts. One of them was Los Angeles Unified. There, as one nutrition director says, "it was as if this huge light bulb went on." Not only could the district get out of the never-ending battles with the USDA and Pizza Hut over reformulation, it could also make some money on its own by purchasing the pizzas centrally and then selling them at a markup. And by offering a branded product, it might additionally keep students off the streets and on campus.
As it would evolve, the deal came with a number of other perks as well. Fast-food companies helped underwrite the purchase of zippy new "food carts," to be placed strategically about the campus during lunch and break time—in essence becoming the new food service equipment program. There were added incentives for schools that sold the pizzas at glee club meetings and for those who used them for campus fund-raisers.
But the single most important innovation was the way in which individual schools actually got the pizzas. It worked like this: Every morning the school's cafeteria manager would estimate how many pizzas—or tacos, or burritos—the "non-cafeteria eaters" would likely consume that day. The manager would then call a designated local Pizza Hut franchisee and place the order. Just before lunch the franchisee would deliver the piping-hot pies to cafeteria workers, who would load them into the shining Pizza Hut food carts and send them off to the waiting students. In the parlance of management, it was a win-win situation: Schools had found a way to feed kids economically and to keep them on campus; fast-food companies got a toehold in a market that had been unreachable—and without the expense of having to obey the law. The students? They got an opportunity to eat ... the same food that more and more of them were eating at home. By 1999, 95 percent of 345 California high schools surveyed by the nonprofit Public Health Institute were offering branded fast foods as a la carte entree items for lunch. At 71 percent of those schools, fast food made up a substantial portion of total food sales—up to 70 percent. Seventy-two percent of the same schools permitted fast-food and beverage advertising on campus.
But what really was wrought? Who really was served? Certainly anti–fast-food activists now had a genuine beef with school administration. Not only had "the system" found a way around the well-intended (and very healthy) USDA guidelines, it had also instigated another problem: dietary overconsumption. Portion sizes for pizza were a case in point. The cafeteria dispensed individual pizzas that, by law, corresponded to USDA portion recommendations. In the LAUSD, for example, a typical individual school lunch pizza runs somewhere around 5.5 ounces. A typical food cart, or branded, pizza—sold outside the cafeteria and thus unrestrained by such regulation—weighs in at almost twice that. The school lunch pizza had 375 calories, the branded "personal" pizza more than twice that—almost one-third of the recommended daily calories for a typical American teenager. The schools had lost control of calories.
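To make the arithmetic explicit, take the branded pizza at roughly 750 calories (a bit more than twice the cafeteria version's 375) and assume a reference intake of about 2,200 to 2,400 calories a day for a teenager; both figures are rough working assumptions rather than the district's own numbers:

\[
\frac{750\ \text{kcal}}{2{,}200\text{--}2{,}400\ \text{kcal/day}} \approx 31\text{--}34\ \text{percent}
\]

Roughly a third of a day's recommended calories, in other words, delivered in a single lunchtime item.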
Among obesity-minded nutritionists, such was clearly cause for concern. But now the school district was hooked. Concerned about the enormous calorie count, Laura Chinnock, the LAUSD veteran, went back to Pizza Hut reps and proposed that the food cart pizza be just slightly reformulated—"just use some lower-fat cheese, for example." She was immediately rebuffed. "The concern was that somehow that would affect the taste—and that would somehow taint the overall Pizza Hut product. Kids might not buy so much of it on the way home or for dinner, say." Chinnock repeatedly took the issue up with her superiors at LAUSD, but with the nation's largest and most tumultuous school system in a constant budgetary crisis, anything that might cost money simply did not make it onto the agenda.
But something else had happened as well—something no one, save, likely, the Pizza Hut people, had seen: The food carts—the latter-day successors to the dingy old vending areas—had become "cool." Whatever they purveyed had cachet—it sold and sold and sold and sold. Intrigued by this, Chinnock one day decided to try an experiment. Without telling students, teachers, or Pizza Hut vendors, she substituted USDA-formulated pizzas for the usual branded pizzas. "The response was nil—they gobbled them up just like usual," she recalls. "They were basically eating the brand."
By the mid-1990s school principals had also joined the brand-fest. Faced with continuing shortfalls in funds for sports teams, academic clubs, plant upkeep, and even janitorial services, they were a receptive audience to new overtures from the soft drink industry. This time the inducements came in the form of "pouring contracts." Such contracts typically involved three monetary perks for three contractual promises. For agreeing to sell only, say, Coke, a school would receive commissions and a yearly bonus payment—sometimes as much as $100,000—to do with as it liked. In return for putting up Coke advertising around the campus, the school would receive free product to sell at fund-raising events. And in return for making the company's carbonated beverages available during all hours, Coke would provide additional "marketing" tools—banners, posters, etc.—to aid still more school fund-raising events. In a time of tight funds and rising expectations, such contracts proved enormously popular. In Los Angeles, sports-minded parents became some of the biggest advocates of pouring contracts. It is doubtful, however, that those same parents had any clue about what soft drinks were doing to their children's overall diets, not to mention health.
Between 1989 and 1994 consumption of soft drinks by kids soared. The USDA estimated that the proportion of adolescent boys and girls consuming soft drinks on any given day increased by 74 percent and 65 percent, respectively. In many ways the pattern reflected the adult population, where, between 1989 and 1994, soda consumption jumped from 34.7 to 40.3 gallons a year. But the kids were doing something with the soda that few people initially understood: They were drinking it in place of milk and other important nutrient-rich foods.
Worse, they were not compensating for those extra empty calories when they sat down for regular meals. A joint study by Harvard University and Boston Children's Hospital researchers in February 2001 concluded that such excess liquid calories inhibited the ability of older children to compensate at mealtime, leading to caloric imbalance and, in time, obesity. "Compensation for energy consumed in liquid form, which can be observed in very young children (4–5 years)," reviewers of the study concluded, "is lost rapidly in the following years."
When it came to food—and particularly when it came to setting boundaries on its consumption—family and school were hardly alone as they drifted through the 1980s. That other great arbiter of modern life, the media, was also at sea.
For most of the postwar period, the publishers of American diet books were a somewhat predictable lot. While editors might occasionally publish a celebrity diet or a quirky new fitness regimen, the general approach of diet books to weight loss mirrored what physicians, scientists, and nutritionists had always advised: to maintain weight one had to balance calories in with calories out. To lose it, one had to consume fewer and expend more. The lone dissenter was a Cornell University–trained physician named Robert C. Atkins. In 1972 Atkins published a small book that turned conventional wisdom on its head. Instead of counting calories, and always thinking about what one couldn't have, a person who really wanted to lose weight had to find a way to do so pleasantly. And Atkins had found the way.
The way, in fact, was simple—and, as Atkins never failed to note, very scientific. Human beings, he would begin, need three basic nutrients—proteins, fats, and carbohydrates. Once inside the body, proteins were broken down to replenish muscles and tissues, fats were burned or stored for future energy use, and carbohydrates were burned for immediate energy needs. The carbohydrates that didn't get used—and this was key to the Atkins diet—were stored by the liver as glycogen and, once those stores were full, converted to fat. If the body did not get enough carbohydrates during the day, it would eventually begin to "burn" its fat stores. It was that last bit of information that could make all the difference for the frustrated dieter, Atkins said. If one deprived the body of carbohydrates—sugars—one could "trick" the body into burning its own fat stores. The added bonus of such a system was that one could consume all the fats and proteins one wanted, since the revved-up Atkinized body would either use them for muscle or burn them away.
Not surprisingly, the book, Dr. Atkins' Diet Revolution, went to the top of the charts.
Yet much of mainstream publishing remained wary of Atkins. Some old-time editors and critics knew that such a diet had been proffered, on and off, ever since the mid-nineteenth century, when it was popularized by a retired London undertaker named William Banting. Banting, who lost some fifty pounds on the regimen, had published a pamphlet, Letter on Corpulence, that had eventually caught the eye of late-nineteenth-century Americans. In the intervening century, the Banting "scheme," as it was inevitably called, had popped up with astounding regularity about every twenty-five years. At one point it was even listed as a "cure" for obesity in the Merck Manual, the prestigious physician's handbook.
The nutritionally savvy knew something else about the Banting-Atkins scheme: It was full of medical mumbo jumbo and fraught with potential peril for anyone who followed it for a sustained period of time. It was true that the body stored excess carbohydrates as fat, for example, but it was not so clear that depriving the body of carbohydrates induced the revved-up, fat-burning state that Atkins claimed. It was also unclear what medical consequences flowed from consuming enormous amounts of fat and protein. Gout—something long considered erased in modern times—was an ongoing concern.
But perhaps the biggest objection to the diet was that in the early 1970s the great mass of people simply could not afford to substitute meat for the bulkier—and stomach-filling—meal components like bread and potatoes.
As the 1980s dawned in the major New York houses, two forces colluded to erase the old editors' reluctance to promote "all the meat you want." For one, meat prices were now increasingly within the reach of the average Joe. Butz's revolution in commodity prices had seen to that. Eating a giant hamburger patty and cheese three times a day, or "all the bacon and pork rinds you can," was actually economically viable.
The other factor was publishing itself. The older, medically attuned editors were either retiring or, worse, facing increased pressure to come up with hot new diet books. If they didn't, they were told, someone else would. Calories in, calories out—that was not only boring, but the franchise for it had also been virtually sewed up by Weight Watchers. It was time to offer a bold new category of diet books—or risk losing the opportunity to the newly competitive alternative diet publishers like Atkins and his imitators.
The result was not only an outpouring of Atkins-like low-carb diets, but a similar gusher of other "all you can eat" diets. In 1989 W. W. Norton published The T-Factor Diet, inverting Atkins's claim and instead focusing on fat as the villain. The book promised that one could "Lose Weight Safely and Quickly Without Cutting Calories—or Even Counting Them!" The key, author Martin Katahn wrote, was something called the "thermogenic effect," the ability of certain foods, in this case not protein as in Atkins but instead carbohydrates, to "rev up" one's fat-burning engine. Although the idea of a thermogenic effect had been hotly debated by scientists and diet pill makers for decades, Katahn and his editors decided to render it as fact. "It is primarily fat in your diet that determines your body fat, and protein and carbohydrate calories don't really matter very much," Katahn wrote. "Once you start replacing some of that fat with carbohydrates you will unlock your body's hidden fat-burning potential: that's the T-factor at work!"
In 1993 Dean Ornish, a California heart specialist who had reported remarkable results reversing heart disease by having patients follow a very low fat diet, joined the all-you-can-eat bandwagon. Now, instead of prescribing his extremely low fat diet for medical patients, he extended its prescriptions to a larger audience. As his book jacket described it: "Dr. Ornish's program takes a new approach, one scientifically based on the type of food rather than the amount of food. Abundance rather than hunger and deprivation—so you can eat more frequently, eat a greater quantity of food, and still lose weight and keep it off!"
By 1995, however, Atkinism was back again, this time retooled by HarperCollins and Barry Sears. Reacting to the growing obesity statistics despite the early 1990s consensus that it was fat and not carbs that was the villain, Sears went back to a low-carb basic: "Basta with pasta!" he proclaimed. And forget about exercising too. If one only mixed the right foods, why, "you can burn more fat watching TV than by exercising," he idiotically promised. That same year Bantam introduced Michael and Mary Dan Eades and their notion of "Protein Power," in which one could "eat all the foods you love—steaks, bacon and burgers, cheese and eggs."
The point, of course, is not that the publishing industry and its new ancillary industries in the diet supplement and video sectors were publishing pure schlock (although most of it was). There had been legitimate scientific debate about such things as thermogenesis, fat metabolism, and the metabolic effects of various foods since the mid-nineteenth century, when French scientists like Claude Bernard first discovered the glycogenic (glucose-making) function of the human liver. The point is what the new diets did not say. For completely missing from the new genre was one increasingly strange and distant concept: self-control.
The very notion of self-control was anathema to the new generation of diet books. A diet—even a weight loss diet—was no longer about limits to one's gratification. Instead, the subtext was one of scientific entitlement. After all, if one had worked so hard to get so far in one's career, well, how could self-control really be an issue? To even suggest such was to make fat a moral issue—and how appropriate was that? No, it was all a matter of using nutritional science to "trick" the body into doing what it should be doing anyway.
The new boundary-free notions about consumption weren't purely the province of diet books. In the South and in the Midwest, where conservative Christians had long valued such notions as self-control and personal responsibility, something was amiss as well—namely, a certain sin known as gluttony, which had somehow gotten a good name.
To be fair, it had never had a very bad one—at least not in the United States and not in most Protestant denominations. The seven deadly sins—those were largely Catholic notions, wrapped up as they were with papist ideas of sin and church-administered sacraments. (It says something that one of the most foreign-seeming things in the recent hit movie Chocolat was the obsession of the little French town's pious Catholic elder with the sin of gluttony.)
Yet the sin of overconsumption was something that had preoccupied a number of American clerics over the years. The early nineteenth century's Sylvester Graham, a Presbyterian minister and the inventor of the graham cracker, had regularly attacked overeating as the source of moral turpitude. As Graham saw it, overeating was a form of overstimulation, which could lead to no end of sinful behaviors.
Later anti-gluttons took a more pragmatic tack. In the 1950s, Charles Shedd, another Presbyterian, wrote a book entitled Pray Your Weight Away. Its message was simple: God did not make man to be fat. "When God first dreamed you into creation," Shedd wrote, "there weren't one hundred pounds of excess avoirdupois hanging around your belt." By being fat one was cutting oneself off from the joy that Christ had died to confer on us all. Shedd thus proposed a series of prayer-based activities designed to right the imbalance. There were mealtime affirmations like "Today my body belongs to God. Today I live for him. Today I eat with him." There was faith-based physical exercise. One involved fifteen minutes of karate kicks, executed while reciting the third chapter of Proverbs; another required one to time one's sit-ups to the spoken rhythm of Psalm 19. As R. Marie Griffith, the author of God's Daughters: Evangelical Women and the Power of Submission, observes, Shedd was particularly noteworthy because "he balanced his moral rebuke with positive thinking." His message persisted well into the early 1970s, when he published The Fat Is in Your Head.
By then things in the American congregation were changing. Fundamentalism brought with it a revival of biblical literalism, and the view that the spiritual world is split between the soul and the body. This worldview had a strange effect on the priority one placed on such things as fatness or thinness, let alone general health. As the Christian journal Communique noted in 2000, "Literalists are prone to view biblical texts denouncing 'the flesh' as references to the human body, instead of as symbolic of our human sin nature. Thus, they reason that since the body is evil and mortal, and the soul good and immortal, our priority is to nurture the soul, even if it means neglecting the body." With their reliance placed firmly on a personal Savior, the new conservative Christians also tended to be more fatalistic when it came to illness; He would take care of them, fat or thin. And fat—that seemed to come more naturally.
It was also politically pragmatic. For the leaders of many American congregations, the challenge of the era was competing with the permissiveness rising in secular America. That meant "a little bit o' sugar," as one pastor recalls. Along with literalist, moral preaching about things like homosexuality and abortion would come a new tolerance for "the little sins." (Later on, when many of the new leaders had had their own personal failings televised widely, this doctrine became self-protective as well.) New seminarians were thus told that "holding the flock together" meant accentuating similarities. The same thing was taking place within more liberal circles. At places like Fuller Seminary in Pasadena, California, the student bookstore carried more titles about self-acceptance than it did about traditional moral failings. (Asked where a book about gluttony or sloth might be shelved, a visitor was told: "Where else? In self-help.") The end result of this reorientation, as Marie Griffith says, was that "the American church became like therapy. It was suddenly all about love and tolerance and acceptance, not about individual discipline."
There is, of course, a societal cost to religion's abandonment of the little sins. Religion, like belts or modest meal portions or argumentative family dinners, is a maker of boundaries. Religious beliefs generate the development of moral communities, which, in turn, serve to guide and constrain the action of individuals. As the sociologist Émile Durkheim observed early in the twentieth century, without a religion's "system of interdicts," a society will flounder. (Toynbee agreed, albeit in a secular manner, by noting that the disintegration of a civilization is always marked by "a surrender to a sense of promiscuity.") The relevant point here is clear. If, as Durkheim concluded, God and society "are only one," can there ever be a little sin, at least where religion is concerned?
By the '90s, with such purely theological considerations aside, scholars who studied the sociology of religion began to notice a growing trend: Not only did religion no longer address overconsumption, it seemed somehow implicated in just the opposite—in aiding and abetting overeating. In a 1998 study looking at 3500 U.S. adults, the Purdue University sociologist Kenneth F. Ferraro sought to answer two interrelated questions: One, was religion related to body weight, especially obesity, and two, did religion intensify, mitigate, or counterbalance the effects of body weight on well-being? To the first, the answer was qualified: Obesity was highest in states where religious affiliation was highest, but the specific differences in body weight between groups were more likely explained by differences in class, ethnicity, and marital status. Of all the religious groups surveyed, Southern Baptists were heaviest, followed by Fundamentalist and Pietistic Protestants. Catholics fell at the middle of the list, while the lowest average body weight was found among Jews and non-Christians. Surveying attitudes within those groups, Ferraro concluded that obesity was associated with higher levels of religiosity. If one factored in the fact that many of these believers were also of low socioeconomic status, one could almost conclude that eating and religion had become a unified coping strategy. "Consolation and comfort from religion and from eating," Ferraro wrote, "may be a couple of the few pleasures accessible to populations which are economically and politically deprived."
To the second question—did modern religion act to inhibit gluttony or obesity—the answer was more surprising. It didn't. Instead, the church had become a nest of unqualified social acceptance. As Ferraro wrote: "There is no evidence of religion operating as a moral constraint on obesity." Instead, Ferraro went on, "higher religious practice was more common among overweight persons, perhaps reflecting religion's emphasis upon tolerating human weakness and its emphasis upon other forms of deviancy such as alcoholism, smoking and sexual promiscuity."
Ferraro warned that it wasn't that religion indirectly promoted higher body weight. Rather, most pastors simply saw obesity and overeating as too risky a subject. "They feel they would risk alienating the flock—at least at this point," says Ferraro. "In that sense we are in a stage with obesity like we were with smoking in the 1950s and 1960s."
And so when it came to overeating, gluttony, and obesity, Christians, like everyone else in America, were in deep, deep denial. As Jerry Falwell said when he heard about Ferraro's findings, "I know gluttony is a bad thing. But I don't know many gluttons."
Family, school, culture, religion—in the late twentieth century, the figurative belt had not only been loosened, it had come off. But what of the literal belt? What of the most traditional measure—and reminder—of excess girth?
While it was true that Americans had been dressing more casually for much of the '60s and '70s, it was also true that they had retained a notion that a good public figure was a lean public figure. High and, more important, middle fashion certainly promoted that, particularly the ultra-slim fashions of the disco years. Jeans companies, particularly Levi Strauss, had not only serviced those inclinations but also helped to create them. The firm's ultra-slim cuts of the late 1970s were so ubiquitous as to inspire caricature in a number of teen movies. A typical scene took place in a Valley jeans boutique, where the jeans were so slim—and the girls so determined to wear them—that they all had to lie down on the floor and wiggle like worms in order to get into the tiny pants.
By the mid-1980s, however, both Levi Strauss and its new competitor, the Gap, had retooled their sizing. Market research had shown that the boomers—the spenders—were getting larger, and, typically, that they did not want to be reminded of their largeness. Nearly overnight, the ultra-slim cuts were gone. In fact, what was once a regular cut was now a "slim cut." And now came a whole slew of "new" cuts. A person who wore, say, a size 34 waist and 32 length in a traditional jean could now pick from at least four options: regular fit, easy fit, loose fit, and baggy fit. The unspoken reality of all these new cuts—something everyone knew and everyone winked at—was this: The new cuts were in reality simply bigger sizes, without the bigger numbers.
By the '90s this trend was joined by a number of upstart purveyors of so-called street fashion. Taking their cue from the baggy-pants prison garb of the nation's rap stars—many of them not just fat but morbidly obese—such enterprises prided themselves on making "fat" into "phat." Phat, they would proclaim, was really about empowerment—about rejecting mainstream notions of power and fashion and conformity. "We about a buncha obese playboys!" proclaimed the rap star Big Pun in 1999. It didn't hurt that the same attitude also sold millions of records.
But that attitude was also a tremendous enabler. Consider Big Pun's story. Pun—his real name was Chris Rios—had by the mid-1990s risen from obscurity in the Bronx to become one of the most promising of a new generation of Latino rap stars. Along the way, his girth had ballooned, from about 220 pounds in the early 1990s to about 400 in the mid-1990s. The man who discovered him, the rap star Fat Joe, saw an advantage to that. "A lot of Latinos and blacks are overweight, so they could relate to this guy," he recalls. "A lot of people think that beautiful is trim and fit, but it ain't. It's what's inside. That's off the rack." The record company that eventually signed Pun, Loud Records, used his round image in its promotions, retooling the slim Michael Jordan figure of the Nike Air ads into one featuring a short round body.
By 1998 Pun had ballooned to 500 pounds. His records were hotter than ever. As typically happens with a young, charismatic star—one thinks, for example, of John Belushi—he was soon surrounded by yes-men and yes-women. The yes-men and the yes-women brought him not drugs but food. "They got him whatever he wanted," one family member recalls. "If we went out to McDonald's, it would be fifty dollars' worth of food for the whole group, and about twenty of that would be our portion of the bill, and then he would be eating our food as well." A friend recalls how those catering to Pun buttressed his sense of denial about his obesity. "People would tie his shoes for him, or push him around in a wheelchair when he didn't feel like walking, or buy him clothes and hide what size they were," she says. "If he was a size 10xx, people would buy him three to five sizes bigger, so he'd never know how much he was gaining."
And gain Pun did. By the time he died of a massive heart attack, at twenty-eight, he weighed 698 pounds. "That was a big shock," says the same family member, "because everyone in rap is always dying from violence and then we're told that he died because of his eating!"
At the suburban mall, the enablers were not rap and baggy pants but rather clothes made with Lycra spandex, a postwar synthetic that the aspirant classes had long considered déclassé in the extreme. No longer. Thanks to the health club boom and the incessant marketing of Olympic stars who wore tight spandex during their televised athletic triumphs, everyone thought they could wear the stuff. Particularly the middle-aged. It felt so ... good. And didn't it kind of make one look ... slimmer?
Well, no. At least not if one were from beyond the American mall. Consider the experience of Johannes Hebebrand, a professor of physiology from the University of Marburg in Germany. Hebebrand, whose specialty is obesity and its social origins, was visiting New York in the late 1990s as part of a series of studies he was conducting on social stigma and its psychological effect on the obese. His operative notion was that, since fashion magazines and movies had so glorified thinness—and denigrated fatness—fat people would be less likely to present themselves as fat in public. Such was his thesis.
But stepping off the plane and into the nation's shopping malls, Hebebrand was "floored"—what he was seeing was exactly the opposite. "I mean, here were all of these women, wearing this kind of tight black stretch thing!" Hebebrand recalls. "They were huge—their bellies and their derrieres were almost comic-book-sized! I was shocked because in Germany people who are that fat just don't go out. They don't go out because of the shame. But it wasn't the case here in the U.S."
In recent years, big sizes have become an increasingly necessary part of any clothing company's survival strategy. Large sizes account for a growing segment of the total clothing market, rising from about 7.5 percent in 1995 to about 9.4 percent in 2000. Sales of women's sizes 16 and up have risen steadily since 1997, with a 22.2 percent jump between 1999 and 2000. Moreover, the new big sizes are no longer confined to the plus-size sections of major department stores. The Gap, for example, recently nudged up its selections to a size 16, as has the ultra-trendy sportswear label FUBU. Tommy Hilfiger has plans to launch a plus-size line. And in mid-2001, the edgy retailer known as Hot Topic, with 291 stores nationwide, opened its first store for sizes 14 to 26. The firm estimates that about 30 percent of young women in the United States wear a size 14 or bigger. "This is one of the hot new target audiences," says Candace Corlett, a partner with WSL Strategic Retail, a consulting firm in Manhattan. "The population has grown heavier; the insurance companies are starting to redefine the weight groups; and we seem to be becoming more and more accepting of large people. It's almost the polar opposite of where we were in the '60s." That is, when we weren't so obese.
There are, of course, good and rational reasons to expand the clothing choices available to young people. Youth is a time of great changes in body size and shape; sometimes outsize garments are not a matter of style but of necessity. It is also a time when vulnerable egos can become warped by the inevitable teasing that comes with being overweight or obese. Having stylish clothes like everyone else can alleviate some of that social strain.
But we would be fooling ourselves if, as a culture, we came to believe that such accommodations come without a price, and perhaps a sizable one. Science, history, and common sense all hold that physical reminders of one's excess girth are critical when it comes to controlling further weight gain. One of the first things that experts in the science of weight loss recommend to patients who have lost weight, for example, is to get rid of their old, big-size clothing. The presence of such old clothes simply makes it easier for a person to gain weight; there is something comfortable to go back to. Researchers in the science of satiety—the study of when someone feels full and satisfied with a meal—point to something else. A slight tug at one's waist seems to perform two vital weight-maintaining functions. The first involves the so-called stretch factor, the brain-signaling that occurs when one's stomach is stretched by food intake. Those signals tell the brain when one is satisfied, telegraphing the message that one has eaten enough. A tug at the waist—something absent or diminished by spandex or extra-large-size pants—seems to accentuate that signaling.
Then there is what might be called the theory of the belt, which holds that people will watch and maintain their weight better when slightly uncomfortable clothing warns them that they are gaining. Although the theory is largely the product of accumulated experience and folk wisdom, a small but important body of science now upholds it. In the early 1980s, John Garrow, the dean of British obesity studies, looked at the post–weight loss experience of a group of obese patients who had had their jaws wired. Garrow wanted to know if the patients could be prevented from regaining their weight through psychological reminders, or "cognitive thresholds."
Garrow's obese patients who had maintained their weight loss had reported that they now wore new, smaller clothes. Garrow proposed a test. He fitted half of his subjects with a 2-millimeter-wide nylon waist cord, one tight enough to leave a white—but not red—line when the wearer was seated. A control group was not fitted. The results, he wrote, showed "striking differences between the two groups ... in the weight change after the wires were removed." In the control group, the predictable weight gain had commenced full throttle—at about 1.8 kilograms a month. In the group with the waist cords, however, there was no significant weight gain. Surprisingly, the belt effect seemed to be a lasting one. Five months after the unwiring, the waist cord group had gained significantly less weight than the control group, and the average difference between the two groups "thereafter steadily increased."
Which brings us, full circle, back to our friends at the Olive Garden...
About two months after he first heard from Larry, the customer who had complained about how small all the chairs were in his local Olive Garden restaurant, Ron Magruder, the chain's president, received another call. It was Larry again. He was calling in response to a follow-up query from one of Magruder's staff. The staff had been busily making sure that all of the chain's restaurants now had at least three chairs that could accommodate the more amply endowed and had wanted Larry to report what he thought of their efforts.
Well, he was happier now. Indeed, Larry's message was entirely conciliatory—even thankful. But it wasn't because of the bigger chairs. It was because of the old small chairs. Largely because of them, Larry explained, he had been spurred to finally confront the extent of his weight problem. Why, in the seven weeks since he had spoken to Magruder, he had lost almost fifty pounds.
That tight little chair—that had been what Larry needed after all.