If Pope Gregory the Great had it right, the French are going straight to hell. Gregory wasn’t anti-French per se; it’s just that the portrait of the glutton drawn in his Morals on the Book of Job bears some resemblance to the modern Frenchman. Gregory made it impossible to gain any pleasure at all out of eating when he listed not one, but five ways to sin by gluttony. We have the obvious ‘too greedily’ and ‘too much’, but we also have less straightforwardly condemnable modes of eating: ‘too early’, ‘too expensively’ and ‘with too much focus on how the food is prepared’.
This multidimensional, Gregorian brand of gluttony looks very much like the modern French attitude towards food. In a world in which many of us are especially concerned with the nutritional aspects of what we eat – high GI or low, good fat or bad – the French have a refreshingly insouciant attitude towards the culinary, an attitude that revolves around the pleasures and experience of eating.1 Although refreshing now, this pleasure-maximizing attitude towards food was anathema in the medieval monasteries in which the deadly sins were codified. When gluttony was deemed deadly, pleasurable overindulgence in food and drink spoke of an ungodly preoccupation with earthly, bodily pleasures, which came at the expense of a more proper focus on the divine and spiritual.
These days, of course, gluttony is no longer the multi-headed beast that Gregory described and condemned. Gluttony is now one-dimensional; it is all about eating too much and is moralized because of its link to obesity. For many, ‘gluttony’ is synonymous with ‘fat’. The good news for the French, however, is that on this contemporary gluttony = obesity account, it’s the Americans, not the French, who are on their way to hell. For despite its gluttonous attitude to food, France has a lower prevalence of obesity than the United States.2
Of course, gluttony and obesity aren’t the same thing. Even if we confine ourselves to the ‘eating too much’ definition of the sin, it’s clear that gluttony is a manner of eating, whereas obesity is a condition of the body, defined technically as a body mass index greater than 30. (It’s also interesting to note that when ‘eating too much’ was first added to the list of sins, it had nothing to do with being fat. It was taking pleasure in the excesses of food that was the sin, not an expansive BMI.3)
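For the curious, the calculation behind that threshold is simple arithmetic: weight in kilograms divided by the square of height in metres. Here is a minimal sketch, with entirely hypothetical figures:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# A hypothetical 1.75 m person at two different weights:
print(round(bmi(70, 1.75), 1))   # 22.9 -- comfortably below the threshold
print(round(bmi(95, 1.75), 1))   # 31.0 -- above the obesity threshold of 30
```

Note what the formula does not measure: how, why, or with how much pleasure the weight was acquired – which is precisely the point of distinguishing the condition from the sin.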
Although this chapter is about the many forms of gluttony (and not obesity), the connection between the two is so tightly drawn in the modern world that I would be remiss not to say a few words on the gluttony–obesity link.
Many people attribute obesity and being overweight to the gluttonous impulses and moral failings of the individual. People who regularly eat burgers, fries and double-fudge ice cream sundaes are seen as more unethical and inconsiderate than those who eat fruit and salad.4 The overweight are judged to be morally corrupt consumers of toxic junk, lazy and lacking in self-discipline.5
But blaming obesity on the moral failings of the gluttonous individual is counterproductive for at least two reasons. First, in moralizing fat, one stigmatizes the obese and overweight, which not only makes life less than pleasant for these individuals, but also increases the tendency to overeat, thereby compounding the problem.6
The second issue with attributing obesity to mere gluttony is that it overlooks the critical role that the environment plays in shaping how, what, and how much we eat.
Social psychologists have long known that human behaviour of all kinds is at the mercy of the environment. Eating is no exception. The relationship between the gluttonous drive to consume and its impact on the body depends crucially on one’s surroundings. Put simply: gluttony is adaptive in environments in which calories are scarce (like the African savannas of our deep evolutionary history) but not in those in which calories are plentiful (like Mississippi). The desire to eat and eat a lot is what kept humans alive and healthy enough to reproduce in the relatively food-impoverished conditions of our evolutionary past. We evolved to eat much and do little, a sensible strategy when calories are hard won. In evolutionary terms, gluttons survived; dieters did not.
But gluttony looks a little different in the contemporary industrialized Western world. Because of an abundance of convenient calories in our environment, our evolved drive to rampantly consume has become maladaptive. The real problem is not gluttony per se, but gluttony in the midst of the caloric excesses of McDonald’s, KFC and frozen microwavable dinners. When calories are hard to come by, an insatiable appetite is an insurance policy, an adaptive motivation that helps one pack away a few extra pounds for a rainy and food-scarce day. But when the same basic drive can be satisfied with a short trip from the couch to the plentifully stocked fridge, the gluttonous urge to consume sets us off on the path to obesity.
The role of the environment in consumption is difficult to overstate. Take the simple environmental cue of portion size.
Brian Wansink, a food researcher at Cornell University, has spent a considerable amount of time studying the impact of serving sizes on eating habits. Time and again he has shown that the amount of food one is served determines the amount of food one eats. People given a one-pound bag of M&Ms eat about twice as many as those given a half-pound bag; cinema-goers given large servings of popcorn eat about ten grams more than those given medium servings.7 More generally: double the size of a meal, and you’ll eat up to 25 per cent more; double the size of a snack, and you’ll consume up to 45 per cent more.8
Portion sizes sway our eating habits for a couple of reasons. First, serving sizes indicate consumption norms. They say, ‘This is how much most people are eating; this is what is normal and appropriate.’
But serving sizes also drive consumption because of what Wansink calls the ‘clean the plate’ phenomenon. Somewhat counter-intuitively, we stop eating not when we’re full, but when the plate in front of us is empty. If the plate isn’t empty, we won’t stop eating.
In one beautifully devious study, Wansink and a couple of his colleagues brought students into their lab for a ‘soup-only lunch’.9 After being seated in groups of four, these students were told that they would be eating tomato soup, a new recipe, and they were encouraged to eat as much as they liked. About twenty minutes later they completed a questionnaire asking them what they thought of the soup, how much they thought they ate, and other, similar questions. What Wansink was interested in was how much soup these participants ate. And he found something quite interesting: some participants were consistently eating about 75 per cent more soup than others.
Why? What was so special about these apparent soup-lovers?
Well, here’s the devious part. Even though all of the students were eating out of what appeared to be identical bowls, some bowls were in fact ‘bottomless’. Wansink had rigged some of the bowls to be self-refilling. Unbeknownst to participants, two of the bowls at every sitting were connected via a hidden tube to large pots of soup concealed beneath the table. Wansink had created an elaborate filling and drainage system, which ensured that, as participants ate, the bowl would imperceptibly refill.
What Wansink was doing was manipulating the clean-the-plate phenomenon. Some participants were able to clean their plates while others were prevented from doing so by Wansink’s refilling system. And it was the latter participants who ate much more. These unwitting soup lovers gorged themselves not because they were hungrier, not because they were obsessed with tomato soup, but because they never saw the bottom of their bowls.
But portion size is not the only environmental cue that messes with our eating habits.
Here’s a question: from which glass on the next page do you think you’d drink more – the highball on the left, or the tumbler on the right?
Most people think they’d consume more out of the highball, but in fact people drink more from the shorter, wider tumbler.10 Because we place undue emphasis on the vertical dimension of things (at the expense of the horizontal), height looms larger than width in our judgments. This overemphasis on height makes us believe that the taller highball must hold more.
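The geometry is against us here: the volume of a roughly cylindrical glass grows with the square of its radius but only linearly with its height, so width matters far more than it looks. A quick sketch with made-up dimensions:

```python
import math

def cylinder_volume_ml(height_cm: float, radius_cm: float) -> float:
    """Volume of a cylindrical glass in millilitres (1 cm^3 = 1 ml)."""
    return math.pi * radius_cm ** 2 * height_cm

# Hypothetical glasses: the highball is nearly twice as tall,
# but the tumbler is wider -- and radius is squared in the formula.
highball = cylinder_volume_ml(height_cm=15, radius_cm=3)    # tall and narrow
tumbler = cylinder_volume_ml(height_cm=8, radius_cm=4.5)    # short and wide

print(f"highball: {highball:.0f} ml")  # 424 ml
print(f"tumbler:  {tumbler:.0f} ml")   # 509 ml
```

With these (invented) dimensions the squat tumbler holds about 20 per cent more than the highball that towers over it – which is roughly the illusion the research describes.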
The list of subtle environmental shapers goes on: we dish out and eat more ice cream from 34-ounce compared to 17-ounce bowls; we eat more Hershey’s Kisses if they’re in a transparent rather than a white bowl; and we eat more M&Ms if we have ten rather than seven colours to choose from.11
Despite the pervasiveness and potency of such environmental factors, much of this influence is outside of our awareness. We simply don’t appreciate the fact that our eating environments are constantly toying with us. When you suddenly realize that you’ve eaten half a dozen more chocolates than you should have, the reaction is never ‘Damn transparent bowl’; it’s more often ‘Damn, I must have been hungry’ or ‘God, why am I such a pig?’
Given this tendency to blame eating behaviour on internal factors, like cravings or hunger, it’s no wonder that those who eat a lot are demonized as gluttonous sinners, incapable of controlling their urges. No one attributes obesity to short glasses or transparent bowls; they attribute it to a lack of self-discipline.
None of this is to say that self-control and self-monitoring have no role to play in the regulation of food intake. The point, rather, is that the crude equation of obesity with gluttony (and the resulting moral condemnation of the overweight) is both counterproductive and simple-minded.
But enough about obesity; back to gluttony.
One of the real problems with mounting a defence of the ‘eating too much’ mode of gluttony is that sinfulness is inherent in the concept. It’s like asking whether murder is wrong. Murder is defined as unlawful killing. So yes, murder is wrong; it’s a matter of semantics. What we should be asking is not whether eating too much is bad (it is by definition, given the negative judgment implied by ‘too much’), but whether eating more is worse than eating less.
The glutton’s antithesis is the ascetic. It used to be that the ascetic was the solitary monk in a bare cell with a crust of bread and some water to sustain him as he sought communion with God. The modern incarnation of the ascetic is the dieter. The dieter’s cell is more metaphorical than literal; instead of water and bread we find all manner of wheat-grass-like faddish things; and the communion sought, while still suspiciously spiritual, is less often with God than with the ideal image of oneself.
There is perceived virtue in dieting, not only because it brings health benefits, but precisely because it embodies the self-discipline and purity that overindulgence does not. And although moderation and restraint sometimes glow with virtue, at other times it’s indulgence that wears the halo.
Food for thought
Take a few moments to try to solve the following problem:
On the next page are two configurations of beads, mounted on pegs. Your task is to turn the upper configuration into the lower configuration, using as few moves as possible.12
If you happen to be on a diet, you won’t be very good at this. At least, not as good as someone who isn’t watching his or her weight.13
This exercise is quite similar to what we psychologists call the Tower of London task. Such exercises are used to measure executive functioning, which is the mind’s capacity to regulate and control thought and to make complex decisions. The mind’s executive is much like a company’s executive: it runs things, plans, and generally directs operations. And just as the executive of any large organization requires ample resources to successfully run the show, so too does the brain’s managing director require cognitive resources to run the business of the mind.
So what’s wrong with dieters’ executives? Why can’t those watching their weight deal very well with the Tower of London?
One possibility is that dieters are preoccupied with food-related thoughts. Such a preoccupation uses up limited cognitive resources. So while the dieter’s executive is off worrying about what to eat for dinner, it’s unable to efficiently do other things.14
Another explanation is suggested by the work of University at Albany psychologist Matthew Gailliot.15 For Gailliot, the crucial player in the executive functioning game is C6H12O6, a simple monosaccharide. Its common name: glucose.
Glucose is the body’s fuel. It’s also the brain’s fuel. Our bodies turn the food that we eat into glucose, which is carried in the bloodstream to the brain, where it powers our thinking.
One can view glucose as a resource, which is used up by effortful mental activity. And one kind of effortful activity is the constant self-control exerted by dieters in trying to regulate their eating. According to Gailliot, glucose used up in self-control tasks is glucose unavailable for other demanding cognitive activities.
Does this mean that dieters are condemned to suffer the consequences of chronically underperforming executives?
Not quite. It turns out that a well-timed glucose hit may actually protect us against the depleting effects of self-control.
In one clever study, Gailliot and a long list of collaborators brought participants into the lab and had them watch a short film.16 The unusual thing about this film was that every ten seconds a word would flash in one of the bottom corners of the screen. Some participants were explicitly instructed not to look at these words. These participants were in the self-control condition because they had to consciously regulate their behaviour, redirecting their attention back to the action of the film should their gaze happen to drift onto the words. Other participants were not given these instructions; they were told to watch the film normally.
After watching the film, participants were given a drink and asked to rate it for pleasantness and other similar characteristics. Unbeknownst to participants, however, some were drinking lemonade sweetened with sugar, while others were drinking lemonade sweetened with a sugar substitute. Both drinks tasted equally sweet; the difference was that the sugary drink provided a source of glucose, whereas the artificially sweetened drink did not.
What Gailliot was interested in was whether a glucose hit would protect those in the self-control condition against performance decrements on a subsequent task. Remember that according to the glucose-as-resource account, participants who exerted effortful self-control while watching the film should have depleted their glucose stores somewhat, which means less energy for subsequent demanding tasks. Would a simple drink of lemonade counteract this effect?
To answer this question, Gailliot next had participants do the following task. Try it yourself.
Name the colours that these words are printed in.
Do not read the words, but name the colours.
This is called the Stroop task, and it is used by psychologists to measure selective attention, a central component of executive functioning. When people do this – and you probably noticed it yourself – they find it quite difficult to stop themselves from reading the words. For example, when the word ‘white’ is printed in grey ink, it takes longer to say ‘grey’ than when the colour of the font and the word match. You can’t help reading ‘white’, and this slows you down.
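The structure of a Stroop trial is easy to sketch. This is a toy illustration of the congruent/incongruent logic, not Gailliot’s actual materials:

```python
import random

COLOURS = ["red", "green", "blue", "yellow"]

def make_trial(congruent: bool) -> tuple[str, str]:
    """Return (word, ink_colour). In a congruent trial they match;
    in an incongruent trial -- the hard case -- they deliberately differ."""
    word = random.choice(COLOURS)
    if congruent:
        return word, word
    ink = random.choice([c for c in COLOURS if c != word])
    return word, ink

random.seed(1)
# The correct response is always the ink colour, never the word itself.
for congruent in (True, False):
    word, ink = make_trial(congruent)
    print(f"word={word!r:9} ink={ink!r:9} -> say {ink!r}")
```

The measure of interest in such experiments is reaction time: naming the ink is reliably slower on incongruent trials, and slower still when self-control resources have just been drained.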
What Gailliot found was that, as expected, participants who had previously engaged in self-control during the film were pretty bad at the Stroop task, unless they drank the sugary drink. These glucose-enriched participants performed just as well as those who didn’t engage in any self-control during the film.
And it’s not just executive functioning that gets a boost from a glucose hit. This miraculous monosaccharide has a range of interesting consequences for behaviour. It helps regulate attention. It helps regulate emotions. More glucose means better memory, faster reaction times, less aggression . . . the list goes on. Glucose may even help us cope with death.17
So the glutton who nabs that last glass of soda may actually be a little smarter than the dieter, a little faster, and a little better equipped to face the great beyond. All positives to be sure, but can we go further? Is the glutton, so often the moral outcast, actually morally superior to the dieter?
I want you to picture Barry. Barry is a ‘typical Australian man’. Now I want you to imagine what a day in the life of Barry would be like.
If you’re like many, this imagined day would be peppered with stereotypical assumptions: Barry might spend the morning riding around on a kangaroo, half drunk, shooting koalas, before returning home to his wife, Sheila, for an afternoon BBQ, some more beer, and an evening in front of the TV, watching cricket or Neighbours.
Most of us can bring stereotypes to mind with unsettling ease: the athletic African-American; the hardworking Asian; the illiterate, surfer Australian; and the gin-soaked, bad-teethed Brit. The annoying thing is that stereotypes often come to mind without our intending them to. They are ‘automatically activated’: by merely thinking about or encountering members of a stereotyped group, an interconnected web of group-related, stereotypical information comes to mind. One simply imagines ‘Barry’, and one almost can’t help thinking ‘kangaroo’.
But activation is not the same as expression. A stereotype may present itself to us (that is, become activated), but we need not present it to others: we don’t have to express the stereotypes that come to mind. We can in fact often suppress stereotypical information when it pops into our heads, pushing it back down into the brain’s recesses, but this requires effort and cognitive resources.
Now if one brings Gailliot’s glucose logic to stereotyping, one could speculate that people with sufficient levels of glucose might be better able to engage in the effortful struggle to suppress stereotypes.
To test the possibility that glucose reduces stereotype expression, Matthew Gailliot and some colleagues showed people a picture of Sammy, a ‘typical gay man’, and asked them to write a short story describing what Sammy might do during an average day.18 Now the critical manipulation was this: before writing their stories, half of Gailliot’s participants were given a sugary drink (easily and quickly processed by the body into glucose), and half were given an artificially sweetened placebo, containing no calories, and thus offering no glucose boost. When Gailliot later coded participants’ stories for stereotypical content, he found that people who had consumed the sugary drink were less likely to describe Sammy in a stereotypical fashion. These glucosed-up participants had the extra processing reserves required to suppress their stereotypes. They probably thought stereotypical thoughts while imagining Sammy’s day, but they were able to refrain from expressing those thoughts in their stories.
So, drink naturally sweetened lemonade and stereotype less. This is good news for the glutton, but just how far do the effects of consumption go in the realm of morality?
Barbara Briers, of the HEC School of Management in Paris, and her colleagues brought a group of people into their lab to do a couple of short studies on taste preferences and donation behaviour.19 The researchers had asked participants not to eat for the four hours prior to coming to the lab and to drink only water, tea, or coffee. This ensured that they were all reasonably hungry when they arrived.
When they got to the lab, half the subjects were assigned to the ‘satiated’ group. These people did a taste test first, during which they ate what Briers calls simply ‘a big piece of cake’ and then answered some questions about the cake’s taste, colour, texture and so on. Then, after a twenty-minute interval, a donation task was administered, in which participants were asked whether they would be willing to donate to a bunch of different charities – the Red Cross, Doctors Without Borders and so on.
The other half of the participants did exactly the same tasks, but in the reverse order – donation, then taste test.
The crucial difference between these groups was that those who did the donation task first were hungry while contemplating their donation preferences, whereas those who did it second were full.
What was of interest to Briers and her colleagues was the influence of hunger on subjects’ willingness to donate.
Previous research has shown that certain aspects of food and money are processed in similar parts of the brain – the orbitofrontal cortex (OFC) to be precise, which sits just above the eye sockets.20 The OFC is implicated in the processing of rewards. And what’s interesting about the OFC is that it doesn’t seem to matter what kind of reward we’re talking about. Whatever it is – food, money, sex, drugs – if it involves rewards, it seems to involve the OFC.
The fact that food-reward processing shares neural real estate with money processing suggested to Briers that money and food might, in some sense, be substitutable.
There is a Cree Indian saying: ‘Only when the last tree has died and the last river has been poisoned and the last fish has been caught will we realize that we cannot eat money.’
Well, don’t tell Briers or the OFC. As far as the orbitofrontal cortex is concerned, one can eat as much money as one wants. At the level of the brain, the desire for food looks much like the desire for money. So a hungry person might crave not only cake, but also cash. And just as the hungry person may hoard food, they may also have the urge to hoard whatever money they have.
This is exactly what Briers and her colleagues found: hungry people were less likely to donate than those who had gobbled down a piece of cake.
Briers found a similar result when hunger was induced with the scent of baking brownies: those who smelled baking brownies, and thus presumably felt hungry, gave less money to an interaction partner than did people not exposed to the scent.21
The upshot of this is that hunger may be less of a noble state than the ascetics would have us believe. Hunger is functional, not spiritual. It drives the organism to hoard – food, yes, but other rewards, too.
In defending gluttony, one must address not only the charge of eating much versus eating little, but also the plethora of other complaints outlined by killjoy Middle Ages monastics hell-bent on stripping all pleasure from culinary experience. Remember Gregory’s expansive list: too much, too greedily, too expensively . . .
The comprehensiveness of these prohibitions reflects the complexity of our relationship with food. For humans, food is much more than a simple energy source. Eating is not just a consumption experience; it is an aesthetic experience, a social experience, an identity-constituting experience. For early theologians, indulging any of these experiences too enthusiastically risked distancing oneself from God. But what do these other brands of gluttony mean for our everyday well-being? We have covered eating much versus little. What about the others? Let’s begin with ‘too expensively’.
The cost of gluttony
When it comes to eating, expectations matter: if a chocolate pudding has ‘healthy’ on the label, people think it tastes better; if restaurant customers believe their wine comes from North Dakota rather than California, they think it tastes worse; given the choice between ‘chocolate cake’ and ‘Belgian Black Forest Double Chocolate Cake’, people will go for the second every time.22 Expectations, be they health- or taste-based, change the experience of what we eat.
One of the strongest expectation generators when it comes to consumption is price. We know from research in marketing that as the price of a product goes up, so do perceptions of quality.23 The same goes for food and drink.
But just how strong are price-based expectations? Can they change the very nature of our culinary experiences?
Picture before you five wineglasses, each filled with a different Cabernet Sauvignon. You’re asked to taste these wines and rate them for pleasantness and intensity of taste. The five wines are differentiated only by their price tags.
Wine 1: £3
Wine 2: £6
Wine 3: £21
Wine 4: £27
Wine 5: £54
So, which wine do you prefer?
Well, when Hilke Plassmann of Caltech and some of her colleagues did exactly this study, they found, as might be expected, that people tended to like the higher-priced wines more than the lower-priced ones.24 A closer look at Plassmann’s data shows, for example, that wine from a £27 bottle was preferred to wine from a £3 bottle, and a £54 bottle was preferred over a £6 one. This makes sense: price often reflects quality.
But, as is the case with many stories in social psychology, there is a twist here. You think you’re tasting five different wines? Well, think again. You are in fact tasting only three: wines 1 and 4 are exactly the same, as are wines 2 and 5. So although Plassmann’s participants found the £27 wine significantly more pleasant than the £3 wine, they were in fact rating the same wine.
What’s happening here is that expectations are driving experience. A £27 wine should taste better than a £3 wine; a £54 bottle should be of a much higher standard than a £6 bottle. It is these ‘shoulds’, rather than the basic physical properties of the wines themselves, that shape our pleasure experiences.
So the glutton who eats ‘too expensively’ will indeed experience more pleasure. The interesting thing, however, is that simply believing that one is eating more extravagantly seems sufficient to do the trick.
The spice and bitterness of life
When Pope Gregory hinted that we shouldn’t be too concerned with how food is prepared, he was really trying to convince us to be content with the same boring grub, day in and day out. In essence, he was trying to bias us against variety.
Now, although folk wisdom has it that variety is the spice of life, when it comes to food, not all kinds of variety are equal.
Consider this scenario: you and a few friends sit down to dinner at a fancy restaurant. A waiter brings over some menus. You take a few minutes to peruse them and then you order.
At this, the ordering stage of the meal, variety presents itself in a few different guises. The menu itself is often an embarrassment of riches: duck risotto, penne ragu, aubergine parmigiana (this happens to be an Italian restaurant), caprese salad, primavera ravioli, lasagne . . . you get the idea.
The good news about variety on menus is that it often prompts healthier food choices. Because choosing from a long list is more difficult than choosing from a short one, people tend to make decisions that are easier to justify. And when seeking justification for their food choices, diners often favour health reasons over purely hedonistic ones, such as taste. So when people in one study were given the choice of fruit or biscuits, 55 per cent chose fruit when there were only two options in each category. But when six fruits and six kinds of biscuits were offered, a healthy 76 per cent chose fruit.25 More variety, healthier choices.
There is, however, an important caveat here: offer too much variety and people seem to go into some kind of meltdown. Give people twenty-four kinds of jam to choose from and they’re much less likely to make any choice at all than if they have to pick from only six jam varieties.26 Such decision paralysis, or analysis paralysis as it’s sometimes known, places a clear boundary on the value of variety. Offer enough to trigger mindful deliberation, but not so much that the mind shuts down.
Although menu option variety sometimes serves us well, the belief that variety per se is a good thing often proves detrimental.
Chances are that while you and your dining companions are glancing at your menus, something like the following conversation will take place:
You: What do you think looks good?
Dining companion 1: The duck risotto looks pretty tasty.
Dining companion 2: Are you going to get the risotto? I was going to get that.
You: Well, why don’t you get the duck and I’ll get the lasagne . . .
This sort of thing happens all the time – diners attempt to distribute variety around the table. Something seems not quite right about everyone ordering the same dish. This rests on the idea that variety is a good thing – the spice of the dinner table. But this kind of variety, the kind distributed across diners at the same table, often backfires.
Dan Ariely of Duke University ran a clever field study that explored the impact of such variety on food preferences.27 (Beer preferences, actually, but let’s not get pedantic.) With Jonathan Levav, a professor at Columbia, Ariely went undercover in a beer hall in Chapel Hill, North Carolina. Dressed as waiters, Ariely and Levav approached groups of customers and offered them free beer samples. Four samples were on offer: an amber ale, a lager, a pale ale and a wheat ale. For some tables, customers were provided with these options and were then asked which beer they would like. Customers then proceeded to order, sequentially and out loud, around the table. For other tables, however, Ariely and Levav gave each customer a small menu and asked them to silently write down their preferred beer. After all the samples were served, customers were asked to rate how much they enjoyed their beverages.
What Ariely and Levav discovered should be heeded next time you’re out dining with a group of friends: customers who ordered out loud opted for more variety, but most people at the table enjoyed their beers less than those who ordered silently.
It’s pretty clear what’s going on here. Imagine it: you see the list of samples and you think, ‘The lager sounds pretty good, I’ll get that.’ When ordering in silence, you simply write down this preference, and you get what you wanted. But when ordering out loud, your friend might order the lager before you, and so you decide, in the name of variety, to get something else instead. As a result, you end up drinking something that you didn’t really want.
So the pursuit of variety distributed across diners may not be so sensible. But what about variety in one’s own food choice? Well, again, it depends.
Let’s go back to the Italian restaurant. Forget about your dining companions for the moment and focus just on what you want to order.
You’re in the mood for an appetizer and an entrée and you notice that the restaurant does appetizer and main serving sizes of the duck risotto, which happens to be your favourite. Despite your inordinate love of gelatinous rice-based dishes, you probably wouldn’t order risotto for both first and main courses. And this would be a sensible move. With each bite of a particular dish, our pleasure diminishes. This is called hedonic adaptation or habituation. The first mouthful of the risotto tastes great, the second very good; but by the time you get to the end of the course, each bite is delivering less and less pleasure and you’re in the mood for a change. You’ve habituated.
So when consumption experiences follow close upon each other, like appetizer and entrée, variety-seeking makes sense. Switching to a different dish for the main course, the parmigiana, for example, is a wise decision – even though it’s not your favourite, it’s still more pleasurable than more of the same old, adapted-to risotto.
But let’s say we inject a sizable delay between first and main courses. In the spirit of an ultra-slow food movement, you’re invited to a restaurant for an appetizer one night and a main course a week later. You are asked to order both courses the first week – the appetizer for immediate consumption, the main to be eaten the following week. How would you order? Well, if you’re like most, you would still opt for variety: risotto tonight and parmigiana next week.
But this isn’t the optimal move.
Variety in consumption is useful when it wards off habituation. But with a long enough delay between consumption experiences, habituation is no longer a problem. The first bite of the main course of risotto next week will taste just as sweet (well, just as savoury) as the first bite of the appetizer this week. Enough time has passed that habituation doesn’t apply.
So when it comes to variety in food we need to be careful. The gluttonous gourmand who seeks a variety of culinary pleasures may be maximizing health benefits and pleasure in some circumstances, yet diminishing them in others. Variety in food, much like spice, should be used sparingly and wisely.
Ah . . . the French
In the end, understanding gluttony in the contemporary world really comes down to understanding the difference between the French and the Americans. In fact, it comes down to two crucial differences.
As hinted at earlier, French people have quite different attitudes towards food than do Americans. This difference is neatly illustrated by a couple of the questions that University of Pennsylvania psychologist Paul Rozin typically uses to measure food attitudes. Consider these:
1. Which word is most different from the other two?
BREAD PASTA SAUCE
2. ‘Fried egg’ belongs best with:
BREAKFAST CHOLESTEROL
If you’re French, you’re more likely to choose ‘bread’ for the first question and ‘breakfast’ for the second.28 These choices betray a set of culinary associations linking food to the experiential and pleasurable aspects of eating. ‘Pasta’ naturally goes with ‘sauce’ (so ‘bread’ stands out), as does ‘fried egg’ with ‘breakfast’.
Americans, on the other hand, are more likely to link ‘bread’ with ‘pasta’, drawing the carbohydrate connection, and ‘fried egg’ with ‘cholesterol’. These answers highlight the American preoccupation with the nutritional aspects of eating. For Americans, culinary experience is much less important than culinary consequences. Americans are concerned with what food does to the shape and function of their bodies. To put it simply, in France, food is about pleasure; in the United States, it’s about worry.
The other crucial difference between these countries is not attitudinal, but environmental. To appreciate the difference we can again turn to the work of Paul Rozin.
Rozin sent some members of his research team, equipped with portable digital scales, on a mission to weigh portion sizes in restaurants in Paris and Philadelphia.29 These researchers would go into a local bistro or pizza joint, pull out their scales, and weigh a typical portion of food. When Rozin analysed the results of this research, he found something quite startling: across all the restaurants studied, American servings were about 25 per cent larger. Even identical chain restaurants, which place a high value on standardization, differed between cities. Compare a McDonald’s or Pizza Hut in Philadelphia with one in Paris, and you’ll find the Philadelphian servings to be about 1.3 times as large. A large fizzy drink is not as large in Paris (530g) as it is in Philly (545g); a medium serving of French fries is 90g in the French capital, 155g in Philadelphia.
And it’s not just in restaurants that you find these differences. Larger portions are built into every corner of the US food environment. Prepared foods from American supermarkets have portion sizes about 1.4 times as large as those in French supermarkets. Even American cookbooks tend to specify serving sizes about 25 per cent larger than their French counterparts.30
Americans are living in the midst of an overwhelming abundance of convenient calories. They live in a supersized food environment. The French do not. And this makes all the difference when it comes to understanding gluttony in the modern world.
Here’s the deal: in America the attitude is ascetic, geared towards restraint and nutrition, but the environment is not. The American culinary landscape is a vista of mountainous servings of easily obtained, calorie-dense food. So despite the attitudinal drive towards health in the United States, we observe the environment-driven spread of obesity. The French, on the other hand, approach food the way Pope Gregory’s glutton would: with pleasure-seeking abandon. But the environment is structured so that these libertine tendencies are constrained. Once again, gluttony is not so simple a sin. Rather, it is a refreshing attitude towards food that can, in the wrong circumstances, lead us astray.
The equation of gluttony with obesity in the modern world has fostered a one-dimensional, puritan and boring view of food and eating. All that matters for many of us is whether the next mouthful will make our butts bigger. The French, however, have remained steadfastly multidimensional in their culinary attitudes. For the French, eating is not just shovelling fuel into one’s mouth; eating is not just eating.
For much of human history food has played an integral role in human societies – not simply as a consumption experience, but as an identity-constituting experience and a profoundly social one. Eating defines us. We are what we eat after all, both literally – proteins, water, glucose, etc. – and symbolically. Food speaks of where we come from (chow mein or tabouli), where we rank (foie gras vs. Big Mac), and what we value (vegetables or meat). Even regular old soup signals identity: chicken noodle soup = homebody; chili beef = the life of the party; New England clam chowder = the wit.31
Perhaps even more important, eating connects us with others. Food sharing is the forge of social bonds: it played a central role in the evolution of our species, binding males and females within households and cementing the larger community.32 The evening meal, which has played so central a role in human households over evolutionary history, remains an important familial glue. There is evidence to suggest that families that eat together thrive together, having better communication patterns and rearing children who perform well at school and have better psychological health.33 Even the semantics of food and society are intertwined: the word ‘companion’, for example, has its roots in the Latin ‘com’ (together) + ‘pānis’ (bread).34
The French have retained much of what is meaningful, pleasurable and social about eating. They remain true gourmands, true gluttons in Pope Gregory’s terms. They value experience, not consequence; sociality, not isolation; sensible variety, rather than plainness and monotony. And if this is what gluttony looks like, then I’d happily join the ranks of the French on their gluttonous descent into hell.