CHAPTER THREE

Food Doesn’t Have to Be So Hard

Is it just me, or are we all caught in a twilight zone of misinformation when it comes to food? First there was the low-fat diet push of the ’90s and ’00s, when everyone flocked to products like fat-free cookies and premade pancake mix with “low fat” proudly emblazoned on the packaging. Then, seemingly overnight, the ranks of the die-hard low-fat community diminished as new research suggested that all these hyper-processed low-fat foodstuffs caused sluggishness, heart disease, and pesky weight gain.

Not long after, the Paleo diet trend came and went. Premised on the notion that we should approximate the eating habits of our Paleolithic ancestors, this diet “Paleofied” and then commodified items like pasta and pancakes. A diet informed by our evolutionary history—that is, by our genetics—is the most logically sound starting point, but the movement failed to see the irony in rejecting modern bread, pasta, pancake mix, or cookie dough, only for its adherents to re-create gimmicky, supposedly “Paleo” versions of these same things. There’s a huge difference between a whole-food, Paleo-type diet based on evolutionary principles and a modern, industrialized diet created with “Paleo” ingredients.

We now inhabit a confusing and murky post-Paleo world, in which some have turned to “plant-based” or vegan alternatives, and others have embraced low-carb or even ketogenic diets. (Ketogenic diets severely restrict carbohydrate and protein intake, relying primarily on dietary fat as an energy source.) What and how we eat has become increasingly polarized and tribal—a fact that should come as no surprise given our current political landscape. The world is increasingly contentious, and the diet world is no exception. For every diet guru, there’s an anti-guru. For every early adopter, there’s a skeptic telling you a given diet is a fad, unproven, and unsafe.

This turbulent food and nutrition environment, exacerbated by social media and marketing, causes a great deal of angst among health professionals. A friend of mine, a veteran naturopathic doctor specializing in women’s health, strongly believes that women need nutrient-dense animal proteins—meat—for optimal health and function. (I couldn’t agree more.) Yet after she was publicly vilified online for saying so, she steered away from discussing the topic at all. I know of other health professionals who consume meat, starches, vegetables, and salads but refrain from discussing it on social media for fear of attacks from online lynch mobs allied with particular diets. Read that last sentence again: people eating meat, starches, and vegetables fear social judgment. Crazy times we live in.

To be sure, food tribalism afflicts a relatively small portion of the world’s human population that has the luxury of choosing its diet. Whether or not to blend Irish butter sourced from grass-fed cows into your Indonesian coffee is the epitome of a #firstworldproblem. But as we’ll discover in this chapter, our real first-world dietary problems, which have now spread throughout most of the globe, are sources of widespread hardship and disease. Whether we eat a standard American diet or adhere to one of the healthier alternatives like unprocessed Paleo or intermittent fasting, we’re still stuck on what I call chronic “summer eating” patterns. Since the agricultural revolution, when we flattened out our ancestral dietary oscillations, we’ve relied excessively on summer-style carbohydrates and long eating windows (eating early and late during our many waking hours). Abandoning food rhythmicity for continuous summer eating has produced chronic conditions like inflammation, insulin resistance, and obesity, and we all know that these things corrode our health and happiness.

How Our Ancestors Ate

For most of history, humanity has consumed diets that varied based on geographic location and available local food offerings. Our early human ancestors, who roamed the earth over two million years ago, were hunters and gatherers. Gradually, they fanned out from the equator, exposing themselves to a wide array of local foods. Certain societies ate a lot of protein and fat-rich nuts; others consumed large quantities of starchy root vegetables like sweet potatoes; still others ate whale blubber and very few carbohydrates. Then, around ten thousand to twelve thousand years ago, we collectively discovered that we could consume grass seeds (in other words, grain), and that by storing them properly we could stop migrating altogether and lead more stationary lives. We began growing grain, claiming ownership over the land to protect our annual crops. Giving up the hunter-gatherer lifestyle, we took up living in fixed settlements, choosing a narrower but less perishable range of food options.1

With a larger and more stable food supply and the ability to have more children (fueled by lots of easy carbohydrates), populations mushroomed and civilization progressed—we became better at language, developed rich cultures, and improved our survival rates. Agriculture powered such civilizational flowering, allowing us to produce more food and keep more people alive, at least long enough to give birth to the next generation. Agricultural foods began to account for ever greater shares of our caloric consumption, displacing the more varied sources of plant and animal nutrition we had enjoyed previously. Unfortunately, we are not well adapted to subsist primarily on cereal grains. They might increase our collective survival rates by allowing us to live long enough to reproduce, but they don’t keep people resilient and healthy over our longer life spans.

Mummified remains in ancient Egypt and elsewhere bear witness to the toll of grain-heavy diets: skeletal fragility, dental cavities, and maladies such as cardiovascular disease that were not widely present before the agricultural revolution.2 And yet, we weren’t about to give up on agriculture. As our sedentary ancestors expanded into towns and cities, population growth brought mounting pressure for more grain, further reinforcing our reliance on agriculture. With industrialization, our love affair with grain continued as we perfected the art of extracting the maximum amount of caloric energy from foods—much more than we need to enjoy optimal health.

From an evolutionary and ancestral health perspective, our diets shouldn’t be based on carbohydrates year-round, as they typically are in modern civilization. Rather, our carbohydrate/fat/protein intakes should vary seasonally as our food selections vary seasonally. We’ve already observed how, during the warmer months of the year, our Paleolithic ancestors would have gorged on carbohydrates relative to dietary fats and complete proteins. This summer-type pattern of eating is roughly analogous to modern vegetarian, vegan, or plant-based (though not plant-exclusive) diets because it is higher in carbohydrates and somewhat lower in fat. But our ancestors didn’t follow diet fads; they ate both plants and animals year-round based on what was available to them. Seasonal offerings simply dictated their higher-carbohydrate, lower-fat diets during the spring and summer, when relatively high-carb food sources—fruits, berries, honey—would have been most readily available for gathering. Consuming a large volume of nutrient-dense foods like these had metabolic and nutritional advantages during the warmer months. Naturally occurring fruits and vegetables are rich in antioxidants, which help protect our cells from damaging chemical compounds called “free radicals” that would have been plentiful in our bodies during these months of summer stress. Compounds like polyphenols and carotenoids, which were abundant in these fresh plant foods, would have helped to protect our bodies from continual sun exposure and the oxidative stress of large amounts of physical activity.3

During the summer months, the accumulation of excess carbohydrates would likely have increased our body fat stores (and all of the summer’s frenetic energy would have also improved our metabolic efficiency, aka fitness). This was important, given that carbohydrate-rich foods would have been in shorter supply during the fall and winter. As the air grew colder, the hunting of animals would likely have taken precedence over the gathering of plant foods (when’s the last time you went berry picking in winter?). These animals, especially in the fall and early winter, would have carried greater amounts of body fat, skewing our nutritional consumption toward more complete proteins and higher fat. With higher-fat nuts and seeds also in greater supply, fall and especially winter diets would have likely contained moderate protein energy (10 to 35 percent of calories) plus relatively high-fat, low-carbohydrate, nonprotein energy.

Just like summer’s antioxidants, winter’s increased natural fat consumption would have conferred evolutionary and metabolic benefits. As we consumed more protein, and especially fat, we became more efficient at using fat for fuel, possibly even entering a special type of metabolic state called “ketosis.” Ketosis requires a near-total absence of dietary carbohydrate, with fat supplying the bulk of our energy, but some population groups who relied largely on animal foods in the winter would likely have moved into this adaptive metabolic state. An ancestral winter diet would have been somewhat analogous to the modern low-carb, high-fat or ketogenic dietary approaches, as well as to intermittent fasting protocols (given the relatively small daylight window that we had to gather and consume food in the winter). Going into winter ketosis would have helped us offset the metabolic effects of eating so many summer carbohydrates, while also building flexibility into our metabolic pathways. That metabolic flexibility would have given us a better chance of adapting to diverse nutritional environments. More adaptable organisms are better equipped to survive changing environments.

I have purposefully used terms like “relative” and “likely” throughout my cautious summary of hunter-gatherer eating patterns. Critics have attacked the Paleo diet movement and its offshoots for offering up simplistic narratives of ancestral life. I don’t pretend to know everything our ancestors did and ate in great detail. My main point concerns the relative shifts in food consumption (and thus nutrient intake) that might have occurred over the seasons—perhaps only a few percentage points of energy intake in either direction. In the wider context of the seasonal environment, even such modest shifts would have been substantial enough to alter downstream physiological processes. It would have been advantageous for our ancestors to utilize diverse metabolic pathways, enabling their bodies to thrive on antioxidant-rich, high-carbohydrate foods in one season, while using primarily fat to power their metabolisms in the colder months. That said, I don’t wish to perpetuate Paleo fantasies and say this was exactly how our ancestors ate—we simply lack the historical evidence to make definitive claims. My point is that for most of human history, in most parts of the world, and in most cultures, our diets varied seasonally, oscillating just as our seasonal sleep patterns, modes of social interaction, and movement patterns did, and that we would all be better off if we honored some of the historical rhythms that our bodies expect.

The Good and Bad of “Fad” Diets

In other books, I’ve extracted lessons from this brief sketch of civilization and the flattening of diverse eating patterns, suggesting which foods we should eat and which we should avoid. Let’s now take the discussion a step further and address larger lessons about how and in what context we should eat. I’ve gently criticized the Paleo, keto, vegan, and other plant-based protocols for their faddishness or lack of sound supportive research. Although these diets may not be optimal for our long-term health, they can all confer benefits over the short term for one simple reason: all of them serve to dislodge us from something much worse—the mainstream, highly processed, grain-based diet on which most of us are stuck.

Nutritionists call this mainstream approach to eating the standard American diet, or SAD. Given the ultra-processed Frankenfoods that feature prominently in this diet, SAD couldn’t be more apt. In inflammatory, nutrient-poor pseudo-foods like cookies, chips, and microwave dinners, industrial processing has done all the hard digestive work for us, leaving us with food products that are effectively predigested (they’ve had their cellular structure largely removed or heavily altered), soft (don’t require a lot of chewing, or worse, are in liquid form), and extremely energy-dense (contain fats and sugars in amounts well beyond what humans would normally consume). The presence of this excess energy, combined with the inability of this Frankenfood to suppress our hunger and truly satisfy us, leads us to regularly overeat and store the excess energy as fat. These so-called “foods” deliver us calories for survival, but they don’t nourish us in a deep, satiating way. By leaving us without an intrinsic sense of rhythmic eating or of deep nourishment, they induce us to follow our stress-driven and reward pathways for carbohydrates, and especially sugar.

Veganism, vegetarianism, and other plant-based diets represent a marked improvement over the standard American diet. By prompting us to avoid or minimize animal protein, these diets boost the amount of whole-plant food we consume, increasing our intake of many nutrients and improving our gut health. Like standard American diets, however, plant-based diets allow people to survive, but not necessarily to thrive. That’s because plant-based diets make it difficult for us to meet our complete dietary protein needs. Once again, context matters. If you aren’t physically active, a plant-based diet might give you enough protein. But plant-based diets tend to rely heavily on cereal grains and legumes, either because they’re convenient and inexpensive or because they deliver some dietary protein.

Most of us were raised to believe that whole grains are important pillars in our diet, furnishing “complex carbohydrates” to power us through a workout or an afternoon at the office. Unfortunately, beans, lentils, barley, wheat, and rye—not to mention pizza, pasta, and bread—are nutrient-poor foods that contain problematic compounds like lectins, which inhibit nutrient absorption and proper digestion and can cause inflammation in the gut. Additionally, foods like barley, wheat, and rye contain a type of protein called gluten that can cause intestinal problems and inflammation. Such symptoms can be severely detrimental to people with celiac disease or non-celiac gluten sensitivity, but they likely inflict some degree of damage on most of us, increasing systemic inflammation.4

Eating low-fat vegetables and fruit all year long sounds healthy and virtuous, but this summer diet is a poor selection in midwinter, when the chilling winds, ice, and snow (as well as the larger number of hours of darkness) leave us wanting something heartier, which typically means warm and calorie dense. In the winter, we don’t require as many antioxidant-rich fresh plant foods, and naturally gravitate toward the higher-fat foods and complete proteins that help improve our ability to use fat as a fuel source. Not to mention: if you have snow on the ground where you live, that fruit and arugula would have racked up significant miles getting to you from warmer climes, decreasing their nutritional potency and placing undue strain on the physical environment. Bottom line: I want you to be strong, robust, and resilient, and I therefore can’t recommend permanent plant-based eating.

As this book goes to press, low-carb and the more extreme ketogenic diets have surged in popularity. Although some research supports the metabolic merits of ketogenic eating, many people come to it simply because our conventional dietary approaches are failing, and they’re desperately seeking alternatives. Carb restriction addresses our society’s endless summer diet that overemphasizes carbohydrates and causes insulin resistance, chronic inflammation, excessive deposition of body fats, and other diseases of civilization. Instead of having us eat too many carbs, low-carb or keto diets veer to the opposite extreme, sticking to the proteins and fats that approximate what our ancestors ate in the fall or winter months.

Ketogenic diets consist mainly of healthy dietary fats, like avocados, nuts, and fatty animal proteins, while keeping protein intake moderate and carbohydrate consumption to an absolute minimum. Keto’s underlying rationale is sound. When we perpetually supply carbohydrates, our bodies predominantly burn glucose (sugar); if we limit carbohydrate and protein (some of which can be converted into glucose), our bodies adapt and become considerably more efficient at using fat as a fuel source, including producing special molecules called “ketone bodies” that our brains can use as a partial fuel source too. By abandoning chronic summer’s carbohydrates, ketogenic dieters have unknowingly reaped the metabolic benefits of moving into a different season. They’ve abandoned endless summer and embraced the more stark and wondrous beauty of winter. A ketogenic diet is effectively a corrective strategy. But like plant-based diets, carb-restricted or ultra-low-carb ketogenic diets are not optimal ways to eat long-term. They are ideal for winter—and winter only. See the pattern here? Different seasonal modes of eating confer different benefits, all in their correct time, but none of them are truly optimal as a fixed way of eating long-term.

Something similar holds for another popular dietary approach, intermittent fasting. In the ancient past, we mostly ate when it was light out because the food we hunted and foraged was visible. The time convenient for us to eat varied seasonally as well. In the summertime, when we would rise early and retire late, we had many hours to chow down. In the winter, the shorter days narrowed our feeding windows. So once again, given the prevalence of chronic summer habits, it makes sense that people seeking solutions to chronic summer’s harm might gravitate to fasting protocols. When people restrict their feeding windows (the hours that they choose to eat within), they move away from summer’s very wide feeding window. Like low-carb/ketogenic approaches, such a lifestyle is corrective—an unwitting nod to our long-neglected winter eating patterns. Yet problems arise when we take the very wide feeding window of summer, or the very reduced one of winter, and apply it every month of every year for decades. That merely substitutes endless winter eating for chronic summer. We should oscillate in and out of both patterns over the course of the year. Both are good.

Unfortunately, many people embark on fasting protocols out of season. They usually start during the spring or summer months (often to get lean or “beach ready”), even though fasting is better suited to winter, with its naturally shorter days and the shorter eating windows they gave our hunter-gatherer ancestors. Among habitual fasters I’ve also observed a tendency to delay eating after waking and not break the fast until much later in the day, often after midday (something referred to as late time-restricted feeding). As we’ll explore below, this strategy defies research suggesting that early time-restricted feeding (eating early in the morning, and fasting later in the day) is superior in terms of metabolic benefits.

If you search for any of these diets on social media, you’ll see countless people testifying to their wonders. These diets have lowered fasting blood sugar levels, helped reverse diabetes, cleared up acne, and even helped people manage chronic pain. They’ve given people more energy and an emotional kick start, enabling them to leave toxic relationships, open new businesses, or approach life with more zest. But if we were to conduct further investigation, we’d likely discover that these diets worked well for a season or two. The benefits of these adaptations don’t last indefinitely.

People aren’t as inclined to post dramatic confessions about how their dietary choices have failed to produce results after an initial success. This lack of long-term “results” often leads people to shift strategies, or to double down on their current strategy, potentially leading them farther down a maladaptive path. If, for example, your low-carb diet isn’t working as you’d hoped (perhaps because it’s the right diet at the wrong time), you might decide to intensify your efforts by going even lower carbohydrate, or perhaps keto.

Keto does its best work as a corrective counterpoint to chronic summer, in which we constantly consume and crave. If we can manage to resist all of those tempting carbohydrates on offer in modern society, and instead consume almost exclusively fats and some protein, our bodies go into ketosis, which means they begin burning fat and ketones, instead of glucose, for energy. Ketosis represents a significant break from our glucose-driven chronic summer eating patterns. The problem, of course, is that dieters who see benefits on ketogenic diets conclude that keto is a permanent, year-round solution. Many online forums feature people measuring their ketone levels by urinating on ketone strips, fretting about the concentrations of ketones in their blood, and whether that chicken breast with lunch kicked them out of ketosis (rendering them a dreaded “glucose burner” again). Before you know it, people are afraid to eat even healthy foods and become more restrictive as their health begins to deteriorate.

Popular Diets: More Similar than Different?

Such anxiety is really a shame because, upon closer examination, all of these diets converge on a similar pattern. Take, for example, a healthy low-carb meal. What does that look like on a plate? Probably something along the lines of a hand-sized portion of protein plus some nonstarchy vegetables. In such a meal the total carbohydrate is low, but the total fat may be higher (or at least higher than has been recommended over recent years). How about a healthy Paleo meal? On the plate, such a meal will represent something like a hand-sized portion of protein (typically animal protein) plus some nonstarchy vegetables. It may also contain some starchy root vegetables, but it is still likely to be lower in total carbohydrates than a typical Western or standard American diet meal. A keto diet? A higher-fat, very low-carb variant of the low-carb meat and vegetables. A vegan diet? If done right, plant-based protein sources plus vegetables. Carnivore diet? Protein sans vegetables. The Mediterranean diet? Protein with an emphasis on fish over red meat, plus vegetables. Intermittent fasting? When you do eat, you are encouraged to eat a good source of protein plus vegetables. Even the best versions of the old-school low-fat diets were … wait for it … protein plus vegetables. Think skinless chicken breasts plus steamed broccoli.

In addition to being similar, all of these diets boast similar benefits. When we consume whole foods instead of the preprocessed staples of the standard American diet, our digestive systems must work hard to break down cellular structures in order to extract a food’s nutritional value (from the chewing and grinding of our teeth, to the acid bath of our stomach, to the enzymatic breakdown in our intestine). Such exertion limits the resulting payload of amino acids, fatty acids, and sugars entering our bloodstream, with the process taking enough time so that our hunger can naturally subside as we experience “satiation” (the desire to no longer eat). These diets also help promote a healthy gut microbiome. The human “microbiome” refers to the microbes, like bacteria, fungi, and yeasts, that inhabit our gut and help us do everything from regulate our immune system to metabolize food.5 The concentrated fats and sugars contained in processed foods not only overwhelm our system directly, but they also skew the balance of our microbiome. Improving the gut microbiome, by contrast, can help reduce the risk of cancer, obesity, cognitive problems, and metabolic dysregulation.6

Although we have much to learn about the nature of the microbiome, shifting our eating away from processed food products and toward whole foods serves to drain the swamp. It decreases the numbers and types of bacteria that may, for example, promote inflammation in the gut, and creates a better balance of more desirable types that work more in concert with our immune system. Conventional dietary recommendations suggest we should consume more fiber in our diet, but I believe that “fiber” is just a proxy for the consumption of whole foods, of plant or animal origin, that have their original cellular structure either fully or mostly intact at the time we begin to ingest them. Consuming such whole foods improves both our gut health and total body wellness, given the important symbiotic relationship between human beings and their microbiomes.

Although the popular diets I’ve mentioned reduce the amount of ultra-processed foods we’re eating and help improve our gut microbiome populations, they bring a number of disadvantages. Far too often, such diets, and the dietary tribes to which they give rise, define themselves in negative terms—emphasizing what they’re restricting. If you’re on any form of low-carb diet, your identity is carbohydrate restriction. If you eat a low-fat diet, you’re all about fat restriction. If you eat a plant-based or vegan diet, your dietary identity centers on animal protein elimination. If you’re an intermittent faster, you restrict all food outside certain windows of time. These diet camps, in turn, believe that those failing to adopt their respective restrictions will experience dire health outcomes. The plant-based tribes say: eating too much protein or too many calories will shorten your life span. The keto enthusiasts claim that eating carbs produces toxic by-products, so we must restrict all sugar. The intermittent fasters say making our bodies constantly metabolize food diverts our organs from performing other vital tasks, so we must eat less often. Popular nutritional frameworks have become a fearmongering case of “restrict (something) or die.”

Conversely, advocates of these diets often attribute their success to whatever restriction the individual or group embraces. A low-fat diet that cuts out cakes, bagels, muffins, and many other calorie-dense, ultra-processed foods, and replaces them with protein and fiber-rich plants, supposedly works because it reduces an individual’s caloric intake by cutting back the most calorie-dense macronutrient—fat. Low-carb diets also work, the logic goes, because as carbohydrate calories fall, so do insulin levels, promoting fat-burning rather than fat storage. Paleo works because it cuts processed grains, sugars, and fats. Veganism (again, when done well) also removes many processed foods (because such foods often contain ingredients such as milk or butter). No matter what diet we champion, we’ll generally fly the flag for that diet based on what it primarily limits.

Still, each diet essentially represents a short-term, partial solution to chronic summer—an oscillation away from chronic sugar and carbohydrate intake to either a more moderate or “winter” mode of eating. This is the fundamental problem with the diet industry. It proposes short-term approaches that people interpret as global, permanent “solutions.” In reality, these diets happen to provide partial corrective strategies that compensate for our chronic summer eating.

Eat Seasonally to Get More Protein

For the last decade of my life, and following numerous “what do you eat?” requests, I’ve defined my own eating with the following sixty-second elevator pitch, emphasizing seasonal oscillation:

I eat naturally occurring and minimally processed foods like meats, eggs, vegetables, and fruits. I choose these whole, nutrient-dense foods over packaged, processed foods, which are often nutrient poor but calorie dense. Food quality is important to me—a concept that includes where my food comes from (local), how it was raised or grown (humanely; organic), and what its overall environmental impact is. I aim for well-balanced nutrition, so I eat a diet of predominantly unprocessed plant-based foods, anchored by appropriate amounts of quality animal-based protein foods. This balanced combination of plants and animals provides me with all the nutrients I need, including all the proteins, carbohydrates, and fats naturally inherent in these food groups. How I eat—the social and cultural aspects of food and nutrition—is just as important to me as what I eat.7

I deliberately crafted this pitch to show a very inclusive diet, and to signal that I make conscious decisions regarding the likes of animal welfare, as well as the environmental, social, and cultural impacts of my food choices. In fairness, I restrict as well, as I generally avoid ultra-processed and processed foods. Such foods negatively impact my health, and because they contain cheaply made foodstuffs (like corn derivatives), lack local color and cultural significance, and reach my table after long voyages around the world, they also diminish our larger environmental and cultural health. Despite my best intentions to convey a broad and varied diet, people still often label my way of eating as restrictive, be it restricting carbohydrates (because I don’t eat processed carbohydrate sources like pasta) or restricting fat (because I don’t eat or drink “fat bombs” that are popular on the keto diet). The framing of such restriction usually depends on a person’s own dietary bias. People typically want me to be their dietary ally or enemy (in the latter case, usually so they can pick a food fight with me). I’ve also been accused of not restricting my diet enough. I’ve been told that I could not possibly care for animals or the environment because I consume animal protein. Somewhat perversely, by discouraging the consumption of highly processed packaged foods, I’ve also been accused of promoting disordered eating patterns.

To critics, I would point out that two important components of my personal dietary approach—protein and food timing—are often lost in nutrition discussions today. No matter our dietary tribe or philosophy, when talking food and nutrition, it’s relatively easy to focus on fats and carbohydrates, our prime nonprotein energy sources. (Alcohol is the third source here and is typically the one most people don’t want to discuss, a discussion we’ll save for another time.) There is a tendency, in the nutrition community’s long-winded and myopic debates, to either skip over protein or to dismiss it entirely. This is largely because protein energy constitutes a relatively small percentage of our total energy intake compared to the other two macronutrients. Warring dietary factions debate the merits of carbohydrates versus fat, as protein gets lost in the shuffle. Also, protein intake has remained relatively stable during the obesity epidemic’s unfolding, and so hasn’t triggered significant interest from researchers, clinicians, and practitioners.

There is, however, a growing body of research indicating that the amount of energy we obtain from protein determines our total caloric intake. The Protein Leverage Hypothesis, as this theory is known among experts, has been corroborated by a substantial body of evidence. It suggests that metabolically healthy humans actively regulate their macronutrient intake, and in so doing, prioritize protein energy over nonprotein energy (fat, carbohydrate, alcohol).8 If our diet contains low absolute amounts of protein (and perhaps even of the specific amino acids from which proteins are synthesized), our appetite increases in an attempt to drive us to seek more nutrients (including proteins), leading to the overconsumption of nonprotein energy as we attempt to fulfill our protein energy needs.

In other words, our protein requirements are such that we continue to eat until we reach our target protein intake, at which point our appetite and hunger subside. We’ll eat whatever is around until we get enough protein. But in a world where the cheapest, most readily accessible foods are generally low in protein, or the protein they contain is not readily digestible and absorbable (such as wheat proteins), we often struggle to hit that target, in the process overconsuming nonprotein calories. In a review of thirty-eight published experimental trials measuring how much protein people consumed when they could eat as much as they wanted, total protein intake was inversely related to total calories consumed. The greater the amount of protein people ate, the less total energy they consumed, regardless of their fat and carbohydrate intake.9

A 2011 University of Sydney study tested subjects on diets containing either 10, 15, or 25 percent energy from protein. The results showed that lowering the protein energy from 15 to 10 percent significantly increased the total energy (made up of nonprotein energy) people consumed, primarily because they were snacking between meals.10 The quality and composition of those snack foods? Processed foods rich in sugars, refined carbohydrate flours, and inflammatory industrial seed oils (“vegetable oils”). The trend is clear: if we consume less protein, we tend to increase our nonprotein energy intake, something that may predispose us to chronic overconsumption.11
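To see why diluting dietary protein can pull total energy intake upward, it helps to run the arithmetic. The sketch below is a minimal back-of-the-envelope model, not the study’s methodology: it assumes a hypothetical fixed daily protein target of 90 grams (both the target and the idea that we eat until we hit it exactly are simplifying assumptions) and asks how many total calories someone would need to consume to reach that target at each protein percentage.

```python
# A minimal back-of-the-envelope sketch of the Protein Leverage Hypothesis.
# Assumptions (illustrative only): a person keeps eating until they reach a
# fixed daily protein target of 90 g, and protein supplies 4 kcal per gram.
PROTEIN_TARGET_G = 90
KCAL_PER_G_PROTEIN = 4

def energy_needed(protein_share: float) -> float:
    """Total kcal required to hit the protein target when the overall diet
    supplies `protein_share` of its energy as protein (0.10 = 10 percent)."""
    protein_kcal = PROTEIN_TARGET_G * KCAL_PER_G_PROTEIN
    return protein_kcal / protein_share

for share in (0.10, 0.15, 0.25):  # the three diets in the Sydney study
    total_kcal = energy_needed(share)
    nonprotein_kcal = total_kcal - PROTEIN_TARGET_G * KCAL_PER_G_PROTEIN
    print(f"{share:.0%} protein diet: ~{total_kcal:,.0f} kcal total, "
          f"~{nonprotein_kcal:,.0f} kcal of it nonprotein energy")
```

Under these assumptions, moving from a 25 percent protein diet to a 10 percent protein diet more than doubles the calories required to satisfy the same protein target, and the surplus arrives almost entirely as nonprotein energy. That is the “leverage” the hypothesis describes.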

Many health authorities don’t know about this research and continue to recommend lower amounts of protein energy intake, primarily to limit meat consumption. That makes sense, particularly as the world grows sensitive to the impacts of agricultural greenhouse gas emissions and climate change. But pushing people toward lower-quality plant-based proteins, such as those from wheat or legumes, and lower amounts of protein in general, seems counterproductive if it results in more hungry people who will overconsume nonprotein energy to quell that hunger. If we truly worry about our global carbon footprint and feeding the world’s ever-growing population, we should start limiting low-nutrient, nonessential foods first. Australian research estimates that about 27 percent of the food-related greenhouse gas emission footprint in that country comes from the production and consumption of “noncore” foods (fast food, candies, and the like).12 In North America, this number is probably much higher. Demonizing high-quality protein foods leads to higher junk food consumption, ultimately compromising human health and the environment.

Consider the experience of Sarah, a client of mine. A busy young twentysomething, Sarah began most days with a standard bowl of cereal and milk before leaving for work. She wasn’t much of a morning person, so she slept in as late as possible. By the time she showered, dressed, and did her hair and makeup, she had exactly three minutes and forty-seven seconds to eat breakfast before running out the door in a flustered rush. Not long after arriving at work, Sarah started to feel a bit hungry and contemplated what she would have at her midmorning break. If she was “being good,” she would grab a piece of fruit, or maybe a low-fat yogurt. More often than not, she craved something a bit sweeter, and a nearby coffee shop was always happy to oblige. She ate lunch a few hours after this sugary snack, usually just a salad or a sandwich, sometimes a juice, but nothing particularly substantial.

By early afternoon, Sarah’s appetite would explode. Having endured the postlunch energy dip, when she struggled to keep her eyes open and focus, her appetite would kick in aggressively. Quite often Sarah would claw at the office vending machine in a frenzied state of “hangriness” (hungry + angry), unable to decide if she wanted something sweet, savory, or both.

Arriving home late, tired, and hungry (again), Sarah would often eat a large meal, certainly her largest of the day—anything from take-out Mexican, Thai, or Indian that she picked up on the way home to a frozen pizza or mac ’n’ cheese meal. Either way, dinner was usually heavy on the carbs (mostly from rice or pasta) and meager on protein.

Because Sarah ate little to no quality protein throughout the day, she usually got hungrier as the day progressed. The appetite center in her brain continually prompted her, in vain, to seek foods that might satisfy the protein her body desperately needed. In the absence of sourcing this protein from her diet, Sarah’s body likely harvested it from the only real reservoir of protein in her body: her muscles and connective tissues. Meanwhile, she always thought about food, scrolling through Instagram’s “food porn” photos, and asking friends and colleagues what they planned to have for dinner that night. Because of her workplace’s physical and social food environment, and because she always felt stressed and rushed, she made consistently poor food choices.

Research suggests that the distribution of protein across the day is as important as the overall amount. The protein Sarah did consume was typically compressed into one meal and drowned out with foods (high-carb, high-sugar, and salty foods in particular) that blunted the satiety signal that protein typically provides. In general, it is better to eat three thirty-gram protein meals spread evenly across the day than a ten-gram protein meal at breakfast, a twenty-gram protein meal for lunch, and a large sixty-gram protein meal for dinner. The even distribution across three meals is also better than multiple, ten-gram “grazing”-type meals spread over the day. The smaller protein amounts aren’t enough to trigger proper satiety levels, and so it becomes too easy to overconsume nonprotein energy across the day.
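To make the distribution point concrete, here is a minimal sketch. It assumes, purely for illustration, a per-meal threshold of roughly 25 grams of protein needed to produce a meaningful satiety response; the real threshold varies from person to person and is not a figure drawn from the research cited above. Both hypothetical days below contain the same 90 grams of total protein.

```python
# Illustrative comparison of two days with the SAME total protein (90 g),
# distributed differently across meals. The 25 g "effective dose" threshold
# is an assumption for illustration, not a clinical cutoff.
MEAL_THRESHOLD_G = 25

even_day = {"breakfast": 30, "lunch": 30, "dinner": 30}
skewed_day = {"breakfast": 10, "lunch": 20, "dinner": 60}

def summarize(day: dict) -> str:
    total = sum(day.values())
    satiating = [meal for meal, grams in day.items() if grams >= MEAL_THRESHOLD_G]
    return (f"{total} g total; meals at or above {MEAL_THRESHOLD_G} g: "
            f"{', '.join(satiating) or 'none'}")

print("Even distribution:  ", summarize(even_day))    # all three meals qualify
print("Skewed distribution:", summarize(skewed_day))  # only dinner qualifies
```

The totals match, but in the skewed day only dinner crosses the assumed threshold, mirroring the pattern described above: small protein doses through the morning and afternoon, then one large dose at night.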

The large protein-rich dinners typical of North American society have led to the oft-repeated narrative that everyone already eats too much protein. If you were to focus on just one meal of the day, examine the total protein content of that meal, and ignore the satiety-disrupting effects of the highly processed sugars, carbs, fats, and salt that normally sit alongside this protein (think: a couple of burgers, fries, and cola), it is easy to arrive at this conclusion. In fact, many individuals underconsume high-quality protein, fail to distribute it evenly across the day, and, obeying their own powerful hunger signals (and/or lack of satiety signaling), overconsume nonprotein energy. It’s high time we shine a light on protein and give it pride of place within our oscillating dietary framework.

Watch When You Eat

The problems associated with the common eating patterns of the near-globalized standard American diet aren’t limited to the macronutrients we eat, like protein. We must also look at when we eat, as well as our context for eating. Many people mistakenly regard food simply as fuel, nothing more than calories and micronutrients to be balanced, varied, and moderated. But alongside nourishment, food also provides our body with information about the environment, synchronizes us with the days and seasons, and, more important, binds us to one another and to the physical environment from which that food derives.

When it comes to our circadian rhythm, food and eating, much like light, can be a powerful zeitgeber—an external environmental cue that helps to govern our natural rhythms, both those pertaining to the twenty-four-hour light/dark cycle and those related to the seasonal changes over a year. Without such external cues, our body’s master clock (which itself “free-runs” over a slightly longer time period than the twenty-four-hour rotation of the earth) would progressively disconnect from the natural light/dark cycle of the local environments we inhabit.

We’ve observed how the light and dark signals of sunrise and sunset help set our circadian rhythm, and how insufficient and excessive bright light during the day disturb these rhythms. But other regular features of our days also help to synchronize and maintain this rhythm, including eating meals at regular times during daylight hours. Those last three words are key. Although modern society has normalized eating well into the night, such late evening feeding likely impairs our metabolic health and promotes weight gain, even when eating an otherwise balanced diet. When we eat matters a lot more than most of us recognize.

Research on eating and circadian rhythm biology at the Salk Institute found that human metabolic health—everything from the physical motility of our digestive system to enzyme release to our hormonal responses to eating—is best when we confine eating to an eight-to-twelve-hour window, starting in the early morning and ending in early evening (relative to sunrise and sunset).13 In research circles, this pattern is called “early time-restricted feeding.”14 When examining modern, real-world patterns, however, the same research group found that the average person eats over a fifteen-hour (or longer) period each day, often grazing on foods and drinks high in sugar, fat, salt, and, oftentimes, alcohol.15

Leaving aside the qualitative aspects of the foods and drinks we consume over such an extended period, consuming what is often the largest part of our total energy intake for the day over the period spanning late afternoon to just prior to bedtime (and sometimes even waking up in the night to eat) conflicts with our fundamental physiology. Recall from earlier in this book that we have a distinct nighttime physiology. Under the influence of melatonin, our bodily processes slow down, preparing us for the tissue and organ repair that accompany sleep. Under these conditions, our bodies simply can’t deal with an onslaught of calories to process. Forcing them to do so leads to a variety of negative health consequences.

Consider glucose. You can think of glucose as operating in the opposite fashion from melatonin. The early daylight suppresses melatonin production, which only kicks in in a big way—ideally—at dusk, when the absence of natural light cues the secretion of the darkness hormone. Our ability to handle the glucose coursing through our bodies follows the opposite pattern, gradually declining as the day wears on. The research is unambiguous: our bodies handle glucose better earlier in the day than in the evening or night.16 Eating large meals at night (or many smaller meals combined with extended snacking), especially those rich in carbohydrates, can lead to increased circulating glucose (and increased insulin) at a time when the body’s ability to healthfully and easily deal with both is impaired. In experiments where food intake is matched across subjects and only the timing of the meals changes (early morning versus late evening), subjects eating later in the day experienced higher blood pressure, worse blood glucose control, and increased body fat. Such changes, played out day after day, serve as precursors to the likes of cardiovascular disease, diabetes, and some cancers.

Just as our brain’s master clock governs our physiology in response to light and dark signals, each of our internal organs comes equipped with its own internal clock (synchronized with the master clock) that regulates its daily cycle of activity. Within the gut, this clock controls everything from the flow of digestive enzymes and stomach acid, to the absorption of nutrients, to the elimination of waste. Eating at the same time of day, every day, helps to normalize these processes.

This synchronization with our master clock, set by light exposure, played a critical role in helping our ancestral bodies regulate themselves over the seasons. As the light signal diminished in strength (at temperate latitudes, at least), moving from summer into fall and eventually the depths of winter, the length of time our digestive system would be “on” gradually decreased. This makes a lot of sense given that food was less abundant during the winter months relative to spring and summer. There’s no point in having a system hungry and ready to consume if no appreciable amount of food is forthcoming. Not to mention that the digestive system requires energy and other resources to function (think of all the enzymes, hormones, bile, acid, and so on it produces).

Our digestive physiology, like many other systems in our body, is linked to the light, and is seemingly at its most efficient when we eat most of our food early in the day, and wind down our energy intake toward the early evening. In sum, the research in this area confirms the old adage of eating breakfast like a king, lunch like a prince, and dinner like a pauper.

As light intensity and day length ebb and flow across the year with the changing seasons, so, too, does the feeding window over which it is best for us to eat: longer in summer, shorter in winter. Factor in what should be the natural variations in food availability over the seasons, and you have the makings of a dietary pattern that keeps everyone happy. You have periods of high vegetable and fruit consumption, periods of lower-carbohydrate, higher-fat consumption, periods of extended fasting, a bigger variation in protein sources across the course of a year, and eating patterns matched to our ever-shifting circadian rhythms. In other words, our ancestors did practice intermittent fasting—they just did so in winter. They ate low carb—sometimes. They were “plant-based”—sometimes.

Rhythmicity in eating might all sound good in theory, but it doesn’t fit neatly into how we behave in modern society. Our pattern of continuous energy consumption over so many hours is exactly the strategy many animals deploy in the height of summer and into fall in order to increase their body fat levels for the next winter. As a child, I saw this in the black bears that often showed up on my parents’ property to forage for everything and anything to eat as they sensed the cold winter approaching. This problem is compounded by the fact that most modern people, most of the time, consume most of their energy from main meals (both protein and nonprotein) in the evening, often well after sunset, leading to a mismatch in circadian rhythm signals. If that isn’t bad enough, recall that many people spend their days in relatively low light conditions (“winter light”). Talk about confusing your body with inconsistent—or incoherent—messages!

People keep chronic summer hours, spend time in chronic winter light, and get chronic summer sleep, all while chronically following a summer eating schedule—all irrespective of what the actual season happens to be. From a circadian biology standpoint alone, this is a perfect recipe for metabolic disorders and other adverse health consequences. I don’t believe that most people consciously choose this lifestyle. They simply participate in the environment that has been built around them. They must go to work early and often return home late. They don’t control their daytime light exposure, and didn’t curate the menu of cheap, hyperpalatable carbohydrate-dense food that surrounds them. It isn’t easy to opt out of all of this, even when we’re well educated and well resourced, but it’s possible. Many dietary strategies already have people thinking about eating locally, reducing food waste and packaging, and ensuring that the foods they source are raised and grown as healthfully, ethically, and sustainably as possible. These are laudable goals, and they are important. Now we just need to add seasonal rhythms and circadian biology to the mix. Don’t worry, this all comes together with a lot more ease than you might think.

Make It a Family Affair

Even if we pay attention to what and when we are eating (no small feat!), we often neglect one of the most important aspects of our modern food and nutrition landscape: whom we are chowing down with. Go to Europe, South America, or Asia, and you’ll notice a striking difference in food consumption. In these regions, broadly speaking, people rarely eat on the move, and they eat alone a lot less often than in North America. You’ll be hard-pressed to find anyone walking down the street with a burger in one hand and a soda in the other. Meals are generally sit-down affairs, enjoyed in the company of others.

Social engagement is a hallmark of the most long-lived communities and societies around the world—the so-called Blue Zones.17 One important aspect of this social engagement is the coming together to prepare and share meals, to break bread together, as it were. Social eating is a timeless human ritual with deep evolutionary roots. Research suggests that people who eat more socially tend to be happier and more satisfied, trust others more, and interact more in their own communities.18 They even have a larger support system. Recent research shows that eating the same food with others increases the sense of trust and cooperation.19 Food literally brings us closer to friends and strangers alike.

Eating together might have evolved as a way for us to connect and bond socially. And yet, Americans rarely eat together anymore. Food journalist Michael Pollan estimates that Americans consume at least one in every five meals in their cars.20 Almost 40 percent of Americans (36.6 percent, to be precise) eat at least one fast-food meal every single day—ultra-processed foods specifically designed to be eaten with one hand, on the go.21 As I’ve observed in my consulting practice, many American families don’t prioritize sharing meals together at all, and often only eat together a couple of times a week.22 Even when families, couples, and friends do eat together, it is often in a state of distraction, in front of the television, or worse, with smartphones and video games present. More people are also eating alone, compounding any feelings of isolation and loneliness they already experience. I’m always saddened when I witness a food hall full of individuals eating alone, often soothing their loneliness, or at least distracting themselves from it, by scrolling their phones or computers between bites.

In the more socially oriented countries of Europe and the Mediterranean region, the midday meal is often the day’s most important, with workers taking long lunch breaks. In such countries, it is often culturally frowned upon to eat in a rush and while on the go. The French, for instance, tend to eat together as a household more regularly and to follow a regular pattern of three meals a day.

In North America, Britain, and Australia, where society is more individualistic and convenience often dictates food choices, you see a greater consumption of energy-dense, highly processed fast foods and snacks. You also see a more entrenched culture of meal skipping. Yet the health authorities of these very same countries revere the Mediterranean diet as the model for how we all should eat. In so doing, they homogenize and distill the distinct and varied diets of the Mediterranean region to a few pithy, sound bite–like recommendations (eat fish and olive oil, they say, or nuts, or certain spices) without ever properly addressing the wider culture and context of eating in these regions. Mediterranean peoples eat simply, regularly, together, in a state of relaxation, without digital distraction, and largely without deliberate restriction. That’s a diet I can wholeheartedly endorse.

Accessing Our Intuition

As I’ve been arguing, we must all develop a more holistic approach to eating, starting with the food we select. If we find ourselves in the throes of chronic summer and are habituated to highly rewarding flavors and tastes, healthy or whole foods will have zero appeal. Before we can gain any awareness of our deep, intuitive yearning for seasonally appropriate, nutritious food, we must first extricate ourselves from the overstimulation of summer foods. If we’re used to nutrient-poor inflammatory foods, like Cheetos and Diet Coke—or even “healthy” summer foods like processed granola bars and white rice—we’ll never be drawn to salmon and summer squash.

Here’s a simple rubric: limit your diet to the food available at your local farmers’ market. Ask local farmers the most important questions about your diet: What are you making right now? What are you growing? What are you producing? If you adopt this practice of food selection, you’ll be about 75 percent of the way into seasonal eating. But be cautious. I’ve begun to see farmers’ market vendors offer foods from global suppliers and grocery stores, like strawberries in November or winter squashes in the spring. If you spot such offerings at your local market, steer clear, and make sure you only select locally grown and produced food. You might also consult my book It Starts with Food. My coauthor and I outline a diet that represents a moderate midpoint between the seasonal extremes of summer and winter. It helps readers move beyond a modern, refined agricultural diet based on grains and legumes to better food choices.

Once you’re selecting foods properly, you can begin to tap into your intuition about eating. When I first started paying attention to seasonality and food, I quickly noticed that I wasn’t interested in eating fruit during the winter months. It was during the cold and dark month of January, and I was living in Maine. I’m not a picky eater, and I broadly gravitate to fruits and vegetables of all kinds. But I just didn’t feel like eating any fruit. This wasn’t based on any preconceived diet or idea about what I should eat. It was an intuition about what was good and nourishing for my body in that context.

Once you start eating seasonal offerings at your farmers’ markets or have completed a round of the Whole30 diet, you’ll start to discover such intuitions. New personal wisdom about your body will emerge. But it takes time to distinguish cravings from intuitive longings for seasonally appropriate foods. Have you ever noticed that when you experience a craving, you gravitate toward sweet, salty, and fatty foods, and that it comes on pretty strong? Sugar supplies a concentrated and almost immediately available energy source, while salt helps us balance our electrolytes. Fatty animal foods are among the most energy-dense foods we can eat, which is why every now and then you might have a hankering for a thick piece of marbled steak or short ribs. Unfortunately, processed foods have allowed us to satisfy our cravings with place-holder foods and chemical triggers—chips, cookies, MSG (which provides umami—that savory, meaty taste), and the like—that don’t nourish or satiate us. If you work hard to overcome the appeal of these fake foods, you might just wake up one spring morning and say to yourself, “For some reason, a fresh green salad with strawberries and marcona almonds sounds delicious today.”

I hope I’ve persuaded you that what we eat should fluctuate with the seasons. But how about our movement patterns? Here, the science isn’t as straightforward. Still, just as with food, we need to develop our intuitive capacities to switch from chronic summer movement patterns to simpler fall and winter modes. A substantial portion of the population is either stuck on chronic summer workout patterns or is completely sedentary. But as we’ll explore next, these consistent modes of exercise, or lack thereof, have compromised our bodily integrity, our health, and our longevity. In order to be fully functional and flourishing human beings, we must restore the ancestral fluctuations in our movement patterns. Too bad there’s no “farmers’ market rule” for movement!