If the notion that your very genome was built on delicacies such as mammoth meat and caribou tongue is a little hard to accept, don’t blame me. Blame your brain. More than any other factor, its demands shaped the human affinity for fat in all its natural forms. The brain and fat go together like Fred and Ginger or Starsky and Hutch. They evolved to be in perpetual relationship. And not surprisingly, the health of our brains has suffered whenever we’ve deviated from satisfying that core need.
For an evolutionary scientist, the brain is the ultimate focus of study, because the speed of its evolution and growth in size, relative to that of any other species, is breathtaking. It is the organ that more than any other defines our humanness, with our extraordinary capacity for thought, feeling, language, philosophy, culture, and art. And it is the part of ourselves that we must protect, preserve, and optimize in order to build our resilience in the toxic modern landscape: smart choices, critical thinking, and empowered action have never been more crucial than they are today.
Unfortunately, today our brain is probably more under threat than any other part of us.
When I look at the brains of clients undergoing neurofeedback, via the squiggly lines of the EEG that depict the brain’s electrical activity, I know that what underlies them are two mighty hemispheres made up largely of fat—up to 80 percent fat by dry weight. Of this, roughly half is protective saturated fat (seeing as our brain isn’t refrigerated), and the brain houses fully 25 percent of your body’s total cholesterol. The brain also has an extraordinary secret weapon, one that is unique in all of nature: the ability to rely on special units of energy from fat, called ketones, on an ongoing basis. Only human brains are capable of this.1 And what does your brain do with all that fatty composition and unique fat-based energy? It solves complex problems at a speed no computer can match, and it ponders the origins of our vast universe and our very existence.
What’s extraordinary is that we acquired these incredible neurological machines because we put down the bamboo leaves and picked up the bones (and cracked them open). Without that fundamental shift, we might have been just another potbellied primate, foraging for fourteen hours a day along the forest floor and picking the tasty bugs off our neighbors’ backs. Did our taste for meat and fat allow for that big, hungry brain? We don’t know for sure (though it is likely), but we do know that humans allocate a much higher proportion of their energy expenditure to the brain than any other mammal, including primates of our size. And by developing (relatively) carnivorous digestive tracts and robust gallbladders, we were able to meet this energy need by digesting meat and animal fat instead of plant matter, thereby ensuring a calorically dense diet—something that hours of foraging for leaves and plants could never provide.2
This theory is well established and accepted in paleoanthropology as the “expensive tissue hypothesis.”3 It posits that the increased energetic demands of a relatively large brain—the most “expensive” tissue in the body in terms of energy cost—are balanced by the reduced energy demands of a relatively small gastrointestinal tract that can efficiently process meat and fat. This digestive development, along with eating key long-chain polyunsaturated fatty acids (LC-PUFAs) that are critical to brain development, powered our cognitive quantum leap ahead of other primates. (Fatty acids are the molecular building blocks of fats; they can be absorbed into the bloodstream and used by the body.) To quote veteran ketogenic researchers George F. Cahill Jr., MD, and Richard Veech, MD, “Without this metabolic adaptation [to using fat as our primary source of fuel], H. sapiens could not have evolved such a large brain.”4
Rewind time to about two million years ago, and you would discover that our earliest ancestor of the genus Homo had a brain with a volume of about 900 cm³, two to three times the brain size of our closest primate cousin, the chimpanzee.5 By the time that ancestor, Homo erectus, had evolved into anatomically modern Homo sapiens, its brain had increased 75 percent in size and was vastly more sophisticated in functioning. This rapid encephalization—the technical term for the expansion of brain size—occurred over approximately 1.8 million years, a relatively brief span of evolutionary time.6 By contrast, the chimpanzee’s much smaller brain has scarcely changed in seven million years. This difference is largely due to our different diets, our different digestive systems, and the different ways we obtain and synthesize fats.
I like to use the chimp-human comparison because a mythology persists in some nutritional circles that we are all essentially primates—naked apes—and that because primates eat primarily plants, so should we. Not so. Sure, we share most of our genes with our chimp cousins, and there are some notable similarities, but our differences are not as negligible as some think. On the similarity side, both human and chimp brains are constructed from fatty acids. But one brain is vastly more sophisticated than the other. Why? One major factor is the specific type of fat involved.
Picture the chimp’s daily diet. It contains maybe an occasional bit of meat—mainly carrion, insects, and small animals—which provides some protein and very little fat. The bulk of the diet, however, is plants, lots and lots of them. And what the chimp does with this vegetation is impressive: she processes it inside the incredible fermentation vat of her belly (which we lack), with a large intestine that accommodates about 60 percent more volume than a human’s does. Microflora (gut bacteria) convert the plant material into short-chain saturated fatty acids. Through this long and labor-intensive process (the labor being to eat, eat, eat and for her gut bacteria to ferment, ferment, ferment), she derives up to 50 percent of her energy needs from saturated fat. Odd as it may seem, the carbohydrate in the sweet/starchy vegetation contributes just a small amount of energy.7 The whole energy-making process is pretty inefficient, to say the least, but since the chimp’s brain is fairly “cheap” (it uses only about 8 percent of her energy intake), this process suffices.
Human physiology evolved very differently. The large human brain is exceptionally hungry, or “expensive,” requiring 20 to 30 percent of the body’s available energy. These energy needs are even more pronounced in our early years: babies’ brains use as much as 85 percent of the calories they consume, and in older children the figure is 45 to 50 percent.8 To meet this need, our digestive system became highly efficient at deriving maximum energy directly from food rather than through microbial fermentation. We evolved a hydrochloric acid–based digestive system, with a longer small intestine and a shorter colon than other primates. This meant that we didn’t need to—and actually couldn’t—ferment vast amounts of plant matter to derive fatty acids and other nutrients; instead, we could bypass the lengthy fermentation process and get our prefabricated fatty acids by eating animals that had already done the converting for us, in the form of grass-fed, wild-ranging, healthy-as-it-comes meat—a process I describe below. And these fatty acids from animal meat included two complete game changers.
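To put that “expensive” label in everyday terms (a rough, illustrative calculation only): on a typical 2,000-calorie day, a brain claiming 20 to 30 percent of your energy budget burns roughly 400 to 600 calories all by itself, from an organ that accounts for a mere 2 percent or so of your body weight.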
To build a brain takes more than mere calories; it takes building blocks: a variety of specific fatty acids and a lot of cholesterol. Where chimps’ brains use predominantly omega-6 essential fatty acids derived from plant foods as their building blocks, our human brains are structured using two critical long-chain fatty acids known as arachidonic acid (AA) and docosahexaenoic acid (DHA), a key omega-3, both of which are available only from (you guessed it) animal-source foods. This pair of pre-formed, animal-source fatty acids accounts for many of the human brain’s unique cognitive capacities.9 They are the blocks that build our most advanced brain structures, such as the neocortex and prefrontal cortex, which house our higher functions. DHA in particular is the dominant storage form of omega-3s in the human brain; making up roughly a quarter of the brain’s fat, it is arguably the most critical fatty acid. We owe our human qualities in no small part to this wonder fat, which is found in meat that is 100 percent pastured and in wild-caught fish. (AA, by the way, is an elongated, animal-source omega-6 necessary for cognitive functioning, but it must be in just the right balance with sufficient omega-3s—as it is in wild and fully pastured animals.)
You might say, “But I get my omega-3s from flax oil” (or chia oil, or South American sacha inchi oil), “and the bottle says it is ‘rich in omega-3’! What’s not to love about that?” I hate to break it to you, but our brains actually can’t make much use of the type of omega-3 derived from plant oils, called alpha-linolenic acid (ALA).10 To become a building block for the brain, ALA has to be elongated through a complicated biochemical process of enzymatic conversion (for the true geeks, the substances involved are called desaturase enzymes). Unlike chimpanzees and cows, humans have a very limited ability to do this. We can convert maybe 6 percent of ALA into a substance called eicosapentaenoic acid (EPA), with a negligible amount converted onward into DHA. (Aside: Many people of northern European, Celtic, and Native American heritage cannot achieve even that much; they tend to lack the key first enzyme in the process, delta-6-desaturase, and so cannot make these conversions at all.)11
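To make the math concrete (an illustrative calculation, assuming that rough 6 percent conversion figure and a typical flaxseed oil composition of a little over half ALA): one tablespoon of flax oil, about 14 grams, supplies roughly 7 to 8 grams of ALA. At a 6 percent conversion rate, that yields at best about 450 milligrams of EPA, and essentially no DHA. A modest serving of wild-caught fish, by comparison, delivers a gram or more of pre-formed EPA and DHA with no conversion required.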
Apart from being poorly converted (if at all) to the elongated omega-3s that your brain really needs, that pricey bottle of plant-based oil, whether flax, chia, sacha inchi, perilla, or walnut, has another downside: its contents will likely oxidize before you get halfway through it, as plant-based oils are very fragile and quickly go rancid once extracted from the seed or nut and exposed to air. Should you try to cook with them, you are almost guaranteed to render them oxidized and unusable—and even dangerous to your health. Consuming even high-quality plant sources of ALA in the amounts needed for better conversion (assuming no genetic or metabolic limitations) can easily flood you with tissue-damaging free radicals. By contrast, animal foods provide delicate omega-3s side by side with robust saturated fats that protect them from oxidizing and causing harm; the saturated fats actually help the omega-3s travel safely to the places your body needs them, whether to combat inflammation or, of course, to build the brain. Nature designed them to occur together in a perfect nutritional package called 100 percent grass-fed, grass-finished meat.
Cows, sheep, and goats are designed to graze. They are classified as ruminants, a name derived from the Latin word ruminare, meaning “to chew over again.” Inside their bellies, large digestive systems comprising four stomach chambers allow them to swallow forage, later regurgitate and rechew it (called “chewing the cud”) to break it down into smaller particles, and then swallow it again, whereupon microorganisms ferment it, converting it into short-chain saturated fatty acids—mainly butyric acid, which they then use for up to 70 percent of their energy. (See, cows are fat burners too, though their brains can’t typically run on ketones like ours.) Meanwhile, they also convert ALA from the chloroplasts of green plant cells into EPA and DHA and then store these in their tissues, something our position at the top of the food chain as hunters of grass-eating animals allowed us to exploit.
The takeaway: Consuming pre-formed, elongated omega-3s from animal-source foods—from animals that ate fresh green grass and natural forage for their entire life cycle and were allowed to roam free, with plenty of sunshine—radically changed the way our brains operated and what we were capable of doing. And our primal physiology hasn’t changed today, even as the shelves of health food stores overflow with plant seed and nut oils. Fat-rich, grass-raised animal foods, abundant in DHA, EPA, and properly balanced AA (along with a plethora of other unique and critical nutrients), are nonnegotiable for your neurological system—grain-fed animals have a different fat profile, as we will discover—and the dearth of fully pastured fats in most people’s diets is a little-discussed crisis.
If you remember only one thing about this subject, remember this: if DHA is not in your diet, it’s not in your brain.
While the “expensive tissue” hypothesis is well validated, there remains a school of thought that says we owe our big brains to starchy foods (in other words, preagricultural, wild-growing tubers and roots—the prehistoric versions of yams and carrots).14 This “potato” hypothesis is predicated on a mistaken idea—one that this book refutes—that glucose, the energy from carbohydrate, is the human brain’s primary and preferred fuel. As I will show you in the pages to come, this is true only if you have conditioned it to be true, by training the brain to use glucose through a diet unnaturally high in sugar and starch. What the brain works best on, and what it needs not just to grow in size but to function at its best every single day, is the energy and nutrients from animal-sourced fat. A quick look at the flawed “potato” hypothesis illuminates a fundamental and critical idea: humans could not have evolved using plant food as a primary source of calories, even though other primates happily subsist on it, because starchy foods simply do not contain enough calories, brain-building fats, or other brain-critical nutrients to supply our demanding brain.
Vegans may not love me when I say this, but a solely plant-based diet simply goes against our evolutionary design. Any wild, fibrous, uncultivated plant tubers that our early ancestors came across would have been entirely indigestible, because when raw, their starches cannot be broken down, nor their nutrients absorbed, by the small intestine. Starches in roots and tubers become digestible only after prolonged exposure to high heat; in other words, deriving energy from them during our early evolution would have required extensive cooking. The most current data suggest that the human ability to produce fire at will emerged only well after we had evolved into modern humans,15 little more than 75,000 to perhaps 100,000 years ago.16 By this time our uniquely dramatic brain expansion had already long since occurred. Interestingly, research shows that even habitual use of fire, including in more temperate regions where plant life was likely flourishing,17 did not translate into high consumption of plant-source foods.
Perhaps more critically, digesting starch would have required ample amylase, the enzyme that breaks it down. Our species actually lacked the genes needed to produce much amylase until about two hundred thousand years ago, which was after we had already developed our big brains.18 The adaptation to significant starch consumption is not even a done deal today: modern people have highly variable capacities to digest fully cooked starches, because we differ widely in the number of amylase gene copies we carry, anywhere from two to sixteen. (And there is no correlation between more copies and larger brain size.)
Roots and tubers—cooked or not—do not supply the critical nutrients that nourish the human brain. Even if our ancestors had been able to make use of starch calories in any significant way, they would have been getting substantially fewer calories and nutrients than they got in the dietary fat from the animals they hunted.
While we’re at it, let’s address the misconception that early humans roamed the earth munching on fruit, honey, and nuts galore. These edibles would have been merely a seasonal treat, and only in temperate regions. They would certainly have helped fatten us up against winter or the possibility of food scarcity, by way of an adaptive mechanism called mild insulin resistance, which prompted us to store sugar as fat for the lean months ahead. But these plant foods would have been scarce, an occasional side dish at best, and neither plentiful nor nutritious enough to rely on to survive Paleolithic extremes. For vulnerable Ice Age beings, no matter what kind of ecosystem they lived in, dietary fat was the key to survival.
It boils down to this: Cooking didn’t expand our brain. Eating potatoes or other carbohydrates didn’t grow our brain. Eating animal fat—and lots of it—is what truly provided the substrate for the highly sophisticated structure of our brain and ultimately made us human.19
The research of human metabolic expert George F. Cahill Jr., MD, offers an exciting confirmation of this theory: it turns out that humans are the only animal whose brain can run virtually entirely on ketones, the energy units that come from fat. This unique evolutionary adaptation was driven primarily by our brain! The cost of running our most expensive organ is so high that we needed the most abundant, stable, and reliable fuel source possible—fat. As you’ll see in Chapter 4, the brain can use both glucose and ketones for energy—it’s flexible—but it prefers and performs better on the latter. In fact, fat is so preferable that, once your fat-burning state is fully switched on (i.e., when a healthy state of effective ketogenic adaptation is achieved), the ketones you produce stimulate the pathways that enhance the growth of new neural networks and protect neurons against multiple types of injury.20 This is just one reason why the fat-burning, or ketogenic, approach to eating is used as a therapy for all kinds of neurological imbalances and diseases—often with rapid and remarkable results.
Here’s a second fascinating insight: we are natural-born fat burners. Our earliest neurological functioning and development are powered by ketones. When an infant is nursing on mother’s milk, it’s the energy from the fat in the milk (in the form of ketones) that serves as the major fuel for brain development.21 According to Dr. Cahill, “The larger the brain/body ratio, the more rapidly the ketosis develops, as in the newborn.”22 He went on in the same paper to say, “We are the only primate born fat, probably to furnish the caloric bank for our big brains.” The sugars in breast milk are important, too; they feed the Lactobacillus bacteria in the baby’s gut and are rapidly converted to subcutaneous fat, giving the baby a good insulating layer—in Ice Age terms, the chubbier the baby, the higher the odds of survival.23
The ketones derived from this stored fat, along with the DHA supplied by mother’s milk, are pivotal to feeding a rapidly developing human brain. (The infant makes barely any starch-digesting amylase until at least five months of age.) Nature very clearly intended fat to be the fuel for the extraordinary development and functioning of our brains from the very start of life. Once we begin feeding children sweet and starchy foods, we shortchange this critical developmental process . . . and they start craving sweets.
These scientific insights add more confirmation to the fat-burning hypothesis. To our primal physiology, ketones are the ultimate brain fuel. In terms of conferring brain benefits as we live and age, they leave the starch-based diet trailing in the dust.
Paleoanthropologists and other folks who like to investigate our ancestors have bad news for us: our brains are shrinking. They peaked in size about twenty thousand to thirty thousand years ago, as studies of Cro-Magnon remains demonstrate—Cro-Magnons were the robust and powerful early modern humans who lived in the cold, Ice Age conditions of what is now Europe—and our brains have been getting smaller ever since. The most rapid shrinkage has taken place within the last ten thousand years—the era of agriculture, starch-based diets, and cultivated grains. If the “potato” hypothesizers were right, you would expect our health and brain function (and brain size and sophistication) to have improved significantly once agriculture allowed us to adopt a starch-based diet. But the reverse is true.24 As starch became our default food and displaced nutrient-rich foods from wild animals, our brains started to shrink. Coincidence? I wonder.
How is it possible that we lost just over 10 percent of our brain volume in a mere ten thousand years? Did the abundance of grains and starches give us “improved brain efficiency,” allowing us to do more with less (as potato-hypothesis proponents might argue)? In light of our evolutionary past, this seems more a rationalization than a viable hypothesis.
The more compelling answer is that we shifted from a diet in which close to 90 percent of calories came from animal-source foods rich in brain-building fats to one in which such foods can make up as little as 10 percent. As we did so, the brain lost its source of critical building blocks, such as DHA, as well as the fat-soluble nutrients that make it work optimally. Nature’s design for our brain—to be built from fat, to function on fat, and to thrive on fat—got subverted. This has dramatic implications across the board, including, not surprisingly, how we age. As you’ll see in Chapter 8, our post-agricultural diet is intricately connected to the modern crisis of neurological compromise, degeneration, and disease.
Vegetable lovers can rejoice: the Primal Fat Burner Plan puts a strong emphasis on filling your plate with fibrous vegetables and greens (the non-starchy kind—think chard, asparagus, cauliflower, and cucumbers, not potatoes, rice, beans, peas, and corn) to get a steady supply of the phytonutrients that our modern lifestyle and environment require. But vegetarians, take heed: animal-sourced foods take their (moderate) share of the plate as well.
Having worked intimately with the human brain for twenty years as a neurofeedback practitioner, I’ve come to an undeniable—if somewhat politically incorrect—conclusion: by far the most damaged and intractable brains and nervous systems I encounter are those of strict vegetarians, and specifically vegans, who have spent years eating a low-fat, higher-carbohydrate diet. Whether they eat mostly whole foods or processed ones, the result is much the same: nutritional deficiencies from the absence of animal-source foods, compounded by an overreliance on antigenic and inflammatory grains and legumes. The lack of fat-soluble vitamins, complete protein, essential fatty acids (EPA and DHA), and cholesterol compromises brain function and over time results in considerable instability, from foggy thinking, irritability, agitation, insomnia, anxiety, and attention disorders to autoimmune issues and pronounced neurodegenerative symptoms.
On top of that, the damaging waves of glucose and insulin produced by their sugary, grain- and starch-filled diet unnaturally age the body and brain. By the time I see these clients, they often feel as if their health and mental clarity are eroding; they sense that something is fundamentally wrong. From an ancestral health point of view, this is completely unsurprising. Nothing is more stabilizing to the brain than quality dietary fat, and nothing is more destabilizing than sugar and starch. From a modern statistical standpoint, the numbers are striking: fully 75 percent of vegetarians and vegans abandon this way of life within ten years—typically because of health-related issues.25
Though you might not think it from reading this far, I actually feel a deep allegiance to these clients. They care deeply about planetary health, animal suffering, and restoring a healthy ecosystem, as I do. The world needs more people like this. But by cutting out all animal-source foods, they are sacrificing themselves to an ideal that has never existed in nature. On the occasions when I can influence a shift to a fat-burning way of eating that includes some quality animal-source foods, we often see significant improvements in stability, clarity, and overall functioning—just as I personally experienced after my time in the Arctic—although the extent of recovery depends, I’m sorry to say, on how long those compromises have been in place.
I usually invite my vegetarian clients to ponder the greater philosophical question: when carbohydrate-based eating is your norm, and you invariably get mood and energy instabilities, cravings, and blood sugar spikes and crashes throughout the day, are you truly free to live your life as you like, or are you enslaved to being a grazer, constantly eating to fill a hunger that never quite dies? And in this era of Big Food, Big Agriculture, and Big Ag’s best friend, Big Oil, who do you suppose benefits the most from us being in this (I have to say it: sheep-like) state?
When we return to eating the way our primal bodies and brains are wired to eat, we reclaim not just better physical health but a healthy dose of independence and autonomy, too—we begin to know what our own body needs, and to take charge of the best way to fulfill that.
We haven’t even gotten into the mechanics of fat burning, but I hope you can see how following a high-carb, low-fat, low-cholesterol diet—the exact prescription given out by decades of USDA food pyramids and erroneous factoids on the back of cereal boxes—is not only a scientifically unfounded proposition but the absolute opposite of what three million years of evolution intended for us.