CHAPTER 4

Should we eat like the Flintstones?

I love BBQs. I love geeking out over the gear, I love the smell of lighter fluid over charcoal, I love the food, I love the cold beer, I love dining al fresco ... despite the fact I left California 25 years ago and now live in a small village 10 miles outside of decidedly less sunny Cambridge. It’s also never too early or late in the year to grill! I can be found outside, wrapped up in a fleece, scarf, woollen mittens and a beanie, breath visible in the crisp late February air, lighting charcoal for ribeye steaks.

‘It’s nearly spring!’ I always tell my wife, while she looks at me through the patio door, shaking her head.

Every year, on the first weekend in May (which is always a three-day weekend here in the UK), my wife and I hold a BBQ in our backyard for friends and colleagues, as well as their partners and children. As it so happens, since I popped into the world on 1 May, the whole affair also doubles as my annual birthday celebration. I pride myself on preparing, marinating and cooking all of the food for the seventy or so hungry mouths that typically show up. A couple of years back, as I was manhandling a stack of baby-back ribs on to the grill, one of my friends, while clutching a beer in one hand (drinks always flow freely at the Yeo Birthday BBQ) and a teriyaki chicken wing in the other, shouted out:

‘Hey, it’s like watching Fred Flintstone with his Brontosaurus ribs! Are you trying to get all of us onto the Paleo diet?’

I guess if a Paleo diet means grilled meats in a sweet marinade, potato salad, sourdough bread and cold beer, then yes! However, aside from the presence of grilled meat (even then it has to be specific cuts of meat), everything else I’ve suggested is actually the antithesis of ‘Paleo’. The sugar found in most BBQ marinades and sauces, potatoes, wheat flour in the bread as well as alcohol are all to be avoided like the plague if you are a Paleo enthusiast.

There are many people who assert we should be eating a Paleo diet. Are they right?

A ‘STONE-AGE’ DIET

The term ‘Paleo’ is a contraction of Palaeolithic, which is a prehistoric period going from about 2.6 million years ago (wayyyyy before Homo sapiens were walking about and around the time of Homo habilis or ‘handy man’) to 10,000 BC, and was characterised by, amongst other things, the development of stone tools. Hence, it is often colloquially referred to as ‘The Stone Age’. The end of the Palaeolithic era also corresponded with the end of the Pleistocene, or ‘Ice Age’, and the beginning of the agricultural revolution. While it was first proposed by Dr Walter L. Voegtlin in his 1975 book The Stone Age Diet,1 the Paleo diet as it exists today entered the popular consciousness in 2002, thanks to Dr Loren Cordain’s bestselling book The Paleo Diet: Lose Weight and Get Healthy by Eating the Foods You Were Designed to Eat.2 The diet was so named because it was designed to mimic the diet of indigenous populations prior to the agricultural revolution. The basic premise for the whole Paleo movement was that for the vast majority of human existence, since Homo sapiens emerged 200,000–300,000 years ago, we subsisted as hunter-gatherers. Then the dawn of the agricultural revolution some 12,000 years ago brought about huge changes to our diet. Since diet-related illnesses are responsible for much of the chronic and non-communicable disease burden today, and no evidence of such conditions can be seen in the Palaeolithic skeletal record, the blame for our contemporary woes must lie with the post-agricultural diet. Hence, according to Cordain, the solution to a skinnier and healthier human species must surely be to shift back to a pre-agricultural, so-called ‘ancestral’, subsistence.

Cordain developed this ‘ancestral’ diet, which he coined ‘The Paleo Diet’, using evidence from the fossil record, the diets consumed by contemporary hunter-gatherers, chimpanzee diets and the nutrients found in wild animals and plants. He argued that in contrast to the modern western diet, Palaeolithic people hardly ever ate cereal grains, did not drink or eat dairy products, did not salt their food, and the only sugar they ate was honey, and then only rarely and in small amounts. Instead, Palaeolithic diets were dominated by wild, lean animal meat and foods low in carbohydrate content; what little carbohydrate was consumed came from non-starchy wild fruits and vegetables; and the staple fats in Palaeolithic diets were monounsaturated, polyunsaturated and omega 3 fats, found largely in plants and fish.

A central tenet of Cordain’s philosophy is that the foods eaten (and not eaten) by ancient hunter-gatherer populations ensured optimal health due to the food’s compatibility with the human genome. Cordain argues that those who adopt the Paleo diet will feel good ‘because it is the only diet that is consistent with [the human] genetic makeup’.3 Based on DNA evidence, he claims that basic human physiology has not changed significantly in 40,000 years, making our genes well adapted to a hunter-gatherer and NOT an agricultural lifestyle. Cordain claims that by switching to (or is it back to?) a Paleo lifestyle, we have the potential to control and even prevent diseases such as type 2 diabetes and insulin resistance, multiple sclerosis, rheumatoid arthritis, coeliac disease, cancer, heart disease, inflammation, osteoporosis and anaemia. In fact, he goes further, repeatedly stating that there are ‘very few chronic illnesses or diseases that do not respond favourably to our ancestral diet, the diet to which our species is genetically adapted’.4 These are very bold claims indeed.

Aficionados, of which there are many, can and will quote chapter and verse about the intricacies of the Paleo lifestyle or diet.5 However, these can, by and large, be condensed into two basic sets of ground rules that you need to follow:

a) what you SHOULD eat is lean meat and seafood, as well as fruit and non-starchy vegetables, which can be consumed without limitation;

b) what you SHOULD NOT eat are cereals, legumes, dairy products and processed foods.

Just a note. We all know what grains from grasses are: wheat, rice, oats and barley, etc. However, just in case, like me, you feel the need to look up what a legume is, its fruit is sometimes also referred to as a ‘pod’, and develops from a simple carpel, usually opening along a seam on two sides. The grain seeds inside these ‘pods’ are known as pulses, which include all beans, as well as lentils, peas, peanuts and tamarind.

AGRICULTURE: EVOLUTION OR DEVOLUTION?

It is almost impossible to overstate the role that the agricultural revolution played in the development of human civilisation, for better and for worse, as it exists today. Because hunter-gatherers foraged all of their food, relying on what was in relatively close proximity, they had to follow the food and were therefore largely nomadic, or certainly at least semi-nomadic. Agriculture meant far higher food yields at predictable intervals, which allowed for the accumulation of stored food and a movement away from a nomadic lifestyle, as well as the ability to support higher population densities. Hunter-gatherers, because of the inherent unpredictability of their food supplies, would have needed to devote most of their waking time to foraging and hunting. A predictable food supply afforded by farming, however, allowed for the emergence of non-food specialists in society. This was crucial because, all of a sudden, humans had the luxury of using their big brain for something other than devising hunting strategy, finding out where the edible roots and berries were, and trying not to get killed or eaten. It turns out, in fact, that this availability of spare neuronal processing capacity, more than any other factor, was the central requirement for the subsequent development of all complex thoughts, concepts and technology. It is pretty difficult to contemplate quantum mechanics or cosmology, for instance, if you don’t know where your next meal is coming from; so do remember to pass on a silent thanks to your proto-farmer ancestor as you order a pizza from your couch, using your smartphone.

Another consequence of people beginning to specialise and do different things was that it drove the stratification of society. This stratification inevitably led to an ‘upper’ class (the people who owned the land and ate the food) wanting to control a ‘lower’ class (the people who were paid, or made, to produce the food). As villages grew into towns and towns into cities, centralised states and religions emerged, across the globe, as effective and efficient methods to control large populations. The next natural step was then the raising of armies of professional soldiers to defend and propagate state and religion, and here we are today. Hurrah for progress.

How we got into the mess we are in today, however, is not the main topic of discussion here. Let’s rewind ourselves back 12,000 years to the dawn of agriculture. The truth of the matter is that Cordain was right that the transition from hunter-gatherer to farmer was not smooth at all. We know this from the archaeological evidence. Agriculture emerged independently in at least three different geographical locations. Emmer and einkorn wheat were domesticated in the fertile crescent (a crescent-shaped area of fertile land along the River Nile in Egypt and the Tigris and Euphrates rivers in Mesopotamia) 10,000–12,500 years ago;6 rice, or Oryza sativa, is thought to have been domesticated in the Yangtze River Valley (although the exact location remains hotly debated, with the Pearl River Valley also staking a claim) in China some 9,000 years ago;7 and maize, or corn, or Zea mays, was domesticated from a wild grass in southern Mexico, also around 9,000–10,000 years ago.8 According to a study of human remains from all of these areas, the height of the average person declined by 3 inches or more during the millennia in which the cultivation of each of these crops intensified.9

Our early farming ancestors were not just shorter than hunter-gatherers, indicating malnutrition and problems with growth; their skeletons also revealed a whole host of other illnesses. They had terrible dental problems, for example, with the incidence of cavities jumping nearly six-fold as people started relying on grain. One of the main issues with agriculture was that it resulted in a stunning reduction in the diversity of foods that we as humans ate, and consequently forced a huge shift in the sources from which we obtained our calories. Our continued reliance on the starch-rich seeds from five different grasses as our main source of calories is a perfect case in point. A second issue was the appearance of entirely new sources of food that were only possible after the establishment of agriculture. Consumption of milk and dairy products, for example, would only have made sense once herd animals had been domesticated. And likewise, the fermentation of fruit and grain to produce alcoholic beverages would only have been possible if there was an excess of such foods. The early farmers clearly had major problems adapting to this shift in diet.

The rapid increase in population density enabled by farming also resulted in a huge increase in infectious disease.10 Diseases like measles, which, in the past, would have occurred in isolated nomadic pockets and then fizzled out, now had whole villages and towns of warm bodies, coughing, sneezing and spewing all manner of fluids, to aid their propagation. In addition, domestication brought large numbers of animals into close proximity with humans for the first time, allowing so-called ‘zoonotic’ diseases to jump between animals and humans. (These range from common diseases such as flu and chicken pox to ones that cause more alarm, like HIV and Ebola.)

Yet the advantages of agriculture to the growth and proliferation of our species were just too great, and so humans, flexible as we are, eventually adapted. The question is, however, did these adaptations come at a cost to our health? And if so, is the Paleo diet therefore the panacea to our modern-day problems of obesity and other metabolic diseases?

In order to answer these questions, let’s take some of the major claims made by Cordain and other Paleo enthusiasts, and explore them in a little more detail.

1. OUR PALAEOLITHIC ANCESTORS ATE ‘THE PALEO DIET’

The headline claim of the Paleo movement is that it has managed to replicate, or at the very least work out a reasonable facsimile of, what Palaeolithic people would have eaten. How true is this likely to be?

First, although Cordain is adamant about the specific foods and the ratios in which they were consumed by hunter-gatherers, there are many who disagree with him. Randolph Nesse, the so-called godfather of evolutionary medicine, professor and director of the Center for Evolution & Medicine at Arizona State University, and author of many influential papers exploring the links between evolution and disease, finds the Paleo Diet simplistic and flawed. In an interview with Eliza Barclay of NPR (National Public Radio), Nesse said:

‘There’s this tendency to want to find the normal human diet, but every single diet you pick has an advantage of some sort. Humans have lived in all kinds of places and we have adapted to all kinds of diets’.11

In a letter to the editor addressing a study published by Cordain and his colleagues in The American Journal of Clinical Nutrition,12 Katherine Milton from the University of California at Berkeley stated that ‘hunter-gatherer societies, both recent and ancestral, displayed a wide variety of plant-animal subsistence ratios, illustrating the adaptability of human metabolism to a broad range of energy substrates. Because all hunter-gatherer societies are largely free of chronic disease, there seems little justification for advocating the therapeutic merits of one type of hunter-gatherer diet over another’.13

Peter Ungar and his colleagues, writing in the Annual Review of Anthropology, reasoned that early Homo species would have had an adaptable and flexible diet, enabling them to survive in changing and unpredictable environments.14 This does not necessarily mean that the foods they ate were varying all the time, but that they had the capacity to gain sustenance from many sources. This is really the bottom line: there was no single Palaeolithic diet, because there was no single Palaeolithic people. Rather, most experts agree that hunter-gatherers ate what was available to them at the time, depending on their environment.

Second, while the Paleo lifestyle condemns (and this really is not too strong a word here) foods that are a product of agriculture, the number of available foods that are NOT the product of agriculture, in one way or another, is actually vanishingly small. The vast majority of foods we eat today have been subject to years of genetic selection. Just to be crystal clear, I don’t mean that everything we eat has been modified using modern genetic-engineering techniques. These are so-called GM foods, which I don’t, in principle, have a problem with, while acknowledging we need to treat such powerful technology with care and attention (I will discuss this in further detail in the final chapter). Rather, I refer to the fact that humans have been breeding plants and animals for the past 12,000 years. Farming has allowed humans to gradually alter plants to be more nutritionally dense, less toxic, larger and resistant to pests. Some examples include the domestication of the tomato, which has been bred to no longer contain toxins found in nightshade vegetables (whew!), or the evolution of apricots and almonds, which were bred from the same precursor; one for its larger fruit, the other for its edible seed (I know! This was news to me too!). Apples are the products of hundreds of generations of careful breeding that started with a tree native to Central Asia, Malus sieversii. The author Michael Pollan once described those original fruit as tasting like ‘a tart potato’.15 Even avocados, the seemingly ‘go-to’ food for the clean-eating Instagram generation, have similarly been transformed from the original fruit, which consisted of just a thin layer of flesh surrounding a giant seed, to the delicious and creamy guacamole precursor of today. Almost every vegetable and fruit that we consume today would have undergone similar selection, to amplify their desirable traits and minimise the undesirable.

Similarly, the meat section in our supermarkets would be entirely foreign to our Palaeolithic ancestors. The busty, short-legged chickens of today are a far cry from the stringy, long-legged jungle fowl from which they were originally bred. All of our beef and dairy cattle are descended from the now-extinct aurochs, which were much larger and fiercer animals. On a recent trip to the University of Otago, in Dunedin, New Zealand, I happened upon a life-sized model of a Moa, a now-extinct indigenous flightless bird that reached 3.6 metres in height. The size of the model was jaw-dropping. New Zealand was originally settled by humans around 1280 AD. Until that time, the Moa had no known natural predators, and so had no reason to fear these strange small hairless creatures that jumped off wooden boats and on to the beaches. From the point of view of those original Polynesian settlers, these birds were like a walking picnic, larder and restaurant wrapped up in one. Because the birds were not afraid of humans, the Polynesians simply walked up to them, poked them with a sharp stick, and boy, were they tasty. So much so that by 1400 AD, some 120 years after the arrival of the original human settlers, the Moas had been completely wiped out. 120 years, barely three generations, was all it took. Like the mammoths, mastodons and aurochs before, our ancestors hunted them to extinction. As Marlene Zuk writes in her book Paleofantasy, ‘The reality is that we are not eating what our ancestors ate, perhaps because we do not want to, but also because we can’t’.16

2. HUMANS EVOLVED TO EAT MEAT

The ‘Paleo lifestyle’ prescribes a diet high in protein, mostly from lean ‘wild’ or game meat and seafood, so the pork ‘brontosaurus’ ribs at my May BBQ, which I will attest do not fall into the ‘lean’ category, might not exactly fit the bill. However, there is a clear belief amongst the Paleo community that humans evolved to eat primarily meat. Christina Warinner, an archaeological scientist, gave an informative and revealing TED talk in 2013,17 where she asserted that humans have few adaptations, genetic or otherwise, for meat consumption, but are actually physically adapted for plant consumption. These adaptations include a long digestive tract for plant digestion and the inability to produce vitamin C (not an asset, but evidence for the importance of plant consumption nonetheless). Carnivores, by contrast, have shorter digestive tracts and have no need to ingest vitamin C, as they are able to create their own. They also have specific metabolic adaptations to an all-meat diet, such as the inability to synthesise vitamin A. In addition, Milton remarked that it was true that some Palaeolithic societies would have consumed large quantities of animal protein by today’s standards, but ‘this does not imply that such a dietary pattern is the most appropriate for human metabolism or that it should be emulated today’.18 She stated that human gut proportions, as mentioned by Warinner, indicated a varied diet with digestion occurring primarily in the small intestine. Meat is, by far, more calorically dense than many vegetables. When this was coupled with the ability to control fire and the development of cooking techniques, the caloric availability of meat was incredibly valuable to Palaeolithic humans. However, chasing down meat was high-risk and hard work, whereas the gathering of fruit, vegetables and roots was a more guaranteed source of food. Humans didn’t evolve to eat ONLY meat; we evolved to eat whatever was available, including meat. Thus the fact that humans were omnivorous, with the ability to subsist on a hugely varied diet, was a key factor in our success as a species.

3. PROCESSED FOODS ARE BAD FOR YOU

On the face of it, saying that processed foods are bad for us does not seem to be a controversial statement. It certainly brings to mind the vast midsections of our favourite supermarkets that are set aside for processed confectioneries to suit all tastes. Whole aisles of breakfast cereals, chocolates and other candy, ready to leap on us and make us fatter. There are foods available whose shelf lives laugh in the face of going stale, almost defying biological decay and decomposition. A common urban myth claims that the infamous American ‘cake’ (a very loosely used term here, I would argue), the Twinkie, has an infinite shelf life, due to all of the chemicals used in its production. In homage to this legend, the Pixar film WALL-E, which is set in part on a ‘post-apocalyptic’ Earth, has a scene where a cockroach is shown eating its way through the cream-filled centre of a Twinkie, both of which have survived the apocalypse, emerging out the other side not having been struck down by food poisoning. The myth is just that of course – a myth – but Twinkies certainly stay ‘edible’ (another word used here with great artistic licence) for an unnaturally long time, particularly as they are filled with ‘cream’ (or at least some kind of cream analogue ... what in heavens is in Twinkie cream?). In a less apocryphal tale, I was teaching at the Cuernavaca campus of the Universidad Nacional Autónoma de Mexico (UNAM) a few years ago, and was introduced, for the first time, to actual bona fide corn tortillas. They were delicious and made fresh to order, wherever they were served; be it as street food, or as fine dining in a Michelin-starred establishment. They have next to no shelf life at all, and are discarded if not eaten within the hour. ‘Corn tortillas’ are also available at my local supermarket, where I buy them to accompany my famous (at least within my household) chicken and steak fajitas. While these look like tortillas, they are a different beast entirely, and when kept sealed in the packaging they are sold in, have a shelf life of more than a year. If you think I jest, go check it out for yourselves! When I told my Mexican hosts about these ‘tortillas’, they stared at me in disbelief and said: ‘Then they are not tortillas!’

The term ‘processed’, however, is a broad church. Cooking, for example, is by definition a ‘process’ which sterilises food and increases caloric availability. Is cooking bad for you? Clearly not. Heat treatment is also used as a process of preservation. Pasteurisation, for instance, heats milk (and other liquids) up to 60°C–72°C (140°F–161°F) for a few seconds, killing off the majority of microbes without impairing flavour, and extending the shelf life of milk from a couple of days to two weeks or more. The US Centers for Disease Control and Prevention (CDC) says that improperly handled raw milk is responsible for nearly three times more hospitalisations than any other food-borne disease source, making it one of the world’s most dangerous food products. Hurrah for pasteurisation!

As well as this, putting heat-treated food into a heat-sterilised container and then sealing it, as happens during the canning process or when one is making jam, extends its shelf life indefinitely, often without the use of any preservatives, aside from sugar or vinegar.

The processing of food, prior to the advent of refrigeration, was almost always used as a form of preservation. Curing is probably the oldest such example, and has been the dominant method of meat preservation for thousands of years. Meat and fish were most commonly cured by the addition of salt or sugar or a combination of both, with the aim of drawing moisture out of the food by the process of osmosis. The resulting decrease in moisture content and increase of salt and/or sugar concentration within the meat makes it an inhospitable environment for the bacteria that cause food to spoil. Smoking is often performed in conjunction with the addition of salt, helping to seal the outer layer of the food being cured, and making it more difficult for bacteria to enter. Thus emerged bacon, ham, salami and a whole menagerie of smoked fish, including salmon, haddock and mackerel. Dehydration, by simply drying food out in the sun (beef jerky, anyone?), is another of the earliest forms of food preservation and works because bacterial growth requires water.

Fermentation is a food preservation method using microorganisms. Examples include any process that produces alcoholic beverages or acidic dairy products, and that involves either yeast or specific types of bacteria converting sugars into lactic acid or alcohol. The earliest documented use of fermentation dates back to nearly 7000 BC in Jiahu, China, where the first evidence of an alcoholic drink made from the fermentation of fruit was discovered.19 Yogurt, which is produced by the bacterial fermentation of milk, converting lactose to lactic acid, is thought to have been invented in Mesopotamia around 5000 BC.20 Fermentation has also been used as a method of preserving vegetables in many cultures, with examples of fermented cabbage that include sauerkraut in Europe and kimchi in Korea.

Then there are processes that involve separation, such as the milling of rice or wheat, in which the husk, bran and germ are removed. The act of polishing after milling is what gives rice its characteristic white colour, its taste and texture. Crucially, this refining of rice extended the storage life of the dried grain, which would presumably have been the primary reason why it was developed. Unfortunately, an unintended consequence of this refining is that much of the nutritional content of rice, which is actually contained within the husk, is removed. In fact, a diet based on unenriched white rice increases vulnerability to the neurological disease ‘beriberi’, due to a deficiency of vitamin B1 or thiamine. Today, in an effort to replace some of the nutrients stripped from it during its processing, white rice is required by law, at least in the United States, to be enriched with vitamins B1, B3 and iron.

Another ancient process is the practice by indigenous peoples in the Americas of soaking corn in an alkaline solution, typically lime (not the fruit but the mineral), prior to cooking. Before the arrival of Europeans, corn would have been the primary grain in the Americas. It still is the primary grain in many parts of Central and South America today. The problem with corn is that while it is particularly rich in the micronutrient niacin, otherwise known as vitamin B3, the niacin is chemically unavailable to the human digestive system, and passes right through us. So if the grain you ate was primarily corn, you could end up with niacin deficiency, resulting in a disease called pellagra, which is characterised by diarrhoea, dementia, rash on the hands and feet, and if left untreated, is lethal. Soaking the corn in alkaline solution liberated the niacin, making it available to us during digestion. How the indigenous Americans worked it out is a mystery, but seeing as corn was domesticated some 10,000 years ago, work it out through trial and error they did, and pellagra was never a problem for them. After the arrival of Europeans, corn was very quickly adopted by the colonial Europeans and also exported to Europe. None of the Europeans or white Americans, however, picked up on the alkali treatment of corn prior to cooking, or if they did, they certainly didn’t grasp the significance of that process. For most non-native Americans, the unavailability of niacin from corn was not a problem, because niacin was available from other sources of food, including meat and other grains. However, corn was cheap and easy to grow, so ended up being a primary staple to those in poverty, causing a sharp increase in incidence of pellagra amongst the poor. It wasn’t until the mid-1930s and early 1940s that pellagra was recognised as a disease resulting from niacin deficiency and eradicated by the fortification of grains and flour. Imagine that: corn had been around for nearly 10,000 years, and it took non-native Americans and Europeans until the early twentieth century to learn how to process it properly for food.21

So while the term ‘processed food’, as used today, is associated with a whole host of negative connotations, the devil truly is in the detail. Clearly, the modern industrial processes giving rise to Twinkies and mutant corn tortillas have extended the shelf life of food, sometimes indefinitely, making transport and storage far easier, and, importantly, driving down their cost. These are critical characteristics of contemporary food that allow the planet to sustain its seven billion human inhabitants and rising. However, while calories today in most developed economies are plentiful and cheap, highly industrially processed foods are typically stripped of much of their nutrients and fibre, contain higher levels of sugar, fat and/or salt in order to improve flavour, and are as a consequence far more calorically available. On the other hand, food preservation and separation processes were critical to our ability as a species to survive and to thrive, ensuring that we had a predictable source of calories through seasonal changes in the availability of fresh food, and buffering against environmental crises such as drought. So depending on how one defines it, we can’t – and shouldn’t – avoid all processed foods.

4. OUR GENES ARE NOT ADAPTED TO THE POST-AGRICULTURAL DIET

The claim that our genes are not adapted to the post-agricultural diet is one that is possibly used the most by Paleo enthusiasts. It is also the most nuanced, particularly as a key argument that I make is that our food environment has changed too quickly for our genes to adapt, leading to the current obesity epidemic. Before I am accused of providing ‘alternative facts’, let me explain. The environmental changes that I spoke about in the beginning of this book, which include changes in food availability and type of food, as well as changes to our lifestyle, have occurred in the past thirty years. This is far too short a time for us to have genetically adapted. In contrast, humans have been exposed to agriculture now for thousands of years and, given the selection pressure the environment has provided, have had ample time and opportunity to genetically adapt to it. Two informative examples have been our adaptation to handle an increased volume of starch and our ability to drink and metabolise alcohol.

The other aspect to consider is whether it’s true that our Palaeolithic ancestors did not eat grains. Today, the starch-rich seeds of five grasses (wheat, rice, oats, corn and barley) provide 50 per cent of the calories consumed by all humans. This is certainly very different from what our Palaeolithic hunter-gatherer ancestors would have faced. However, Anna Revedin, an archaeologist at the Italian Institute of Prehistory, and her colleagues found that grinding stones were used to process wild cereals and ground seeds as far back as 30,000 years ago.22 By examining starch grains recovered from the grinding stones found at three Palaeolithic sites across Europe, they concluded that early Palaeolithic populations made flour, allowing them to consume the starch-rich portions of otherwise inedible plants and providing a more dependable food source. They also suggest that their findings indicate the use of this type of food processing would have been relatively common practice amongst European Palaeolithic hunter-gatherers. What is now clear is that the technology to convert inedible grain into energy-rich flour predated the agricultural ‘big bang’ by nearly 20,000 years. While Palaeolithic people were certainly not exposed to the same volume of the five starch-rich grasses that we see today, it is patently untrue that they did not consume grains as food.

The problem is with the amount of grains, and hence starch, that was suddenly available. Our foraging ancestors in the Pleistocene may have occasionally happened upon a field of early wheat or rice, but their caloric base would have come from a huge variety of foods, including nuts, berries and game.

In humans, our ability to digest starch begins with amylase in saliva, which helps break down starch during chewing, and continues its action as the food travels down to the stomach and through to the small intestine, where other amylases released from the pancreas and other organs take over. Although we typically carry two copies of any given gene, one from each parent, humans actually have a variable number of copies of the saliva amylase gene AMY1, ranging from 2 to more than 30 copies. The more copies of AMY1, the better our ability to digest starch. This improvement in ability is subtle but measurable. Genetic studies of hunter-gatherer peoples living today, such as the Yanomami people of Venezuela, who continue to subsist on a high-protein and low-starch diet, reveal that they have fewer copies of AMY1; while other primates, which are primarily fruit eaters, only ever have 2 copies of AMY1.23

Humans have always eaten starch; we just were not able to fully digest it and unlock all of the available calories. This was not a problem when starch was not our overwhelming source of calories (such as for the Yanomami), which it is, of course, today. What this genetic adaptation has done is to improve our efficiency in metabolising starch, allowing us to extract more calories from every gram of starch. Or to look at it another way, we could eat less food and get the same number of calories. Even in the relatively plentiful environment post the agricultural ‘big bang’, at least in comparison to the scrabbling existence of hunting and gathering, most humans still didn’t have a surfeit of available food. An increased efficiency in digesting starch provided a huge selective advantage over those without the genetic adaptation, and was therefore incorporated into the gene pool, such that almost all humans today carry multiple copies of AMY1.

Our shift to a starch-rich diet had one other unintended consequence that ended up significantly influencing human culture; it was a major driving factor in the domestication of dogs. Dogs were domesticated from wolves 10,000–20,000 years ago, possibly multiple times independently. Dogs don’t have salivary amylase, but they do have an amylase that is produced by the pancreas, called AMY2B. Similar to AMY1 in humans, dogs have a variable number of copies of AMY2B, with more copies indicating a better ability to digest starch.24 Wolves, however, only appear to have 2 copies of AMY2B. Together with a less vicious disposition, the enhanced ability of the ancestors of modern dogs to digest starch meant they could subsist off human starch-rich food waste, which would have been a valuable source of energy, and made them less likely to eat their human companions (that would not have worked out). People often assume that dogs are ‘obligate’ carnivores, a fact which is true for wolves and indeed cats. However, a peek at the ingredient list of any brand of packaged dog food will reveal a surprisingly high starch content – a diet to which our dogs are genetically adapted!

Alcohol

On a lovely October’s day at the turn of the millennium, I married the love of my life (Jane) here in Cambridge. We were married at the Catholic church in the centre of town, and then booked out a large local Chinese restaurant for our wedding banquet. My wife is English and largely of Northern European extraction, with about an eighth Irish thrown into the mix. As a result, the guests in attendance were about 40 per cent Chinese (groom’s side), 50 per cent white Caucasian (bride’s side) and 10 per cent every other shade and colour (our academic colleagues from all corners of the world). In addition to the food on offer, which catered to all the dietary needs you might imagine, there was of course a typical selection of beer, wine and non-alcoholic drinks, and as it was a wedding reception in the UK after all, copious amounts of champagne were also on offer. Some guests, particularly from my wife’s side of the family, enjoyed more than a few drinks (it was a wedding after all!), the volume of conversation and laughter growing louder as the afternoon merged into the evening. Others (primarily on my side of the family) drank little, if at all. Many of my family simply have an inability to metabolise alcohol, evidenced by the flushed cheeks on display after only a few sips of bubbles, and also by a boisterous contribution to the conversation and laughter. So why are there differences in the way we metabolise alcohol, and how has its consumption, or lack thereof, become so culturally entrenched?

Alcohol is broken down in the body by a number of alcohol dehydrogenases (ADHs). The oldest forms of ADH4, found in primates some 50 million years ago, broke down small amounts of alcohol very slowly and inefficiently. About 10 million years ago, however, a single genetic mutation occurred that enabled a common ancestor of humans, chimpanzees and gorillas to develop a version of the ADH4 protein that was 40 times more efficient at metabolising alcohol.25 This abrupt shift occurred at a time of rapid climate change, causing the forest ecosystem of East Africa, home of our primate ancestors, to be replaced by more widely dispersed forests and grasslands. While early primates had lived their lives mostly in trees, as the environment and food sources began to change, some of them made the transition to a more ground-based lifestyle.

Crucially, the new ability to digest alcohol helped our ground-dwelling primate ancestors eat rotting and fermenting fruit (which were alcoholic) that fell to the floor of the forest when other food was scarce. Such fallen fruit would have been unlikely to be the first choice of food, but it would have allowed them to survive. Other primates without this mutation were more likely to get sick or drunk off the fermented fruit, and be less effective at defending their territory or finding more food. Thus all primates that live primarily on the ground (including us) can handle a small amount of alcohol.

The embedding of alcohol into our culture, however, came with the emergence of agriculture. According to archaeological evidence, the first modern humans began turning fermented fruit and other foods into alcoholic concoctions at the beginning of the agricultural revolution. Critically, this then influenced the strategy of how certain populations solved the problem of safe drinking water. Alcohol, in the form of beer and wine, was used by those in the fertile crescent (today’s Middle East) and northern Europe to ensure safe drinking water. Thus an enhanced ability to metabolise alcohol, the result of another single gene mutation, was a hugely powerful selective advantage. In fact, European children regularly drank weak beer up until the 17th century.

Other cultures, in particular many East Asian and other indigenous cultures, instead adopted the strategy of boiling water (eventually enhanced in flavour by the addition of local leaves to produce ‘tea’). That is not to say that ancient East Asians did not drink alcohol. The residue of the earliest known alcoholic beverage (a mixed fermented drink of rice, honey and fruit) was actually found on early pottery used in Jiahu,26 a Neolithic village in China’s Yellow River Valley dating to 7000–6600 BC, slightly predating the barley beer and grape wine that emerged in the Middle East. However, because East Asians had clean boiled water, they didn’t HAVE to drink alcohol, unlike the Northern Europeans in particular. Thus emerged the variation in the ability of different populations today to handle the drinking of alcohol.

Now let’s return to my wedding, where the reception has metamorphosed into a full ‘wave your hands in the air’-type party. From a sociological perspective and given the diversity of nationalities in attendance, it is interesting to note that, earlier in the proceedings, many guests felt the need to accept a small (or not so small) glass of champagne to toast the bride and groom, whether or not they could drink any of it, because it is the culturally acceptable thing to do. This is hardly unique to my wedding. Economic globalisation has led to the emergence of a large middle class with disposable income in places like China and India, where both at home and abroad they are exposed to western culture. For better or worse, a part of this involves the adoption of western drinking culture as a status symbol; what could be more ‘cultured’ than a glass of champagne at a wedding in Cambridge?!

Now, alcohol is actually a toxin, which is why it is used to sterilise drinking water. Drinking alcohol in moderate amounts is not a problem for those who can metabolise it, thus removing it rapidly from the blood. The problem is, if one lacks the ability to remove alcohol from the blood, like many of my family and East Asian colleagues, then the alcohol remains circulating as a toxin, where even at moderately high concentrations, it can be as carcinogenic as smoking a pack of cigarettes. A small snifter of champagne at a chimeric English-Chinese wedding is not going to kill anyone. But it is best to be aware of what our bodies are designed to eat and drink.

The point is, if there is a strong enough selection pressure, then genetic adaptations, such as an increased efficiency of starch digestion or the ability to drink alcohol, can take place over a relatively short period of time.

A beer belly segue

One thing that many people don’t take into consideration is the caloric content of alcoholic beverages; not an issue at all for folks like my dad, but a particular problem for those who can drink like fish. But surely it’s only beer that contains lots of carbs and therefore lots of calories? Isn’t that why all of those loud and sweaty men (I am channelling my wife here for a moment) who swill beer the world over have beer bellies? I mean, look at all of those skinny ladies drinking prosecco ...

Here’s the thing. When it comes to alcohol and weight gain, the term ‘beer belly’ is misleading. The majority of calories from drinking don’t come from the carbs, they come from the alcohol. Pure alcohol, at seven calories per gram, contains almost the same amount of calories as fat (9 calories per gram), and is no more or less calorific whatever drink it happens to be in; whether it is beer, wine, gin or whisky. So, a 175ml glass of wine, with an alcohol content of 13 per cent, contains nearly 160 calories, while a whole bottle will be around 700 calories. A standard 330ml bottle of beer with 5 per cent alcohol contains 140 calories, while the equivalent in a UK pint will be over 240 calories. Compare this to 140 calories found in a can of Coca-Cola and 250 calories found in a Snickers or Mars bar. Very few people would consume six cans of cola in one sitting, whereas many would think nothing of drinking a six-pack of beer. In addition, there are many alcoholic drinks, mixers or alcopops, for instance, that are also high in sugar, jacking up the calorie content even further. And the denouement is that the more you drink, the more disinhibited you become, and the less you will care about what and how much you are consuming. An evening out could easily lead to 700–1,000 calories just in drinks, before you even consider any food you have eaten.
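For readers who like to see where figures like these come from, here is a minimal sketch of the arithmetic, written as a few lines of Python. It is purely illustrative and counts only the calories from the alcohol itself, using the standard values of roughly 0.789 grams of ethanol per millilitre and seven calories per gram; the slightly higher totals quoted above also include the sugars and other carbohydrates in each drink. The function name is my own invention, for illustration only.

```python
# Alcohol-only calorie estimate: an illustrative sketch, not a nutritional reference.
ETHANOL_DENSITY_G_PER_ML = 0.789   # grams of pure ethanol per millilitre
KCAL_PER_GRAM_ETHANOL = 7          # calories per gram of pure alcohol

def alcohol_calories(volume_ml: float, abv_percent: float) -> float:
    """Calories contributed by the alcohol alone in a single drink."""
    ethanol_ml = volume_ml * abv_percent / 100   # millilitres of pure ethanol
    ethanol_g = ethanol_ml * ETHANOL_DENSITY_G_PER_ML
    return ethanol_g * KCAL_PER_GRAM_ETHANOL

print(round(alcohol_calories(175, 13)))   # ~126 kcal: 175ml glass of 13% wine
print(round(alcohol_calories(330, 5)))    # ~91 kcal: 330ml bottle of 5% beer
print(round(alcohol_calories(568, 5)))    # ~157 kcal: UK pint (568ml) of 5% beer
```

Multiply the wine figure by the four or so glasses in a bottle, add the carbohydrates, and you can see how quickly an evening’s drinking creeps towards the 700–1,000 calories mentioned above.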

Any spare calories you don’t use immediately will then be converted into fat. Where you put your fat depends on your own personal biology. As we’ve discussed, on average, men are more likely to store theirs in visceral fat that sits around their belly (hence beer belly), while women will tend to store it underneath their skin as subcutaneous fat. So it’s just less obvious in a woman ... although perhaps we could coin the term ‘wine wobble’, or maybe ‘champagne chin’?

But I digress. The Paleo diet is a made-up construct because what we might have eaten in the Palaeolithic Age is always going to be a guess, and much of what we may have actually eaten probably doesn’t exist in any recognisable form today anyway! And the arguments that we haven’t genetically adapted to an agricultural diet simply make no sense, as I have just shown.

DO HIGH-PROTEIN, LOW-CARB DIETS WORK?

One of the key characteristics of the Paleo Diet is its prescribed high-protein and low-carbohydrate content. This, however, is not unique to the Paleo world, and is shared by other diets including, amongst many others, the Dukan and Ketogenic diets, as well as the granddaddy of them all, the Atkins diet. I do realise that these are different diets with differing philosophies; and I am using the word ‘philosophies’ here with all the baggage that it comes with. Let’s take a look at each of them in turn.

The Dukan diet has the ATTACK phase, which gives your weight loss a jump start; the CRUISE phase, which gets you down to your target weight; the CONSOLIDATION phase, which tries to prevent any immediate rebound; and the STABILISATION phase, which is the long-term strategy to keep the weight off.27 It is clear that the Dukan diet has been heavily influenced by the Atkins diet, which is also divided into four phases: induction, ongoing weight loss, pre-maintenance and maintenance. In fact, at first glance, they might almost be the same diet.

This is what the Dukan diet claims to do. ‘The Dukan Diet will redesign your eating habits and help you permanently stabilise your weight. The Dukan Diet is a high-protein, low-fat, low-carb diet – a healthy-eating plan based on proteins and vegetables, 100 foods in total. And what’s best, it’s EAT AS MUCH AS YOU LIKE’.28 (There are those shouty capital letters again.) It really is quite a restrictive diet ... only 100 foods you can eat, but without limit! The Atkins diet is nowhere near as prescriptive, really only limiting carbohydrate consumption, with protein and fat making up the rest of the diet. For both Dukan and Atkins, the main difference between the four phases is the protein to carb ratio of the diet; with phase 1 in both having the highest protein and lowest carb content.

The ketogenic diet is a high-fat, moderate-protein, very low-carbohydrate diet. It is so named because it forces the body to burn fat rather than carbohydrates, a process called ketosis, which produces ketone bodies as a by-product. In the clinical setting, it has been used for decades to treat particularly hard-to-control epilepsy in children. The exact mechanism is unknown, but forcing the brain to use ketones as fuel rather than its favoured glucose seems to reduce the number of seizures that occur. It is, however, now beginning to gain traction as a weight-loss plan. While the ketogenic diet is sold as a high-fat, ultra-low-carb diet, the reality is that many people cannot manage the 75 per cent to 80 per cent fat that is recommended. It is an extreme level of fat that is quite unpalatable to many. Keto enthusiasts tend to come to the diet with an almost evangelical aim of avoiding carbs, and because the calories still need to come from somewhere, the diet actually ends up being quite high in protein; maybe as high as 20–25 per cent, as compared to the 15 per cent protein in today’s typical western diet.

The reason that this common thread of high protein and low carb exists in many diets is because the approach does work, at least in the short term, for weight loss.

The main stumbling block is one that is universal to ALL diets; that is, the moment you come off the diet, the weight, almost invariably, goes back on. Why? Because your brain absolutely hates it when you lose even a few pounds, sensing weight loss as a danger signal, a warning that you need to maintain your energy stores. Your body fights back by reducing its energy expenditure and making you feel hungrier, in an effort to claw back the weight it has lost. So if you stop whatever diet you were on, and go back to how you were eating, your weight will soon climb back to where it was before you began the diet.

Then there are the myriad side-effects to contend with on a ketogenic diet, depending on how extreme a protein to carb ratio you try to stick to. Heart palpitations, headaches, leg cramps, constipation and bad breath, for example, have all been reported as common side-effects. Taken to its most extreme, however, such as in untreated type 1 diabetes, where you don’t have any insulin and thus your muscles and fat are unable to absorb any glucose, you would end up burning only fat, producing uncontrolled amounts of ketones, and ending up with diabetic ketoacidosis. Because ketone bodies are acidic, a large accumulation would dangerously lower the pH of the blood – thus the name ketoacidosis, which, if left unchecked, can be fatal. This, by the way, is my riposte to those who say we can live perfectly healthy lives without eating any carbohydrates and just use ketones from the breakdown of fat as fuel. Without at least some carbohydrates in our diet, we would die.

WHY DOES HIGH PROTEIN EQUAL WEIGHT LOSS?

But there is good evidence that a moderate increase in the protein to carb ratio works for weight loss. The question is why and how does it work? Most of the diets above, in their various website blurbs and glossy brochures, focus on the ‘low-carb’ element of the diet. There is an almost universal (although admittedly to varying degrees) condemnation of carbohydrates and agreement about their toxic nature; it is after all the food element that is being removed, or at least reduced drastically. How about the macronutrient that is being added, though? What about the role of increasing the amount of protein being eaten?

The first thing to consider is caloric availability, which we discussed in detail in Chapter 3. Protein takes the most energy and the longest time to digest, with nearly 30 per cent of the total calories eaten in protein required to digest it. Contrast this to carbohydrates, which only take 5–10 per cent of consumed calories to digest, depending on whether we’re talking about complex starches, which would cost more, or simple sugars which would cost less. So calorie for calorie, it takes more energy to digest protein than carbohydrates.
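To make the ‘calorie for calorie’ comparison concrete, here is a minimal sketch using the rough digestion costs quoted above; the notion of a ‘net’ calorie and the function name are my own, for illustration only.

```python
# Net calories left over after the estimated cost of digestion - a rough illustration
# using the chapter's ballpark percentages, not precise physiological constants.

def net_calories(calories_eaten: float, digestion_cost_fraction: float) -> float:
    """Calories remaining once the cost of digesting the food is subtracted."""
    return calories_eaten * (1 - digestion_cost_fraction)

print(net_calories(100, 0.30))   # 100 kcal of protein        -> ~70 kcal net
print(net_calories(100, 0.10))   # 100 kcal of complex starch -> ~90 kcal net
print(net_calories(100, 0.05))   # 100 kcal of simple sugars  -> ~95 kcal net
```

In other words, of two meals with identical calorie counts on the label, the higher-protein one leaves somewhat fewer calories available for the body to store.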

But protein also makes you feel fuller than carbs or fat. Why? The answer to this question lies in our gut hormones and the biology of how they control our feeding behaviour.

GUT HORMONES

In order to effectively modulate our feeding behaviour, including how full or how hungry you might feel, your brain needs to know the answers to two critical questions. First, what is the state of your long-term energy stores? This is, broadly speaking, the amount of fat that you currently have on board, which determines how long you would last without food. Second, what is the state of your short-term energy stores? This would include what you are currently eating, what you have just eaten and how much is being consumed. As food travels from our mouths, down our oesophagus and through our gastrointestinal tract, hormones are released all along the way that reflect not only how many calories are in the meal, but also how much protein, fat and carbohydrate is present. The vast majority of hormones that are released by the stomach and gut, such as CCK, GLP-1 and PYY, make us feel full.29 CCK or cholecystokinin, for example, is released from the stomach and is very short acting, with its ‘feel full’ effect lasting less than 30 minutes; thus it is likely to play a role in signalling when you should stop eating. In contrast, PYY or peptide YY, which is secreted by the small and large intestine, has effects that last many hours, bridging the gap between meals, and therefore influencing how much you are likely to eat from one meal to the next. The one exception to the ‘gut hormones make you eat less’ rule is ghrelin, which is produced by the stomach, and makes us feel hungry.30 In fact, ghrelin levels in the blood rise acutely just before a meal, triggering hunger and playing a role in when we choose to eat.31 Your brain doesn’t sense and respond to each of these hormones in isolation. Rather, it is a mix of these short- and long-acting signals (of which there are many more than just the four I’ve discussed here), as well as when and where they are released, that signals to the brain how much and what is being eaten.32

In recent years, our understanding of how these gut hormones modulate our feeding behaviour has been transformed by lessons learnt from gut bypass or bariatric surgery.33

WHY PROTEIN MAKES YOU FEEL FULL – LESSONS LEARNT FROM RE-PLUMBING THE GUT

Gut bypass or bariatric surgery is used as a weight-loss treatment for severe obesity. It is, in fact, the only current anti-obesity therapy that reliably keeps weight off in the long term. The observation of how obese people responded to gut-bypass surgery not only gave us a further appreciation of how the whole system was working, but also provided a plausible mechanism for how and why high-protein diets work in weight loss. There are a number of different types of bariatric surgery, with the most commonly used being the Roux-en-Y gastric bypass or RYGB. By stapling off the upper section of the stomach, this surgery first reduces the size of the upper stomach to a small pouch of about 30 millilitres in volume; then the second stage sees this pouch being attached directly to a part of the small intestine lower down, bypassing about one metre of the upper small intestine, or duodenum. The surgery has, in effect, re-plumbed the gut. By making the stomach drastically smaller and bypassing a length of small intestine, RYGB was designed to limit the amount of food able to be consumed and to reduce the amount of fat and calories absorbed from the food that has been eaten. However, while RYGB was tremendously effective in reducing weight, HOW it worked turned out to be a big surprise.

Two key observations of patients pre- and post-RYGB have transformed our understanding of how the gut works. First, many of the obese people who underwent bariatric surgery also commonly suffered from type 2 diabetes. The doctors observed that after RYGB, the patients’ diabetes improved dramatically; in many cases, patients could actually reduce or even stop their anti-diabetic medication. It was the speed at which this occurred that really surprised doctors, with improvements in diabetes happening less than twenty-four hours after the surgery, clearly before any notable weight loss could have occurred!34 Second, after the surgery, many patients reported a change in eating behaviour, once again beginning before any notable weight loss. Not only did they feel full faster, but their taste in food also changed.35 Many found fatty foods less appealing, for instance.

So what was going on? Was the smaller stomach playing a major role? No, as it turns out. Because the stomach is very flexible, it begins to stretch quite rapidly, and before long, the same volume of food can fit into the stomach again. The major effects came from the bypass element of the surgery. Under normal circumstances, food leaves the stomach and enters the small intestine, where the bulk of digestion occurs. The lower down the intestine the food goes, the more digested the food gets, and all the while the intestine is secreting hormones to keep the brain informed of progress. The effect of the bypass is to introduce food that is in a different state, that has been slightly less digested, lower down the small intestine. This part of the gut that has been exposed to less digested food then responds by secreting a different mix of hormones. It is this different repertoire of hormones that resolves the diabetes, makes the patient feel more full and eat less.36

What does this have to do with how high-protein diets work? In a post-bariatric surgery gut, the bypass transfers less digested food further down the gut, which makes you feel fuller. In a higher-protein diet, meanwhile, because proteins are less calorically available and take longer to digest than fat and carbs, they end up travelling further down the gut, which, due to hormonal release, then makes you feel fuller. What happens when you feel full? You eat less. And what happens when you eat less? Well, it is the most effective way to lose weight.

LEVERAGING PROTEIN

The reason there is even a need for bariatric surgery and all of these diets is, of course, because of rapidly increasing rates of obesity and all of its associated diet-related illnesses. A major driver of this modern obesity crisis is our food environment, including, as I’ve discussed earlier in this chapter, the increased availability of industrially processed, calorically dense, yet nutrient-poor food. We are driven to eat these items because they are often rich in fat, sugar and/or salt, and are therefore highly palatable.

A different hypothesis has emerged in recent years that provides another possible reason why we might find these foods so palatable, and it revolves around the fact that these foods also tend to have a lower protein content. In 2005, writing in the journal Obesity Reviews, Australian scientists Stephen J. Simpson and David Raubenheimer from the University of Sydney proposed ‘The Protein Leverage Hypothesis’ as a possible driver of the obesity epidemic.37 They argued that most of the emphasis on the dietary causes of obesity has been on our changing patterns of fat and carbohydrate consumption, whereas the role of protein has been relatively ignored: it comprises only around 15 per cent of the western diet, and all available evidence points to protein intake remaining constant across multiple different populations, even in the face of rising obesity levels. The assumption, in other words, was that because protein intake hadn’t changed, it couldn’t be playing a role in the problem.

What if, however, a major driving force behind what and how much we eat is the need to maintain a certain level of protein in our diet? In other words, even if protein only forms 15 per cent of our diet, what happens if we NEED it to be 15 per cent, because it is a critical level that must be met in order to survive?
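To make the leverage arithmetic concrete, here is a minimal sketch in Python using my own illustrative numbers, not figures from Simpson and Raubenheimer’s paper: if the absolute protein target is fixed, the total energy you must consume to reach it scales inversely with the protein density of the food.

```python
# A toy illustration of protein leverage, not a model from the studies cited.
# Assumption: the eater keeps eating until a fixed protein target is met.

PROTEIN_TARGET_KCAL = 300  # e.g. 15 per cent of a 2,000 kcal reference diet

def total_energy_needed(protein_fraction: float) -> float:
    """Calories required to hit the protein target when protein supplies
    `protein_fraction` of every calorie eaten."""
    return PROTEIN_TARGET_KCAL / protein_fraction

for fraction in (0.15, 0.125, 0.10):
    print(f"{fraction:.1%} protein diet -> "
          f"{total_energy_needed(fraction):,.0f} kcal to meet the target")

# 15.0% protein diet -> 2,000 kcal to meet the target
# 12.5% protein diet -> 2,400 kcal to meet the target
# 10.0% protein diet -> 3,000 kcal to meet the target
```

On these toy numbers, diluting dietary protein from 15 per cent to 10 per cent would push total intake up by half; that, in essence, is the ‘leverage’ the hypothesis describes.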

Simpson and Raubenheimer conducted an initial small and short-term experiment to test their hypothesis.38 They put ten volunteers into the same living space for six days. For the first two days, everyone was free to select all of their meals from a buffet comprising a wide range of foods with known macronutrient content. Everything they ate was weighed and the amount of protein, carbs and fat calculated. For the next two days (days three and four), one group of subjects (treatment group 1) was restricted to foods that were high in protein and low in carbs, while the remaining volunteers (treatment group 2) were provided with only low-protein, high-carb items. Then on days five and six, everyone reverted to the free-choice buffet, as on days one and two. The results showed that when the subjects were restricted to a diet that contained either a higher (treatment group 1) or lower (treatment group 2) protein-to-carb ratio than they had self-selected during the first two days, they maintained their intake of protein at the expense of regulating their carbohydrate intake. In other words, treatment group 1 (high protein, low carb) ate fewer carbs rather than overeat protein, while treatment group 2 (low protein, high carb) overate carbs in order to get enough protein. Because the volunteers were subconsciously trying to achieve a specific level of protein, rather than calories per se, in their diet, the protein ‘leveraged’ the amount of carbs they ended up eating: they ate less of a high-protein, low-carb diet and more of a low-protein, high-carb diet. While this initial 2005 study was small and conducted over a short term, further evidence from human epidemiological studies, as well as from studies in flies and mice, has emerged to support the hypothesis.

There are obvious and worrying implications of this ‘protein leverage effect’. In the western world, obesity risk is inversely related to socio-economic class; to put it bluntly, the poorer you are, the more likely you are to be obese or overweight. The reasons for this are varied, complex and, frankly, not entirely known. One explanation is that fast-food outlets and takeaways, which tend to sell cheap and highly processed food, are found at their highest densities in less affluent areas. Conversely, high-protein foods such as fish and lean meat are relatively more expensive for those on lower incomes than for the more affluent. Many of you will, as I was when I found out, be surprised, shocked even, to learn that ‘food insecurity’ is a serious problem here in the UK. In 2014, there were an estimated 8.4 million people in the UK, the world’s sixth-largest economy, living in households that reported having insufficient food. That is the equivalent of the entire population of London or New York: just think about that for a minute. I am acutely aware that I am fortunate enough not to be ‘food insecure’, so I am not going to judge how someone on a limited budget chooses to feed his or her family. Nor am I implying causation; but the easy availability of cheap, highly processed food that tends to be lower in protein is clearly part of a vicious cycle driving the obesity epidemic, and one that exacerbates social inequality.

HOW SHOULD WE EAT?

But just so we are clear, it is not simply about protein, but about ‘putting the balance back in diet’.39 In 2015, Simpson and Raubenheimer, together with David Le Couteur, authored an essay in the journal Cell, in which they argue that ‘the notion of dietary balance is fundamental to health yet is not captured by focusing on the intake of energy or single nutrients’.40 This is a crucial point, given that we don’t eat individual nutrients; rather, we eat food.

Let us take the fruit fly as an example. Fruit flies eat primarily protein and carbs, and hardly any fat at all, so they represent a simple model for testing the physiological effects of altering the protein-to-carb ratio of the diet, and also for seeing what flies actually prefer to eat. If flies are made to eat specific diets, then the lower the protein-to-carb ratio (low protein, high carb), the longer the fly’s lifespan; a diet that maximised reproductive success, but compromised on lifespan, had a far higher protein-to-carb ratio. What was really interesting was that, when allowed to choose what they preferred to eat, flies mixed a diet maximising reproductive output rather than lifespan. This makes an awful lot of sense, because the goal of all living beings, after all, is not to live as long as possible at all costs, but to live long enough to be reproductively successful, to be able to pass on their genes. Similar studies were performed in mice, but using the ratio of protein to carbs plus fat, and the same pattern emerged: the diet composition that best supported longevity was not the same as that which sustained maximal reproductive output. The bottom line is that, when given a choice, mice and flies choose diets that balance longevity against reproductive capability, and they come down on the side of reproduction.41 Put another way, mice and flies prefer to live fast and die young!

Clearly humans make dietary decisions for different reasons from mice and flies. However, in at least one respect, we are not so different: we strive for balance. Fifteen per cent protein in our diet is what we aim for, but the lower protein content of the industrially processed elements of our modern diet leverages us into eating too much, driving obesity and shortening our lifespan. At the other end of the spectrum, multiple studies, including data from the Nurses’ Health Study, the Health Professionals’ Follow-up Study, the Swedish Women’s Health and Lifestyle cohort, and the Greek cohort of the European Prospective Investigation into Cancer and Nutrition, have consistently indicated that, over the long term, low-carbohydrate, high-protein diets increase the risk of diabetes, heart disease and mortality.42

So how should we eat? Well, with diets, as with most things in life, the most boring of messages continues to ring true: moderation is key.