We are very fortunate to be living in a day and age in which mass communication has made it possible to share the scientific breakthroughs of our species, giving people a better grasp of the underlying workings of our universe and ourselves. We are slightly disadvantaged, however, by our limited attention span, which seems to want nothing more than to take the whole sum of human knowledge and distil it into an internet listicle of things that cause/cure cancer1 or, more importantly, will make you fat/thin. This last obsession, with slimming to an ‘ideal’ weight, is an extraordinary situation for our species to be in, considering the first couple of hundred thousand years of our existence (and our ancestors’ before that, ad infinitum) were mostly spent trying very, very hard not to starve to death. It’s only in the last century or two that obesity has become a mainstream option; previously restricted to those elites around the globe who could command the greatest resources (and food), obesity simply never really registered as a problem in the past. The case is rather different today, as the availability of cheap calories expands rapidly and our waistlines follow. The obesity epidemic is blamed on a hundred different things, depending on some arcane formula known only to news editors, and each new scientific discovery is heralded in the media as definitive proof that some particular item (a food, the internet, anything enjoyable) is solely responsible for the shape of the dent in your office chair.
Given the flurry of interest stirred up whenever a putative cure or cause for one of modern humanity’s most costly epidemics pops up in the news, it’s not surprising that solutions to our obesity crisis with a certain ‘back to basics’ theme have flourished. Humans are masters of analogy and are very fond of linking small-scale, easily comprehensible narratives to more complex, abstract phenomena, and never mind the details. One of the most overused of these logical shortcuts is the one linking ‘x was better in the past’ to ‘we must do everything as we did in the past’.2 Obesity is a very real problem, and the desire to do something about it, especially on a personal level, is easily understandable. It’s not terribly surprising that, over the last few years, there has been increasing clamour about ‘reverting’ to a diet from a time when we were not all one bacon-flavoured cupcake away from type 2 diabetes. Cue the rise of the Paleo Diet™ and its many side branches, imitators and modifiers. These generally advocate a return to a putative ‘pristine’ hunter-gatherer state of health through the adoption of an allegedly authentic hunter-gatherer diet.3
At the time of writing I am utilising our capacity for mass communication to look online at the wrapper for something called the PaleoDiet Bar. It promises me optimal nutrition for a hunter-gatherer lifestyle. It is gluten-free, grain-free, soy-free, dairy-free, preservative-free, and fibre- and protein-rich. The presence of joy is not mentioned, but I think one could sensibly infer its absence from the litany above. I cannot actually get hold of one of these bars, as I’m writing this chapter from the middle of a recently resurgent guerrilla war in the southeast of Turkey, and on top of the roadblocks, the shootings and the heat, there don’t seem to be any nutritional supplement/health food emporia this close to the border. However, the question remains: why does this product exist? Why is there a pre-packaged, plastic-wrapped, mass-produced snack endorsed by ‘the world’s leading expert and founder of the Paleolithic movement’4?
Aside from the incredible unlikelihood of any human alive today having ‘founded’ the stone age – at last check, that would have been around 3 million years ago – it’s a bit unfair to pick on this one particular instance of what I would like to start referring to as Neolithic denial. The ‘palaeo’ movement is indeed a popular one, and its starting point is that the development of agriculture has been one long, devastating mistake for our species. The key argument is that eating cultivated foods (ones that require human interaction, at least to some extent, to grow to desired numbers or amounts) is something we are not evolutionarily designed for. The cultivars we came up with – wheat, rice, legumes – are, the argument goes, something our bodies cannot properly digest, and many of our modern health problems are down to the toxic nature of our heavily gluten-based diet. Awareness of gluten sensitivity or gluten intolerance has skyrocketed in Western, affluent contexts in recent years, fuelling understandable interest. So the only way to solve our inexplicable constant health crises5 is to revert to a diet that our evolutionary history has prepared us for. According to palaeo-diet experts, this includes meat, fish, vegetables, fruit and not much else. But is it true? If so, what did hunter-gatherers really eat? And is the invention of farming really making us all sick?
The study of human health and human diets is a great preoccupation of bioarchaeologists, myself included. What we eat, and how much of it, has a big impact on our bodies – from growth and development to death and disease. In the long timeline of human history, there are only a few points at which we can really identify a major change in the way we ate. The shift with the biggest effect on how and where we have lived our lives for the last 10 millennia has been the development of farming: the domestication of plants and the adoption of all the very labour-intensive practices that go into making a living from the soil. For hundreds of thousands of years before that, humans subsisted on more or less ‘found’ food. Of course, some of this finding requires quite specialised skills and knowledge, and the archaeological record shows that humans ate a varied diet in the past, depending on location, environmental conditions and factors such as prey availability. It’s simply not possible to declare that there is one true pre-farming diet. One of our species’ greatest tools for survival is our adaptability; we are omnivorous, and clever with it.
Take, for instance, the several groups who live by hunting and gathering in and around the Kalahari Desert in Southern Africa.6 These groups are famous for their traditional method of persistence hunting, in which hunters track ruminants at a steady pace for hours at a stretch until the animals simply give up, too exhausted to keep running; this long-range hunting technique has been explained as the evolutionary outcome of our unique combination of sweating and bipedalism. This is the model of human physical achievement that many people (and all stock photo editors) have in mind when picturing the ‘hunter-gatherer’ lifestyle. But gazelles are not the only food; when opportunity arises for these hunters, just about any animal will do, up to and including the slower but much pricklier rodent-of-unusual-size found in the region, the porcupine. Modern ethnographic research with the Hadza of Tanzania, in East Africa, has shown that particular groups of hunter-gatherers actually get about 70 per cent of their total calories from non-meat sources. Anthropologists Frank Marlowe, Colette Berbesque and colleagues have made a detailed study of the food in Hadza camps, noting, weighing and watching what’s for dinner. They found that meat accounts for only about 32 per cent of total calorie intake, with the rest made up of a combination of baobab (14 per cent), tubers (19 per cent) and berries (20 per cent). Surprisingly, the final 15 per cent of calories consumed by the Hadza comes from honey, which requires considerable effort, danger and discomfort to acquire; nonetheless, it is a substantial (and much favoured) part of their diet.
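For anyone who wants the arithmetic behind that ‘about 70 per cent’ figure made explicit, here is a minimal sketch in Python using only the percentages quoted above; the category names and the rounding are mine, purely for illustration.

```python
# Hadza calorie shares as quoted in the text (per cent of total intake).
calorie_shares = {
    "meat": 32,
    "baobab": 14,
    "tubers": 19,
    "berries": 20,
    "honey": 15,
}

total = sum(calorie_shares.values())       # the quoted categories sum to 100
non_meat = total - calorie_shares["meat"]  # 68 per cent, i.e. 'about 70 per cent'

print(f"total accounted for: {total}%  non-meat: {non_meat}%")
```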
Archaeological finds indicate that our ancestors ate a huge variety of foods.7 There are bones with cut marks, boiled bones, burnt bones, charred seeds and even tiny bits of plants stuck in the plaque on ancient teeth that help us reconstruct our ancient diet. It’s only relatively recently, though, that archaeological recovery has begun to focus on the microscopic evidence of past diets. This might go some way towards explaining why the emphasis in the early days of archaeology seems to have been solely on animal consumption: the evidence of meat eating is largely identifiable by the naked eye, and the bones themselves are big enough to be collected by archaeologists. The meat-eating caveman trope that ‘palaeo-diet’ proponents hark back to is, however, rather outdated. In the 1960s, archaeobotany (the study of plants that humans ate and used in the past) really came into its own. Though it might seem unlikely, some organic material does survive down through the millennia. Plant remains can be carbonised into charcoal, seed pods can dry out and lie scattered through archaeological soils, and individual grains of starch and pollen can be retrieved from very carefully recovered bits of dirt and even dental plaque. These microscopic finds come from an excavation technique involving a flotation tank – water lets the heavier (non-organic) material sink away from the lighter (organic) bits – and from even more recent techniques such as the analysis of dental calculus (plaque). The results have dramatically affected what we know about the human past, as archaeologists themselves have readily admitted:
The reader will note that our preliminary report on the 1961 season states confidently that ‘plant remains were scarce at Ali Kosh’. Nothing could be farther from the truth. The mound is filled with seeds from top to bottom; all that was ‘scarce’ in 1961 was our ability to find them, and when we added the ‘flotation’ technique in 1963 we recovered a stratified series of samples totalling 40,000 seeds.8
Prior to the development of these advanced techniques for identifying the micro-traces of plant use in the past, archaeologists reconstructed our relationship with plants through archaeological finds such as storage jars and bins, and finds of tools used for harvesting or threshing.
In the first part of the twentieth century, V. Gordon Childe began to pull together the scattered pieces of evidence for a new type of plant use from archaeological digs all across the Near East and Europe. He theorised, building on previous work and his own encyclopaedic study of artefacts from the ancient Near East and Europe, that around 10,000 years ago there had been a major revolution in human societies. Hunting and gathering had given way to a new Stone Age invention: farming. Childe was a lifelong Marxist, an Australian émigré to England, director of the prestigious Institute of Archaeology at University College London, and perhaps the most influential figure in archaeology in developing grand theories on the hows and whys of the human past. His political views strongly influenced his conception of this major shift in human activity in revolutionary terms. This book in fact owes him a considerable debt, as two of his concepts are reiterated here:9 the idea of the Neolithic Revolution, and the idea of the Urban Revolution. We will revisit the Urban Revolution in later chapters, but it’s in the Neolithic Revolution that we find the very first examples of a way of living – sedentary, dependent on agriculture – that, some would argue, the evidence suggests we would have been better off without.
The Neolithic Revolution, as described by Childe, is not a revolution in the dramatic, anthem-singing mode of eighteenth-century France or twentieth-century Russia. His revolution is the progressive outcome of friction between different aspects of human life, rubbing up against each other until they wear each other into entirely new shapes.10 For Childe, there is one clear hallmark of revolution: demography. He sees a ‘bend’ in the line on the population chart as a sure sign that society has changed – that the societal gears have slipped and fallen into new positions. He is not the only one. Subsequent researchers have begun to piece together the skeletal evidence for population booms and busts, not just in Childe’s area of interest (the ancient Near East, especially Mesopotamia), but in many different locations where settled life and cultivation have developed. This includes the very important work of Jean-Pierre Bocquet-Appel, who identified a Neolithic Demographic Transition alongside the development of agriculture: population booms accompanied by corresponding rises in illness and mortality. These two basic concepts, of a revolution and the evidence left behind in bones, are at the heart of this chapter. The question we must answer is whether the major changes of the Neolithic carry a similar body count to the more guillotine-happy upheavals – whether the development of settled life and control of crops and animals are the critical first step on a slippery slope towards the second of Childe’s revolutions, urban living, and thence on to a rising tide of disease and death. But first we need to take a look at what the Neolithic Revolution really meant.
For Childe, the Neolithic Revolution was the point at which humans became masters of their own food supply. But this is an insufficient description of the changes in human lives first occurring around 12,000 years ago in the ancient Near East. After all, humans have always presumably been in control of their food supply to some extent, or we simply wouldn’t be here at all. Many societies that do not extensively farm still maintain small gardens or localised patches of tended wild plants: yams, tubers, rice, and maize, among others. Plant-based foods are a critical part of the human diet, and as with adherents to exclusively animal-protein-based diets today, people in the past would have been at risk of protein poisoning, or what has been called ‘rabbit starvation’: without a sufficient mix of nutrients, a diet consisting of meat (particularly lean meat) alone can cause debilitating symptoms in as little as three days, and death within weeks.
Plant-based foods vary among the many ecological niches humans have occupied, but several lines of evidence – many of them owed to advances in microscopic techniques – show the long antiquity of plant eating in the Homo lineage. Many plants create tiny silicate structures called phytoliths; these come in different shapes and sizes, and because of their mineral durability they can be traced in archaeological remains. Phytoliths from the Neanderthal occupation of Amud Cave, at the margin of the Jordan Rift Valley, show that the edible seed heads of grassy plants were part of the human story as early as 50,000 to 70,000 years ago. Fragments of plant starch found calcified in Neanderthal dental plaque have shown that not only were they eating grassy plants, they were eating cooked starchy plants too. These techniques have given us a wealth of archaeobotanical evidence for the exploitation of plant foods by modern humans as well. But at some point, our casual relationship with wild types of grasses and tubers morphed into a much more complicated, interdependent affair.
Wild types of the major staples of agriculture have been identified in several regions of the world, with different types of edible plants predominating in different ecozones. In the Americas, the wild ancestor of maize (corn) has been identified as a particularly unwelcoming-looking spindly grass called teosinte, native to the Balsas River valley of Mexico. Teosinte seed heads have just a handful of armour-plated seeds, nowhere near as enticing as the fat kernels found on modern maize,11 but archaeologists have found traces of them on grinding stones dating back nearly 9,000 years. Rice, the staple crop of half of the world’s population, has gone through similar changes. DNA analysis by Bin Han and colleagues of different types of domesticated rice (long- and short-grain) suggests that the wild ancestor of both varieties comes from the Pearl River valley in China, and archaeological evidence of rice is present in sites along the Yangtze River from around 9,000 years ago. However, like maize, the rice found in early archaeological contexts is not quite the same as the rice we know today: the seeds are of variable sizes, and earlier rice had seeds that would shatter easily – a good thing for a self-propagating plant, but less helpful for human consumers. Dorian Fuller, who led research at one of these early sites in the Yangtze River valley, has suggested that rice domestication was a slow process that occupied thousands of years, during which time the population of the area largely depended on other sources of food.12 Cereals like wheat and barley diverged from their wild ancestors in the mixed oak and pistachio ecozones of the ancient Near East through similar changes: seeds that don’t shatter, making them easier to harvest; shifts in seasonality, distribution and the way the plants propagate, making them easier to cultivate; and changes in seed size.
In fact, almost everything we eat today has been genetically modified by human intervention. Potatoes, yams and other tubers have been domesticated independently in several areas, but there are clear distinctions between the petite purple Andean wild potato and the mealy behemoth to be found under baked beans and cheese in most English cafes. Carrots, as many people will be aware, started off purple; watermelons began pink and about the size of a grapefruit; grapefruits, limes, lemons and oranges can be just about any colour between yellow and green but are in fact man-made hybrids of a handful of green wild citrus types; and almost every green we eat – from broccoli to kale – is a type of mustard plant. There are incredibly few commonly consumed plant foods that haven’t been bred into submission to human tastes. We have a long history of genetically modifying foods, a fact that is generally conveniently left out of the debate on the ethics of using genetic engineering (tampering at DNA level) to modify crops. Or, if you prefer to see it from the plants’ point of view, humans have been very successfully domesticated as part of the dispersal and reproduction strategies of several different plant taxa. Exactly how long we’ve been messing with our food, however, is a subject that’s still yielding new evidence – making us rethink what we know about the connection between the major changes in how people lived during the Neolithic Revolution and the development of farming.
The best-studied examples of the development of agriculture come from Southwest Asia (the ancient Near East), East Asia and the Americas. It was the ‘Fertile Crescent’ of Southwest Asia that Childe identified as the source of the Neolithic Revolution, and wheat and barley domestication does seem to appear several thousand years earlier in the region than that of rice in Asia or maize in the Americas. It’s also the region I know best, having spent some time working on the fringes of the Southwest Asian agricultural phenomenon, on Neolithic sites on the Central Anatolian plain. Central Anatolia today is dominated by agriculture; fields of grain roll endlessly across the high plateau, disappearing off into the horizon in a wash of golden-yellow stalks broken only by the occasional irrigation channel. The wild ancestors of the modern suite of crops used in Europe and Western Asia can be found from Anatolia to the Southern Levant: einkorn, emmer wheat and barley, alongside other staple plant foods such as peas, chickpeas and lentils. That is, they can mostly be identified in these regions; the combination of evidence from modern crop DNA sequencing and the finds from archaeological sites paints a very muddy picture of the domestication process. Archaeobotanists are still searching for the wild ancestors of several modern food crops, like the broad bean, and results from DNA sequencing could be explained by a host of scenarios where wild and domestic versions of the same crop from different regions were repeatedly mingled over a period of thousands of years.
In 2015 a team of researchers working at the 23,000-year-old Levantine site of Ohalo II, on the shores of the Sea of Galilee, published new findings on the plant remains recovered from their site. These researchers didn’t just look for the wild ancestors of our modern domesticates on site; they also identified the weeds that go along with all of our tastier plants. By looking at the grouping of weeds and at infinitesimal scars on the seed heads, the researchers concluded that the residents had engaged in a sort of ‘trial’ farming, more than 11 millennia before Childe (and more than a few others) saw evidence of agriculture. The mix of weeds and edible grasses, the presence of sickles and the slightly domesticated shape of some of the seeds all suggest that the camp at Ohalo was an early experiment in crop cultivation – but one that ultimately ended in failure. Like the Natufian experiment in settling down into one place, it seems that many of our initial attempts at revolution sort of fizzled out after a few thousand years or so. The ‘Neolithic Revolution’ seems to be more of a ‘Terminal Pleistocene Experiment’, with a very gradually built foundation of new technology and lifestyles occasionally just razed to the ground while everyone goes back to hunting and gathering for a few thousand years.
In the 1970s, while researching the early agricultural group known as the Mound Builders,13 physical anthropologists stumbled on a problem. Agriculture, most archaeologists reckoned, was a critical step on the progressive path towards civilisation. The sheer number of sites that pop up all over the world with evidence of agriculture clearly indicates that the human population boomed with the advent of farming. For most, there was a clear trajectory from the ‘revolution’ of the Neolithic to the development of cities and all the technological advances our increasing numbers, in such condensed conditions, could come up with. With a rather uncritical concept of progress as ‘a good thing’ in much of archaeology, physical anthropologists suspected that they would find among the Dickson Mounds burials in Illinois, which included remains from both before and after the development of agriculture in the region, evidence of the benefits of this progress. What they actually found was a different story: the population grew, but at a price. Life expectancy was shorter in the farmers at Dickson Mounds, and life in general more risky. Childhood health and adult survival seemed to nosedive; how could this possibly be explained as progress? If the result of the Agricultural Revolution was a slide in living standards, why did it happen again and again, all over the world? How good was our new agriculture-based lifestyle, if we didn’t get around to adopting it until 12,000 years ago? And how modified was it, really, from what humans did before? Here we come to the actual evidence for the effects of the Neolithic Revolution, taken from bones and teeth.
Researchers have been tracking human progress in the transition to agriculture through bioarchaeological markers for some time. A very early insight into the physical effects of agriculture came from the ‘eloquent’ bones of Abu Hureyra, an archaeological site in Syria that showed signs of human occupation from the hunting and gathering Natufian period through to early agricultural experimentation. Theya Molleson, a pioneering physical anthropologist from the Natural History Museum in London, researched the remains of those early agriculturalists. Molleson described a suite of changes to the bodies of the Abu Hureyrans: a build-up of bone in locations where muscles might take the strain from repetitive heavy loads. For instance, she identified changes in the neck vertebrae where the weight from a heavy load carried on the head would fall; collapsed vertebrae right at the arch of the spine; and, of all things, an uncommon number of cases of arthritis of the big toe. Alongside these peculiar pathologies, she found considerable evidence of muscle use on the arms and legs; the parts of the bone where the big muscles attach were very built-up. Heavy muscle use encourages the bits of bone where the muscles anchor themselves onto the skeleton to expand their surface area in order to attach more tissue, so on a skeleton this shows up as extra bone formation at the locations where muscles insert. Rightly rejecting an initial theory of Neolithic ballerinas, Molleson identified a pattern of wear and tear very specific to the act of grinding grain on a stone quern. Holding this kneeling ‘plank’ position for hours of grinding had devastating effects on bodies, particularly those of the women of Abu Hureyra. While later studies have questioned the extent to which any habitual activity really alters the structure of the skeleton (something we’ll discuss in detail in Chapter 13), it is clear that Molleson had identified a characteristic suite of actions that had real (and lasting) consequences for Neolithic people.
There is strong evidence for a downside to the Neolithic lifestyle. Early anthropologists noted that many population groups in the past had remarkably straight, healthy teeth. There were far fewer criss-crossed incisors, straggling canines or sideways-sprouting wisdom teeth.14 Searching for an explanation, many researchers identified changes in the way we use our teeth as the cause of our recent dental distress. One of the things researchers have noted is that the very hard enamel surface of teeth carries a tiny legacy of microscopic scratches and pits from the many things we chew. Peter Ungar pioneered new methods of studying dental microwear using high-resolution images of teeth blown up to a size where these scratches can be counted and described, and has suggested that by distinguishing between long thin lines, deep scratches and pits, different types of diets15 can be identified. A switch between a diet of hard nuts and seeds (leaving lots of microscopic pits) and a diet more focused on softer fibrous plants like tubers (leaving more scratches) should change the patterns seen in the enamel under the microscope. This is in fact what we find: there are reports, for instance, of an increase in pitting attributed to an uptick in the amount of hard nuts and seeds consumed during the long Neolithic transition of the Late Archaic/Woodland people of North America. Patrick Mahoney, looking back at the ancient Near East, sees the same phenomenon, but adds a few notes of caution. Different foods require different amounts of chewing, and chewing soft, grassy seed heads shouldn’t have changed the microwear much as the Natufian people shifted to a Neolithic diet; what did change was the pitting on their teeth, which can be attributed to the archaic trend (recently revived) for stone-ground grains. The same stone-grinding techniques that were building up the muscles of the women of Abu Hureyra were leaving microscopic traces somewhere else: the near-invisible bits of stone grit you get from bashing rocks against each other went straight into the food, which went straight onto the teeth.16
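As a rough illustration of the logic of microwear analysis described above (and emphatically not Ungar’s or Mahoney’s actual procedure), here is a toy Python sketch that classifies a patch of enamel by the balance of pits to scratches; the counts and the 30 per cent threshold are invented for the example.

```python
def diet_signal(pits: int, scratches: int, pit_threshold: float = 0.30) -> str:
    """Crudely summarise what a pit/scratch count might suggest about diet."""
    total = pits + scratches
    if total == 0:
        return "no readable microwear"
    pit_fraction = pits / total
    if pit_fraction >= pit_threshold:
        # lots of pits: harder or grittier foods (nuts, seeds, stone-ground grain)
        return f"pit-heavy ({pit_fraction:.0%} pits): hard or gritty foods suspected"
    # mostly scratches: softer, fibrous foods (tubers and other plants)
    return f"scratch-heavy ({pit_fraction:.0%} pits): softer, fibrous foods suspected"

# Hypothetical counts from two imaginary teeth:
print(diet_signal(pits=45, scratches=55))
print(diet_signal(pits=8, scratches=92))
```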
Many foraged foods – for example, the baobab mentioned above – are fairly tough, fibrous options and require a considerable amount of chewing in order to get any nutrition out of them. The soft and mushy cooked carbs that came to dominate our diet with the advent of agriculture, it was theorised, were rather less taxing on our jaws, even if the stone we ground them with did leave craters in our teeth. This created a sort of Pot Noodle effect: soft, slurpable foods mean less work for our jaws, which means that less muscle needs to attach to them, which means that they don’t need to be as big to support the muscle; and if the jaw doesn’t need to be big, then perhaps the face doesn’t either. And shrinking faces are exactly what we get, according to observations from different studies by prominent physical anthropologists such as Simon Hillson, Clark Spencer Larsen and C. Loring Brace. Shrinking faces would be neither here nor there in terms of the positive or negative effects of agriculture, but the problem is that our faces have our teeth in them. Teeth are under quite strong genetic control, responding far less plastically than the rest of the skeleton to changes in use and environment. If the teeth stay the same size in smaller jaws, or even just shrink more slowly, you’re going to get overcrowding, and the malocclusion (teeth pointing every which way) observed in many remains from the cusp of the Neolithic. While there’s probably a strong component of genetic luck to how well our teeth fit in our jaws, those of us who have had to have our painful wisdom teeth yanked out might want to send the dental bills to those early farmers who kick-started our easy-chewing diet.
It’s not just the size of our jaws and teeth that has changed in the last 12,000 years. Our overall dental health has taken a pretty severe beating. Caries (or cavities) are the holes that enterprising bacteria excavate into our teeth given the right environment. These holes can expose the nerve endings at the heart of our teeth, leading to sensitivity (to heat, to cold, to contact) and occasionally rather excruciating pain. While our teeth do have a built-in defence mechanism, building up bulwarks to try to protect the nerve when the hard outer enamel is eaten away by the lactic acid emitted by well-fed bacteria, caries can work faster and lead to considerable destruction, even the loss of the whole tooth. The determining factors in this whole painful process are the combination of oral bacteria (something you are likely to more or less inherit17) and the foods you feed them every time you put something in your mouth. Caries bacteria are acid-tolerant in a way that most of their neighbours are not: they ferment sugars into lactic acid, dropping the pH of the mouth, and carry on quite happily in the acidic conditions that dissolve enamel. When you eat, the pH of your mouth changes according to the type of food you need to digest. Eating starchy, carby, sugary foods is the best way to encourage caries bacteria – the sugars in these foods keep the mouth’s pH depressed for longer, allowing more opportunity for caries to develop. So when we see teeth from the past riddled with holes, there is good reason to suspect that a starchy, sugary diet might be at work – and when it comes to the development of agriculture, we start to see an epidemic of rotting teeth.
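A back-of-the-envelope sketch of why frequency matters here: each exposure to fermentable carbohydrate holds the mouth below the pH at which enamel starts to dissolve (roughly pH 5.5) until saliva buffers it back up. The 30-minute recovery window in the code below is an illustrative assumption, not a measurement.

```python
# Assumed time (minutes) that each starchy/sugary exposure keeps the mouth
# below the critical pH of about 5.5, before saliva buffers it back up.
ACID_MINUTES_PER_EXPOSURE = 30

def daily_acid_minutes(carb_exposures_per_day: int) -> int:
    """Rough total minutes per day that enamel spends below the critical pH."""
    return carb_exposures_per_day * ACID_MINUTES_PER_EXPOSURE

for snacks in (2, 5, 8):
    print(f"{snacks} carb exposures -> ~{daily_acid_minutes(snacks)} acidic minutes a day")
```

The point of the toy model is simply that grazing on carbohydrates multiplies the windows in which the bacteria can do damage, which is exactly the situation a starch-heavy agricultural diet creates.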
Of course, we have good evidence that humans have been getting lots of their calories from carbohydrates for a long time. In Morocco, mobile groups from the Iberomaurusian tool-using culture (not farmers at all) had fairly wretched teeth. Caries seems to have been a big problem for these hunter-gatherers. Researchers identified the likely cause as acorns – a good source of nutrition, but a poor choice from the point of view of non-rotting teeth. Another hiccup in the easy assumption that ‘caries equals farmers’ is the variation in caries between different groups who were eating similar foods. As DNA analysis techniques continue to improve, we might find that different strains of caries-causing bacteria are more or less virulent, and that the kind of oral bacteria you inherit has a considerable effect on your teeth’s survival. Natural fluoridation of local water sources also plays a role in protecting teeth from caries. So while rotting, misaligned teeth are not necessarily the smoking gun of agricultural innovation, the numbers do suggest that they start to become a real problem for our species round about the time we develop agriculture.
So, how risky was the Neolithic Revolution? It might mess up your teeth, but is that enough to kill you? Given the uptick in births and the fertility ‘revolution’ that researchers have identified during the transition to agriculture, how do we understand all the factors that kept our numbers down? This is a question that colleagues and I have tried to address by looking at the life history information locked into the hard enamel of human teeth. Teeth are a wonderful resource. Their enamel is roughly 96 per cent mineral, making them robust and durable in most archaeological soils, and they may survive thousands of years beyond the more fragile bones. Teeth begin to form before birth, and never remodel,18 unlike bone, so they carry the chemical and physical signature of the time when they were growing with them forever. The enterprising dental anthropologist has a range of techniques available to try to reconstruct these signals, giving us the rare opportunity to look at human lives in the past, rather than human deaths.
If you are at all familiar with the concept of tree rings, you can imagine a similar scenario at work in your teeth. A tree grows in successive layers, each bounded by a ring as the tree trunk expands year on year. In good years, the tree grows quite a bit, and the ring formed is larger; in bad years, it grows less, and the ring is smaller. While the analogy is loose,19 your teeth also form in successive layers.20 The layers respond to a sort of innate rhythm, an internal timer that runs at about a cycle a week, and where growth stops (and restarts), it leaves a little ring around the tooth. Anyone over 18 will probably have brushed away most of these little rings, but occasionally they might still be visible with the help of a strong light and a good mirror. Being able to count the rings on teeth is of course useful in the same way that counting rings on trees can be – it tells you how long the item in question has been forming. But more than that, it’s when the rings go missing that we start to get real insight into what was happening while your teeth were growing. Where there are gaps in the normal pattern of the rings – basically, depressed grooves or lines on your teeth – it’s a sign that normal growth shut down, usually because of illness or malnutrition. These grooves on teeth are called enamel hypoplasia, and they can tell bioarchaeologists whether the child who was growing the teeth was healthy – or rather, when it was not.
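To show how counting rings lets a bioarchaeologist put an age on a groove, here is a simplified sketch. It assumes the roughly weekly rhythm mentioned above (taken here as eight days per ring) and a crown that starts forming at around six months of age; both figures are rounded illustrations rather than measurements from any of the teeth discussed in this chapter.

```python
DAYS_PER_RING = 8  # the internal growth rhythm runs at roughly a cycle a week

def age_at_defect(crown_start_years: float, rings_before_defect: int) -> float:
    """Approximate age (in years) at which a growth-disrupting event occurred."""
    return crown_start_years + (rings_before_defect * DAYS_PER_RING) / 365.0

# e.g. a tooth crown that began forming at about 0.5 years of age,
# with 90 rings counted between the crown tip and a hypoplastic groove:
print(f"growth disrupted at roughly {age_at_defect(0.5, 90):.1f} years old")
```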
Researchers have been aware of the connection between lines on teeth and childhood health for some time. It took the people of Dickson Mounds, however, to really emphasise its importance in tracking how human populations deal with changing circumstances. This 1970s study was one of the first to illustrate that evidence of childhood illness and malnutrition locked into the teeth could be compared between farmers and non-farmers; and when the results were in, the farmers had far more enamel hypoplasia than the non-farmers. There is general agreement that the transition to agriculture leads to an increase in the number of lines on teeth – this is observed not only in the Near East, but almost everywhere that farming kicked off. This is a critical point – if the evidence for how taxing the transition to agriculture was conflicts from place to place, can we really make such sweeping universal statements? It’s what archaeologists do,21 but we can look to different areas of the world and different types of Neolithics to see that the observations that hold true on the East Coast of the US don’t necessarily match those from elsewhere, like Thailand. As technology has improved, it has become possible for bioarchaeologists to get ever more detailed information about when and how children were sick in the past; so, armed with some dental kit and a blind faith in my ability to drive across Anatolia, I set off in 2012 to have a (much) closer look at one particular site: Aşıklı Höyük.
Aşıklı Höyük is a great mound of earth built up on the sides of the Melendiz River on the fringes of Turkey’s mountainous Cappadocia region. It’s closest to the modern-day city of Aksaray, but still a lengthy22 drive from just about anywhere. Excavations on the mound began in 1992 under the direction of Ufuk Esin, a pioneering Turkish archaeologist, and continue today under her former student, Mihriban Özbaşaran of Istanbul University. I had met Mihriban and her team while we were both working at the UNESCO World Heritage site of Çatalhöyük in 2008. Çatalhöyük is another mound site on the Anatolian Plateau, and rather better known – decades of excavation there have unearthed a warren of mud-brick houses, decorated with painted plaster and cattle skulls, with burials beneath the floors.23 Dating to nearly 9,000 years ago, Çatalhöyük is billed as one of the earliest ‘cities’ in the world; at its peak, perhaps 10,000 people lived and farmed together. A crack team of physical anthropologists from all over the world gathered at the site every field season to investigate the remains of these early inhabitants, and I was fascinated by the opportunity to learn more about these lives on the edge of the Neolithic Revolution. I was therefore understandably excited when I learned that the Istanbul team was digging another settlement site in Anatolia – and this one was even earlier.
Despite an inauspicious introduction,24 Mihriban and her assistant director Güneş Duru graciously invited me to visit Aşıklı in the summer season of 2012. Aşıklı was turning out to be an enormously important site. Radiocarbon dates showed that the first phase of settlement on the mound was around 10,500 years ago – nearly 1,000 years before the hubbub at Çatalhöyük. What’s more, the site covered almost a millennium of occupation, from early roundhouses that were just a tad more permanent than the seasonal encampments of contemporary hunter-gatherer sites, to a full-blown mud-brick village with wide public spaces and the scythes and storage jars that are the hallmark of an agricultural lifestyle. Of course, nothing would do but for me to come and look at the human remains. Here was a chance to look the Neolithic Revolution straight in the face – though in my case, it was the teeth I was really interested in. With the support of the British Institute at Ankara, and Yılmaz and Dilek Erdal at Hacettepe University, I took dental impressions from the dead, then carefully carried those impressions back to a basement lair at the Institute of Archaeology at University College London. There I spent a rather tedious amount of time in a sunless room, counting lines on teeth.
Fortunately, this was time well spent. The Aşıklı teeth showed, very faintly, a pattern of interruptions to the normal sequence of growth lines. The teeth that came from individuals from the later period of the site showed big grooves where growth had been interrupted, occurring about every two years from around the age of two. One lone individual, however, had a slightly different signal – there was more evidence of growth disruption around the age of three. This was the one skeleton from the earliest phase identified at Aşıklı – the phase of roundhouses and ambiguous evidence for dependence on agriculture or domesticated animals. While one skeleton does not a conclusion make, it’s tantalising to consider that the problems these children had growing up are linked to the same changes identified by Bocquet-Appel and colleagues: more babies, and more often. For many primates, the most dangerous point in childhood is when the mother turns her attention to the next infant; problems with food supply and illness can accompany a newly free-range child. At Aşıklı, it’s possible that the timing of this rude interruption to childhood health related to the birth of new siblings. In an experimental time, without easy access to branded snack foods (gluten-free or otherwise), it might be that the fertility unleashed by settled life took its toll on the health of the children of the revolution. In the meantime, we must wait for more evidence to be uncovered – excavations at Aşıklı are ongoing.
So our universal ideas of the Neolithic may not be so universal after all. Physical anthropologist Dan Temple has spent nearly25 as much time as I have counting up lines on teeth. Working with Clark Spencer Larsen at Ohio State University on the skeletons and teeth of the foragers and subsequent farmers who inhabited prehistoric Japan (the Jomon and Yayoi cultures), Temple has reported that the adoption of agriculture didn’t lead to more lines on teeth or other markers of poor nutrition. The only negative sign he has found in the Japanese Neolithic transition is evidence of a slightly increased infectious disease burden. It is interesting that wet rice agriculture seems to have had this comparatively gentle effect not only in Japan, but also in the well-documented Neolithic transition of Thailand.
As bioarchaeological techniques become more sophisticated, archaeologists will be able to tell more about the survival and fitness of the earliest farmers. Advances in aDNA analysis can also shed light on the success – on a population scale – of Neolithic ways of life. Sophisticated modelling techniques offer new insights as well: recent work by a group led by Steven Shennan has tracked an enormous number of early farming sites in Europe, and their data shows a ‘boom and bust’ pattern of expansion of agriculture-dependent lifestyles. The idea of a steady march of progress as a straight line between the invention of farming and the height of modern civilisation26 has been shown to be more of a St Vitus’s dance – jittery, unpredictable and with a pretty high body count at the end.
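For a flavour of how a ‘boom and bust’ signal can be pulled out of large numbers of dated sites, here is a drastically simplified sketch. It is a stand-in for, not a reproduction of, the kind of analysis used in studies like Shennan’s: the dates are invented, and real work of this sort uses calibrated radiocarbon probability distributions rather than raw counts in bins.

```python
from collections import Counter

# Invented radiocarbon dates (years BP) standing in for dated site occupations.
site_dates_bp = [7400, 7350, 7300, 7250, 7240, 7100, 7050, 6900, 6500, 6450, 6400]

def boom_bust(dates_bp, bin_size=250):
    """Bin dates and report whether activity rises or falls from bin to bin."""
    # Each bin key k groups dates in the range [k, k + bin_size) years BP.
    bins = Counter((d // bin_size) * bin_size for d in dates_bp)
    ordered = sorted(bins.items(), reverse=True)  # oldest bin first
    for (older, n_old), (younger, n_new) in zip(ordered, ordered[1:]):
        trend = "boom" if n_new > n_old else "bust" if n_new < n_old else "flat"
        print(f"{older + bin_size}-{older} BP ({n_old} dates) -> "
              f"{younger + bin_size}-{younger} BP ({n_new} dates): {trend}")

boom_bust(site_dates_bp)
```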
1 Looking at you, Daily Mail.
2 e.g. any sentence beginning: ‘In my day …’
3 See the excellent Paleofantasy by Marlene Zuk for a much broader discussion.
4 According to Loren Cordain’s website. It’s unclear what the endorser is the world’s leading expert on, but one suspects it may not be palaeoarchaeology.
5 Everyone has diabetes! Everyone has asthma! Everyone has allergies! Everyone has coeliac disease!
6 Many of these groups have been referred to collectively as the San (lit. ‘foragers’, but with pejorative connotations) or ‘bushmen’ of Southern Africa, but there is a great deal of linguistic and cultural variety in this reductive grouping.
7 My colleague and former office mate Laura Buck spent several months delighting me with ethnographic evidence for the consumption of reindeer stomach, but inexplicably rejected my suggested article title of ‘Reindeer tummies and the Neanderthals who loved them’.
8 This quote and more information on the development of archaeobotany can be found on Dorian Fuller’s website: https://sites.google.com/site/archaeobotany/
9 Loosely, with a great deal of revision and far less revolution.
10 This is perhaps the most simplistic description of Marxist theory in the history of archaeology; for those with the stomach for more on Childe’s Marxism and how it has influenced archaeology, I highly recommend Randall McGuire’s 2006 article ‘Marx, Childe, and Trigger’ in The Works of Bruce G. Trigger: Considering the Contexts of His Influences.
11 Though, according to exacting scientific research by Nobel Prize winner George W. Beadle, they can be made to pop.
12 As Dorian Fuller said in his 2014 Nature article ‘Domestication: The birth of rice’: ‘Nobody mentions the acorns.’
13 So called because they liberally littered the centre and southeast of the modern US with impressive earthworks from around 5,000 years ago right up until the contact period of the 1500s.
14 And, presumably, far fewer dentists.
15 And even tool use – in a world before the invention of the table clamp, teeth were frequently pressed into service as a third hand.
16 And still does; modern stone-ground flour is also likely to contain grit from the grinding process.
17 Caries bacteria are usually transferred directly to an infant – pre-chewed food and maternal affection being prime culprits.
18 Anyone who has ever chipped a tooth will be dramatically aware of this fact.
19 Should my old PhD supervisors ever come across this section, I fully expect reports of eminent physical anthropologists spontaneously combusting to rapidly follow.
20 And in multiple rows, like a shark; baby or milk teeth start growing before birth and the wisdom tooth finishes around age 15. The in-between stage, where both sets are present to some extent, is truly terrifying in X-ray.
21 Especially ones with the temerity to write books for a popular audience.
22 Not to mention slow. It once took me nearly 30 minutes to traverse the small village of Kızılkaya near the site, thanks to successive traffic jams caused by cows, women herding cows, geese, women leading donkeys to follow the cows, chickens, women following the women leading the donkeys following the cows, dogs, and, just in front of the gate, the world’s least motivated tortoise.
23 While this might sound like the description of a particular kind of rural drinking establishment, the decor scheme is actually slightly more apocalyptic – there is an erupting volcano mural, for instance.
24 I may or may not have accidentally destroyed a 9,000-year-old wall.
25 This is my book. He can say he’s winning in his book.
26 As originally conceptualised, this generally means the British Empire. Though occasionally, also the French. Absolutely no one has ever used this concept to describe a world with reality television in it.