The importance of the horse in human history is matched only by the difficulties
inherent in its study; there is hardly an incident in the story which is not the
subject of controversy, often of a violent nature.
—Grahame Clark, 1941
In the summer of 1985 I went with my wife Dorcas Brown, a fellow archaeologist, to the Veterinary School at the University of Pennsylvania to ask a veterinary surgeon a few questions. Do bits create pathologies on horse teeth? If they do, then shouldn’t we be able to see the signs of bitting—scratches or small patches of wear—on ancient horse teeth? Wouldn’t that be a good way to identify early bitted horses? Could he point us toward the medical literature on the dental pathologies associated with horse bits? He replied that there really was no literature on the subject. A properly bitted horse wearing a well-adjusted bridle, he said, really can’t take the bit in its teeth very easily, so contact between the bit and the teeth would have been too infrequent to show up with any regularity. Nice idea, but it wouldn’t work. We decided to get a second opinion.
At the Veterinary School’s New Bolton Center for large mammals, outside Philadelphia, the trainers, who worked every day with horses, responded very differently. Horses chewed their bits all the time, they said. Some rolled the bit around in their mouths like candy. You could hear it clacking against their teeth. Of course, it was a vice—properly trained and harnessed horses were not supposed to do it, but they did. And we should talk to Hilary Clayton, formerly at New Bolton, who had gone to a university job somewhere in Canada. She had been studying the mechanics of bits in horses’ mouths.
We located Hilary Clayton at the University of Saskatchewan and found that she had made X-ray fluoroscopic videos of horses chewing bits (figure 10.1). She bitted horses and manipulated the reins from a standing position behind. An X-ray fluoroscope mounted beside the horses’ heads took pictures of what was happening inside their mouths. No one had done this before. She sent us two articles co-authored with colleagues in Canada.1 Their images showed just how horses manipulated a bit inside their mouths and precisely where it sat between their teeth. A well-positioned bit is supposed to sit on the tongue and gums in the space between the front and back teeth, called the “bars” of the mouth. When the rider pulls the reins, the bit presses the tongue and the gums into the lower jaw, squeezing the sensitive gum tissue between the bit and the underlying bone. That hurts. The horse will dip its head toward a one-sided pull (a turn) or lower its chin into a two-sided pull (a brake) to avoid the bit’s pressure on its tongue and gums.
Figure 10.1 A modern metal bit in a horse’s mouth. Mandible bone tinted gray. (a) jointed snaffle bit; (b) X-ray of jointed snaffle sitting on the tongue in proper position; (c) X-ray of snaffle being grasped in the teeth; (d) bar bit showing chewing wear; (e) X-ray of bar bit sitting on the tongue in proper position; (f) X-ray of bar bit being grasped in the teeth. After Clayton and Lee 1984; and Clayton 1985.
Clayton’s X-rays showed how horses use their tongues to elevate the bit and then retract it, pushing it back into the grip of their premolars, where it can no longer cause pressure on soft tissue no matter how hard the rider pulls on the reins. The soft corners of the mouth are positioned in front of the molars, so in order to get a bit into its teeth the horse has to force it back against the corners of its mouth. These stretched tissues act like a spring. If the bit is not held very firmly between the tips of the teeth it will pop forward again onto the bars of the mouth. It seemed likely to us that this repeated back-and-forth movement over the tips of the front premolars should affect the lower teeth more than the uppers just because of gravity—the bit sat on the lower jaw. The wear from bit chewing should be concentrated on one small part of two teeth (the lower second premolars, or P2s), unlike the wear from chewing anything else. Clayton’s X-rays made it possible, for the first time, to say positively that a specific part of a single tooth was the place to look for bit wear. We found several published photographs of archaeological horse P2s with wear facets or bevels on precisely that spot. Two well-known archaeological zoologists, Juliet Clutton-Brock in London and Antonio Azzaroli in Rome, had described this kind of wear as “possibly” made by a bit. Other zoologists, like our first veterinary surgeon, thought it was impossible for horses to get a bit that far back into their mouths with any frequency. No one knew for sure. But they had not seen Clayton’s X-rays.2
Encouraged and excited, we visited the anthropology department at the Smithsonian Museum of Natural History in Washington, and asked Melinda Zeder, then a staff archaeozoologist, if we could study some never-bitted ancient wild horse teeth—a control sample—and if she could offer us some technical advice about how to proceed. We were not trained as zoologists, and we did not know much about horse teeth. Zeder and a colleague who knew a lot about dental microwear, Kate Gordon, sat us down in the staff cafeteria. How would we distinguish bit wear from tooth irregularities caused by malocclusion? Or from dietary wear, created by normal chewing on food? Would the wear caused by a bit survive very long, or would it be worn away by dietary wear? How long would that take? How fast do horse teeth grow? Aren’t they the kind of teeth that grow out of the jaw and are worn away at the crown until they become little stubs? Would that change bit wear facets with increasing age? What about rope or leather bits—probably the oldest kind? Do they cause wear? What kind? Is the action of the bit different when a horse is ridden from when it pulls a chariot? And what, exactly, causes wear—if it exists? Is it the rider pulling the bit into the front of the tooth, or is it the horse chewing on the bit, which would cause wear on the occlusal (chewing) surface of the tooth? Or is it both? And if we did find wear under the microscope, how would we describe it so that the difference between a tooth with and without wear could be quantified?
Mindy Zeder took us through her collections. We made our first molds of ancient equid P2s, from the Bronze Age city of Malyan in Iran, dated about 2000 BCE. They had wear facets on their mesial corners; later we would be able to say that the facets were created by a hard bit of bone or metal. But we didn’t know that yet, and, as it turned out, there really was not a large collection of never-bitted wild horse teeth at the Smithsonian. We had to find our own, and we left thinking that we could do it if we took one problem at a time. Twenty years later we still feel that way.3
Bit wear is important, because other kinds of evidence have proven uncertain guides to early horse domestication. Genetic evidence, which we might hope would solve the problem, does not help much. Modern horses are genetically schizophrenic, like cattle (chapter 8) but with the genders reversed. The female bloodline of modern domesticated horses shows extreme diversity. Traits inherited through the mitochondrial DNA, which passes unchanged from mother to daughter, show that this part of the bloodline is so diverse that at least seventy-seven ancestral mares, grouped into seventeen phylogenetic branches, are required to account for the genetic variety in modern populations around the globe. Wild mares must have been taken into domesticated horse herds in many different places at different times. Meanwhile, the male aspect of modern horse DNA, which is passed unchanged on the Y chromosome from sire to colt, shows remarkable homogeneity. It is possible that just a single wild stallion was domesticated. So horse keepers apparently have felt free to capture and breed a variety of wild mares, but, according to these data, they universally rejected wild males and even the male progeny of any wild stallions that mated with domesticated mares. Modern horses are descended from very few original wild males, and many, varied wild females.4
Wildlife biologists have observed the behavior of feral horse bands in several places around the world, notably at Askania Nova, Ukraine, on the barrier islands of Maryland and Virginia (the horses described in the children’s classic Misty of Chincoteague), and in northwestern Nevada. The standard feral horse band consists of a stallion with a harem of two to seven mares and their immature offspring. Adolescents leave the band at about two years of age. Stallion-and-harem bands occupy a home range, and stallions fight one another, fiercely, for control of mares and territory. After the young males are expelled they form loose associations called “bachelor bands,” which lurk at the edges of the home range of an established stallion. Most bachelors are unable to challenge mature stallions or keep mares successfully until they are more than five years old. Within established bands, the mares are arranged in a social hierarchy led by the lead mare, who chooses where the band will go during most of the day and leads it in flight if there is a threat, while the stallion guards the flanks or the rear. Mares are therefore instinctively disposed to accept the dominance of others, whether dominant mares, stallions—or humans. Stallions are headstrong and violent, and are instinctively disposed to challenge authority by biting and kicking. A relatively docile and controllable mare could be found at the bottom of the pecking order in many wild horse bands, but a relatively docile and controllable stallion was an unusual individual—and one that had little hope of reproducing in the wild. Horse domestication might have depended on a lucky coincidence: the appearance of a relatively manageable and docile male in a place where humans could use him as the breeder of a domesticated bloodline. From the horse’s perspective, humans were the only way he could get a girl. From the human perspective, he was the only sire they wanted.
Animal domestication, like marriage, is the culmination of a long prior relationship. People would not invest the time and energy to attempt to care for an animal they were unfamiliar with. The first people to think seriously about the benefits of keeping, feeding, and raising tame horses must have been familiar with wild horses. They must have lived in a place where humans spent a lot of time hunting wild horses and learning their behavior. The part of the world where this was possible contracted significantly about ten thousand to fourteen thousand years ago, when the Ice Age steppe—a favorable environment for horses—was replaced by dense forest over much of the Northern Hemisphere. The horses of North America became extinct as the climate shifted, for reasons still poorly understood. In Europe and Asia large herds of wild horses survived only in the steppes in the center of the Eurasian continent, leaving smaller populations isolated in pockets of naturally open pasture (marsh-grass meadows, alpine meadows, arid mesetas) in Europe, central Anatolia (modern Turkey), and the Caucasus Mountains. Horses disappeared from Iran, lowland Mesopotamia, and the Fertile Crescent, leaving these warm regions to other equids (onagers and asses) (figure 10.2).
Figure 10.2 Map of the distribution of wild horses (Equus caballus) in the mid-Holocene, about 5000 BCE. The numbers show the approximate frequencies of horse bones in human kitchen garbage in each region, derived from charts in Benecke 1994 and from various Russian sources.
In western and central Europe, central Anatolia, and the Caucasus the isolated pockets of horses that survived into the Holocene never became important in the human food quest—there just weren’t enough of them. In Anatolia, for example, a few wild horses probably were hunted occasionally by the Neolithic occupants of Çatal Höyük, Pınarbaşı, and other farming villages in the central plateau region between about 7400 and 6200 BCE. But most of the equids hunted at these sites were Equus hydruntinus (now extinct) or Equus hemionus (onagers), both ass-like equids smaller than horses. Only a few bones are large enough to qualify as possible horses. Horses were not present in Neolithic sites in western Anatolia, or in Greece or Bulgaria, or in the Mesolithic and Early Neolithic of Austria, Hungary, or southern Poland. In western and northern Europe, Mesolithic foragers hunted horses occasionally. But horse bones accounted for more than 5% of the animals in only a few post-Glacial sites in the coastal plain of Germany/Poland and in the uplands of southern France. In the Eurasian steppes, on the other hand, wild horses and related wild equids (onagers, E. hydruntinus) were the most common wild grazing animals. In early Holocene steppe archaeological sites (Mesolithic and early Neolithic) wild horses regularly account for more than 40% of the animal bones, and probably more than 40% of the meat diet because horses are so big and meaty. For this reason alone we should look first to the Eurasian steppes for the earliest episode of domestication, the one that probably gave us our modern male bloodline.5
Early and middle Holocene archaeological sites in the Pontic-Caspian steppes contain the bones of three species of equids. In the Caspian Depression, at Mesolithic sites such as Burovaya 53, Je-Kalgan, and Istai IV, garbage dumps dated before 5500 BCE contain almost exclusively the bones of horses and onagers (see site map, figure 8.3). The onager, Equus hemionus, also called a “hemione” or “half-ass,” was a fleet-footed, long-eared animal smaller than a horse and larger than an ass. The natural range of the onager extended from the Caspian steppes across Central Asia and Iran and into the Near East. A second equid, Equus hydruntinus, was hunted in the slightly moister North Pontic steppes in Ukraine, where its bones occur in small percentages in Mesolithic and Early Neolithic components at Girzhevo and Matveev Kurgan, dated to the late seventh millennium BCE. This small, gracile animal, which then lived from the Black Sea steppes westward into Bulgaria and Romania and south into Anatolia, became extinct before 3000 BCE. The true horse, Equus caballus, ranged across both the Caspian Depression and the Black Sea steppes, and it survived in both environments long after both E. hemionus and E. hydruntinus were hunted out. Horse bones contributed more than 50% of the identified animal bones at Late Mesolithic Girzhevo in the Dniester steppes and Meso/Neolithic Matveev Kurgan and Kamennaya Mogila in the Azov steppes; also at Neo/Eneolithic Varfolomievka and Dzhangar in the Caspian Depression, Ivanovskaya on the Samara River, and Mullino in the southern foothills of the Ural Mountains. The long history of human dependence on wild equids in the steppes created a familiarity with their habits that would later make the domestication of the horse possible.6
The earliest evidence for possible horse domestication in the Pontic-Caspian steppes appeared after 4800 BCE, long after sheep, goats, pigs, and cattle were domesticated in other parts of the world. What was the incentive to tame wild horses if people already had cattle and sheep? Was it for transportation? Almost certainly not. Horses were large, powerful, aggressive animals, more inclined to flee or fight than to carry a human. Riding probably developed only after horses were already familiar as domesticated animals that could be controlled. The initial incentive probably was the desire for a cheap source of winter meat.
Horses are easier to feed through the winter than cattle or sheep because cattle and sheep push snow aside with their noses, whereas horses use their hard hooves. Sheep can graze on winter grass through soft snow, but if the snow becomes crusted with ice then their noses will get raw and bloody, and they will stand and starve in a field where there is ample winter forage just beneath their feet. Cattle do not forage through even soft snow if they cannot see the grass, so a snow deep enough to hide the winter grass will kill range cattle if they are not given fodder. Neither cattle nor sheep will break the ice on frozen water to drink. Horses have the instinct to break through ice and crusted snow with their hooves, not their noses, even in deep snows where the grass cannot be seen. They paw frozen snow away and feed themselves and so do not need water or fodder. In 1245 the Franciscan John of Plano Carpini journeyed to Mongolia to meet Güyük Khan (the successor to Genghis) and observed the steppe horses of the Tartars, as he called them, digging for grass from under the snow, “since the Tartars have neither straw nor hay nor fodder.” During the historic blizzard of 1886 in the North American Plains hundreds of thousands of cattle were lost on the open range. Those that survived followed herds of mustangs and grazed in the areas they opened up.7 Horses are supremely well adapted to the cold grasslands where they evolved. People who lived in cold grasslands with domesticated cattle and sheep would soon have seen the advantage in keeping horses for meat, just because the horses did not need fodder or water. A shift to colder climatic conditions or even a particularly cold series of winters could have made cattle herders think seriously about domesticating horses. Just such a shift to colder winters occurred between about 4200 and 3800 BCE (see chapter 11).
Cattle herders would have been particularly well suited to manage horses because cattle and horse bands both follow the lead of a dominant female. Cowherds already knew they needed only to control the lead cow to control the whole herd, and would easily have transferred that knowledge to controlling lead mares. Males presented a similar management problem in both species, and they had the same iconic status as symbols of virility and strength. When people who depended on equid-hunting began to keep domesticated cattle, someone would soon have noticed these similarities and applied cattle-management techniques to wild horses. And that would quickly have produced the earliest domesticated horses.
This earliest phase of horse keeping, when horses were primarily a recalcitrant but convenient source of winter meat, may have begun as early as 4800 BCE in the Pontic-Caspian steppes. This was when, at Khvalynsk and S’yezzhe in the middle Volga region, and Nikol’skoe on the Dnieper Rapids, horse heads and/or lower legs were first joined with the heads and/or lower legs of cattle and sheep in human funeral rituals; and when bone carvings of horses appeared with carvings of cattle in a few sites like S’yezzhe and Varfolomievka. Certainly horses were linked symbolically with humans and the cultured world of domesticated animals by 4800 BCE. Horse keeping would have added yet another element to the burst of economic, ritual, decorative, and political innovations that swept across the western steppes with the initial spread of stockbreeding about 5200–4800 BCE.
We decided to investigate bit wear on horse teeth, because it is difficult to distinguish the bones of early domesticated horses from those of their wild cousins. The Russian zoologist V. Bibikova tried to define a domesticated skull type in 1967, but her small sample of horse skulls did not define a reliable type for most zoologists.
The bones of wild animals usually are distinguished from those of domesticated animals by two quantifiable measurements: measurements of variability in size, and counts of the ages and sexes of butchered animals. Other criteria include finding animals far outside their natural range and detecting domestication-related pathologies, of which bit wear is an example. Crib biting, a stall-chewing vice of bored horses, might cause another domestication-related pathology on the incisor teeth of horses kept in stalls, but it has not been studied systematically. Marsha Levine of the McDonald Institute at Cambridge University has examined riding-related pathologies in vertebrae, but vertebrae are difficult to study. They break and rot easily, their frequency is low in most archaeological samples, and only eight caudal thoracic vertebrae (T11–18) are known to exhibit pathologies from riding. Discussions of horse domestication still tend to focus on the first two methods.8
The size-variability method depends on two assumptions: (1) domesticated populations, because they are protected, should contain a wider variety of sizes and statures that survive to adulthood, or more variability; and (2) the average size of the domesticated population as a whole should decline, because penning, control of movement, and a restricted diet should reduce average stature. Measurements of leg bones (principally the width of the condyle and shaft) are used to look for these patterns. This method seems to work quite well with the leg bones of cattle and sheep: an increase in variability and reduction in average size does apparently identify domesticated cattle and sheep.
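To make the logic concrete, here is a minimal sketch of the comparison in Python, using invented bone-width numbers purely for illustration; they are not measurements from any real assemblage:

    import statistics

    # Hypothetical midshaft widths (mm) of horse metacarpals from two sites;
    # invented numbers, for illustration only.
    presumed_wild = [32.1, 32.8, 33.0, 33.4, 33.9, 34.2, 34.5]
    presumed_domestic = [28.5, 30.2, 31.0, 32.8, 33.5, 35.1, 36.4]

    for label, widths in [("wild?", presumed_wild), ("domestic?", presumed_domestic)]:
        mean = statistics.mean(widths)
        sd = statistics.stdev(widths)
        cv = 100 * sd / mean  # coefficient of variation (%), a size-free variability index
        print(f"{label}: mean = {mean:.1f} mm, sd = {sd:.2f}, CV = {cv:.1f}%")

    # The signal the method looks for: a lower mean size AND a higher
    # variability (CV) in the candidate domesticated population.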
But the underlying assumptions are not known to apply to the earliest domesticated horses. American Indians controlled their horses not in a corral but with a “hobble” (a short rope tied between the two front legs, permitting a walk but not a run). The principal advantage of early horse keeping—its low cost in labor—could be realized only if horses were permitted to forage for themselves. Pens and corrals would defeat this purpose. Domesticated horses living and grazing in the same environment with their wild cousins probably would not show a reduction in size, and might not show an increase in variability. These changes could be expected if and when horses were restricted to shelters and fed fodder over the winter, like cattle and sheep were, or when they were separated into different herds that were managed and trained differently, for example, for riding, chariot teams, or meat and milk production.
During the earliest phase of horse domestication, when horses were free-ranging and kept for their meat, any size reductions caused by human control probably would have been obscured by natural variations in size between different regional wild populations. The scattered wild horses living in central and western Europe were smaller than the horses that lived in the steppes. In figure 10.3, the three bars on the left of the graph represent wild horses from Ice Age and Early Neolithic Germany. They were quite small. Bars 4 and 5 represent wild horses from forest-steppe and steppe-edge regions, which were significantly bigger. The horses from Dereivka, in the central steppes of Ukraine, were bigger still; 75% stood between 133 and 137 cm at the withers, or between 13 and 14 hands. The horses of Botai in northern Kazakhstan were even bigger, often over 14 hands. West-east movements of horse populations could cause changes in their average sizes, without any human interference. This leaves an increase in variability as the only indicator of domestication during the earliest phase. And variability is very sensitive to sample size—the larger the sample of bones, the better the chance of finding very small and very large individuals—so changes in variability alone are difficult to separate from sample-size effects.
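The sample-size problem can be demonstrated with a quick simulation, assuming (for the sake of the sketch) a single normally distributed wild population: the observed extreme range keeps growing as more bones are measured, while the standard deviation does not.

    import random
    import statistics

    random.seed(42)

    # One assumed wild population: normal, mean 33 mm, sd 1.5 mm.
    def summarize(n, trials=1000, mu=33.0, sigma=1.5):
        ranges, sds = [], []
        for _ in range(trials):
            sample = [random.gauss(mu, sigma) for _ in range(n)]
            ranges.append(max(sample) - min(sample))
            sds.append(statistics.stdev(sample))
        return statistics.mean(ranges), statistics.mean(sds)

    for n in (10, 50, 500):
        r, s = summarize(n)
        print(f"n = {n:>3}: mean range = {r:.2f} mm, mean sd = {s:.2f} mm")
    # The range climbs steadily with n; the sd stays near the true 1.5 mm.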
Figure 10.3 The size-variability method for identifying the bones of domesticated horses. The box-and-whisker graphs show the thickness of the leg bones for thirteen archaeological horse populations, with the oldest sites (Paleolithic) on the left and the youngest (Late Bronze Age) on the right. The whiskers, showing the extreme measurements, are most affected by sample size and so are unreliable indicators of population variability. The white boxes, showing two standard deviations from the mean, are reliable indicators of variability, and it is these that are usually compared. The increase in this measurement of variability in bar 10 is taken as evidence for the beginning of horse domestication. After Benecke and von den Driesch 2003, figures 6.7 and 6.8 combined.
The domestication of the horse is dated about 2500 BCE by the size-variability method. The earliest site that shows both a significant decrease in average size and an increase in variability is the Bell Beaker settlement of Csepel-Háros in Hungary, represented by bar 10 in figure 10.3, and dated about 2500 BCE. Subsequently many sites in Europe and the steppes show a similar pattern. The absence of these statistical indicators at Dereivka in Ukraine, dated about 4200–3700 BCE (see chapter 11), and at Botai-culture sites in northern Kazakhstan, dated about 3700–3000 BCE, is widely accepted as evidence that horses were not domesticated before about 2500 BCE. But marked regional size differences among early wild horses, the sensitivity of variability measurements to sample size effects, and the basic question of the applicability of these methods to the earliest domesticated horses are three reasons to look at other kinds of evidence. The appearance of significant new variability in horse herds after 2500 BCE could reflect the later development of specialized breeds and functions, not the earliest domestication.9
The second quantifiable method is the study of the ages and sexes of butchered animals. The animals selected for slaughter from a domesticated herd should be different ages and sexes from those obtained by hunting. Herders would probably cull young males as soon as they reached adult meat weight, at about two to three years of age. A site occupied by horse herders might contain very few obviously male horses, since the eruption of the canine teeth in males, the principal marker of gender in horse bones, happens at about age four or five, after the age when the males should have been slaughtered for food. Females should have been kept alive as breeders, up to ten years old or more. In contrast, hunters prey on the most predictable elements of a wild herd, so they would concentrate their efforts on the standard wild horse social group, the stallion-with-harem bands, which move along well-worn paths and trails within a defined territory. Regular hunting of stallion-with-harem bands would yield a small number of prime stallions (six to nine years old) and a large number of breeding-age females (three to ten years old) and their immature young.10
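The contrast the method looks for can be sketched as two hypothetical age-at-death profiles; the percentages below are invented to show the expected shapes, not drawn from any published assemblage:

    # Hypothetical age-at-death profiles (percent of slaughtered animals in
    # each age class); invented numbers, for illustration only.
    herder_cull = {"0-2 yr": 15, "2-3 yr": 55, "3-10 yr": 20, ">10 yr": 10}
    harem_hunt = {"0-2 yr": 30, "2-3 yr": 10, "3-10 yr": 50, ">10 yr": 10}

    print(f"{'age class':>9} | {'herders':>7} | {'hunters':>7}")
    for age in herder_cull:
        print(f"{age:>9} | {herder_cull[age]:>6}% | {harem_hunt[age]:>6}%")
    # Herders peak at 2-3 yr (males culled at meat weight); hunters of
    # stallion-with-harem bands take mostly breeding-age adults and young.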
But many other hunting and culling patterns are possible, and might be superimposed on one another in a long-used settlement site. Also, only a few bones in a horse’s body indicate sex—a mature male (more than five years old) has canine teeth whereas females usually do not, and the pelvis of a mature female is distinctive. Horse jaws with the canines still embedded are not often preserved, so data on gender are spotty. Age is estimated based on molar teeth, which preserve well, so the sample for age estimation usually is bigger. But assigning a precise age to a loose horse molar, not found in the jaw, is difficult, and teeth are often found loose in archaeological sites. We had to invent a way to narrow down the very broad range of ages that could be assigned to each tooth. Further, teeth are part of the head, and heads may receive special treatment. If the goal of the analysis is to determine which horses were culled for food, heads are not necessarily the most direct indicators of the human diet. If the occupants of the site kept and used the heads of prime-age stallions for rituals, the teeth found in the site would reflect that, and not culling for food.11
Marsha Levine studied age and sex data at Dereivka in Ukraine (4200–3700 BCE) and Botai in northern Kazakhstan (3700–3000 BCE), two critical sites for the study of horse domestication in the steppes. She concluded that the horses at both sites were wild. At Dereivka the majority of the teeth were from animals whose ages clustered between five and seven years old, and fourteen of the sixteen mandibles were from mature males.12 This suggested that most of the horse heads at Dereivka came from prime-age stallions, not the butchering pattern expected for a managed population. But, in fact, it is an odd pattern for a hunted population as well. Why would hunters kill only prime stallions? Levine suggested that the Dereivka hunters had stalked wild horse bands, drawing the attention of the stallions, which were killed when they advanced to protect their harems. But stalking in the open steppe is probably the least productive way for a pedestrian hunter to attack a wild horse band, as stallions are more likely to alarm their band and run away than to approach a predator. Pedestrian hunters should have used ambush methods, shooting at short range on a habitually used horse trail. Moreover, the odd stallion-centered slaughter pattern of Dereivka closely matches the slaughter pattern at the Roman military cemetery at Kesteren, the Netherlands (figure 10.4), where the horses certainly were domesticated. At Botai, in contrast, the age-and-sex profile matched what would be expected if whole wild herds were slaughtered en masse, with no selection for age or sex. The two profiles were dissimilar, yet Levine concluded that horses were wild at both places. Age and sex profiles are open to many different interpretations.
If it is difficult to distinguish wild from domesticated horses, it is doubly problematic to distinguish the bones of a mount from those of a horse merely eaten for dinner. Riding leaves few traces on horse bones. But a bit leaves marks on the teeth, and teeth usually survive very well. Bits are used only to guide horses from behind, to drive or to ride. They are not used if the horse is pulled from the front, as a packhorse is, as this would just pull the bit out of the mouth. Thus bit wear on the teeth indicates riding or driving. The absence of bit wear means nothing, since other forms of control (nosebands, hackamores) might leave no evidence. But its presence is an unmistakable sign of riding or driving. That is why we pursued it. Bit wear could be the smoking gun in the long argument over the origins of horseback riding and, by extension, in debates over the domestication of the horse.
Figure 10.4 The age-at-death method for identifying the bones of domesticated horses. This graph compares the age-at-death statistics for Late Eneolithic horses from Dereivka, Ukraine, to domesticated horses from the Roman site of Kesteren, Netherlands. The two graphs are strikingly similar, but one is interpreted as a “wild” profile and the other is “domesticated.” After Levine 1999, figure 2.21.
After Brown and I left the Smithsonian in 1985 we spent several years gathering a collection of horse lower second premolars (P2s), the teeth most affected by bit chewing. Eventually we collected 139 P2s from 72 modern horses. Forty were domesticated horses processed through veterinary autopsy labs at the University of Pennsylvania and Cornell University. All had been bitted with modern metal bits. We obtained information on their age, sex, and usage—hunting, leisure, driving, racing, or draft—and for some horses we even knew how often they had been bitted, and with what kind of bit. Thirteen additional horses came from the Horse Training and Behavior program at the State University of New York at Cobleskill. Some had never been bitted. We made casts of their teeth in their mouths, much as a dentist makes an impression to fit a crown—we think that we were the first people to do this to a living horse. A few feral horses, never bitted, were obtained from the Atlantic barrier island of Assateague, Maryland. Their bleached bones and teeth were found by Ron Keiper of Penn State, who regularly followed and studied the Assateague horses and generously gave us what he had found. Sixteen Nevada mustangs, killed in 1988 by ranchers, supplied most of our never-bitted P2s. I read about the event, made several telephone calls, and was able to get their mandibles from the Bureau of Land Management after the kill sites were documented. Many years later, in a separate study, Christian George at the University of Florida applied our methods to 113 more never-bitted P2s from a minimum of 58 fossil equids 1.5 million years old. These animals, of the species Equus “leidyi,” were excavated from a Pleistocene deposit near Leisey, Florida. George’s Leisey equids (the same size, diet, and dentition as modern horses) had never seen a human, much less a bit.13
We studied high-resolution casts or replicas of all the P2s under a Scanning Electron Microscope (SEM). The SEM revealed that the vice of bit chewing was amazingly widespread (figure 10.5). More than 90% of the bitted horses showed some wear on their P2s from chewing the bit, often just on one side. Their bits also showed wear from being chewed. Riding creates the same wear as driving, because it is not the rider or driver who creates bit wear—it is the horse grasping and releasing the bit between its teeth. A metal bit or even a bone bit creates distinctive microscopic abrasions on the occlusal enamel of the tooth, usually confined to the first or metaconid cusp, but extending back to the second cusp in many cases. These abrasions (type “a” wear, in our terminology) are easily identified under a microscope. All bits, whether hard (metal or bone) or soft (rope or leather), also create a second kind of wear: a wear facet or bevel on the front (mesial) corner of the tooth. The facet is caused both by direct pressure (particularly with a hard bit of bone or metal), which weakens and cracks the enamel when the bit is squeezed repeatedly between the teeth; and by the bit slipping back and forth over the front or mesial corner of the P2. Metal bits create both kinds of wear: abrasions on the occlusal enamel and wear facets on the mesial corner of the tooth. But rope bits probably were the earliest kind. Can a rope bit alone create visible wear on the enamel of horse teeth?
With a grant from the National Science Foundation and the cooperation of the State University of New York (SUNY) at Cobleskill we acquired four horses that had never been bitted. They were kept and ridden at SUNY Cobleskill, which has a Horse Training and Behavior Program and a thirty-five-horse stable. They ate only hay and pasture, no soft feeds, to mimic the natural dental wear of free-range horses. Each horse was ridden with a different organic bit—leather, horsehair rope, hemp rope, or bone—for 150 hours, or 600 hours of riding for all four horses. The horse with the horsehair rope bit was bitted by tying the rope around its lower jaw in the classic “war bridle” of the Plains Indians, yet it was still able to loosen the loop with its tongue and chew the rope. The other horses’ bits were kept in place by antler cheek-pieces made with flint tools. At four intervals each horse was anaesthetized by a bemused veterinarian, and we propped open its mouth, brushed its teeth, dried them, pulled its tongue to the side, and made molds of its P2s (figure 10.6). We tracked the progress of bit wear over time, and noted the differences between the wear made by the bone bit (hard) and the leather and rope bits (soft).14
Figure 10.5 Bit wear and no wear on the lower second premolars (P2s) of modern horses.
Left: a Scanning Electron Micrograph (SEM) taken at 13x of “a-wear” abrasions on the first cusp of a domesticated horse that was bitted with a metal bit. The profile shows a 3.5 mm bevel or facet on the same cusp.
Right: An SEM taken at 15x of the smooth surface of the first cusp of a feral horse from Nevada, never bitted. The profile shows a 90° angle with no bevel.
Figure 10.6 Brown and Anthony removing a high-resolution mold of the P2 of a horse bitted with an organic bit at State University of New York, Cobleskill, in 1992.
The riding experiment demonstrated that soft bits do create bit wear. The actual cause of wear might have been microscopic grit trapped in and under the bit, since all the soft bits were made of materials softer than enamel. After 150 hours of riding, bits made of leather and rope wore away about 1 mm of enamel on the first cusp of the P2 (figure 10.7). The mean bevel measurement for the three horses with rope or leather bits at the end of the experiment was more than 2 standard deviations greater than the pre-experiment mean.15 The rope and leather mouthpieces stood up well to chewing, although the horse with the hemp rope bit chewed through it several times. The horses bitted with soft bits showed the same wear facet on the same part of the P2 as horses bitted with metal and bone bits, but the surface of the facet was microscopically smooth and polished, not abraded. Hard bits, including our experimental bone bit, create distinctive “a” wear on the occlusal enamel of the facet, but soft bits do not. Soft bit wear is best identified by measuring the depth of the wear facet or bevel on the P2, not by looking for abrasions on its surface.
Figure 10.7 Graph showing the increase in bevel measurements in millimeters caused by organic bits over 150 hours of riding, with projections of measurements if riding had continued for 300 hours.
TABLE 10.1 Bevel Measurements on the P2s of Bitted and Never-Bitted Mature (>3 yr) Horses
Table 10.1 shows bevel measurements for modern horses that never were bitted (left column); Pleistocene North American equids that never were bitted (center left column); domestic horses that were bitted, including some that were bitted infrequently (center right column); and a smaller sub-group of domestic horses that were bitted at least five times a week up to the day we made molds of their teeth (right column). Measurements of the depth of the wear facet easily distinguished the 73 teeth of bitted horses from the 105 teeth of never-bitted horses. The never-bitted/bitted means are different at better than the .001 level of significance. The never-bitted/daily-bitted means are more than 4 standard deviations apart. Bevel measurements segregate mature bitted from mature never-bitted horses, as populations.16
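As an illustration of the kind of statistics behind these claims, the following sketch runs a Welch's t comparison on invented bevel depths; the real measurements are those summarized in table 10.1, not these numbers:

    import statistics
    from math import sqrt

    # Invented bevel depths (mm), for illustration only.
    never_bitted = [0.4, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.2]
    daily_bitted = [2.8, 3.1, 3.4, 3.6, 4.0, 4.3, 4.8, 5.5]

    m1, m2 = statistics.mean(never_bitted), statistics.mean(daily_bitted)
    s1, s2 = statistics.stdev(never_bitted), statistics.stdev(daily_bitted)
    n1, n2 = len(never_bitted), len(daily_bitted)

    # Welch's t statistic for two independent samples with unequal variances.
    t = (m2 - m1) / sqrt(s1**2 / n1 + s2**2 / n2)
    # Separation expressed in units of the never-bitted standard deviation.
    separation = (m2 - m1) / s1

    print(f"means: {m1:.2f} vs {m2:.2f} mm; t = {t:.1f}; separation = {separation:.1f} sd")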
We set a bevel measurement of 3.0 mm as the minimum threshold for recognizing bit wear on archaeological horse teeth (figure 10.8). More than half of our occasionally bitted teeth did not exhibit a bevel measuring as much as 3 mm. But all horses in our sample with a bevel of 3 mm or more had been bitted. So the last question was, how adequate was our sample? Could a 3 mm wear facet occur naturally on a wild horse P2, caused by malocclusion? Criticisms of bit wear have centered on this problem.17
Figure 10.8 From our 1998 data: bevel measurements of never bitted, occasionally bitted, and frequently bitted horse teeth plotted against age. All domesticated horses had precisely known ages; all feral horses were aged by examining entire mandibles with intact incisor teeth. The line excludes feral horses and horses aged ≤3 yr. and includes only bitted horses. After Brown and Anthony 1998.
Very young horses with newly erupted permanent premolars do display natural dips and rises on their teeth. New permanent premolars are uneven because they have not yet been worn flat by occlusion with the opposing tooth. We had to exclude the teeth of horses two to three years old for that reason. But among the 105 measurable P2s from mature equids that had never been bitted, Pleistocene to modern, we found that a “natural” bevel measurement of more than 2.0 mm is unusual (less than 3% of teeth), and a bevel of 2.5 mm is exceedingly rare (less than 1%). Only one of the 105 never-bitted teeth had a bevel measurement greater than 2.5 mm—a single tooth from the Leisey equids with a mesial bevel of 2.9 mm (the next-nearest bevel was 2.34 mm). In contrast, bevels of 2.5 mm and more occurred in 58% of the teeth of mature horses that were bitted.18
A bevel of 3 mm or more on the P2 of a mature horse is evidence for either an exceedingly rare malocclusion or a very common effect of bitting. If even one mature horse from an archaeological site shows a bevel ≥3 mm, bit wear is suggested, but the case is not closed. If multiple mature horses from a single site show mesial bevel measurements of 3 mm or more, they probably were bitted. I should stress that our method depends on the accurate measurement of a very small feature—a bevel or facet just a few millimeters deep. According to our measurements on 178 P2 teeth of mature equids the difference between a 2 mm and a 3 mm bevel is extremely important. In any discussion of bit wear, precise measurements are required and young animals must be eliminated. But until someone finds a population of mature wild horses that displays many P2 teeth with bevels ≥3 mm, bit wear as we have defined it indicates that a horse has been ridden or driven.19
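Stated as a procedure, the rule we apply to archaeological samples looks like the following sketch; the function name and the toy site data are invented for illustration:

    def bit_wear_verdict(bevels_mm, ages_yr, threshold=3.0, min_age=3):
        """Apply the 3 mm rule to the P2 bevels of one site's horses."""
        mature = [b for b, a in zip(bevels_mm, ages_yr) if a > min_age]
        hits = [b for b in mature if b >= threshold]
        if len(hits) >= 2:
            return "multiple mature P2s >= 3 mm: horses probably bitted"
        if len(hits) == 1:
            return "one mature P2 >= 3 mm: bit wear suggested, not a closed case"
        return "no bevels >= 3 mm: no evidence of bitting (absence proves nothing)"

    # Invented site data: bevels of 1.2, 3.5, and 4.0 mm on horses aged 6, 7,
    # and 2 years; the 4.0 mm tooth is excluded because the horse is immature.
    print(bit_wear_verdict([1.2, 3.5, 4.0], [6, 7, 2]))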
Many archaeologists and historians in the first half of the twentieth century thought that horses were first domesticated by Indo-European–speaking peoples, often specifically characterized as Aryans, who also were credited with inventing the horse-drawn chariot. This fascination with the Aryans, or Ariomania, to use Peter Raulwing’s term, dominated the study of horseback riding and chariots before World War II.20
In 1964 Dimitri Telegin discovered the head-and-hoof bones of a seven- to eight-year-old stallion buried together with the remains of two dogs at Dereivka in Ukraine, apparently a cultic deposit of some kind (see figure 11.9). The Dereivka settlement contained three excavated structures of the Sredni Stog culture and the bones of a great many horses, 63% of the bones found. Ten radiocarbon dates placed the Sredni Stog settlement about 4200–3700 BCE, after the Dnieper-Donets II and Early Khvalynsk era. V. I. Bibikova, the chief paleozoologist at the Kiev Institute of Archaeology, declared the stallion a domesticated horse in 1967. The respected Hungarian zoologist and head of the Hungarian Institute of Archaeology, Sándor Bökönyi, agreed, noting the great variability in the leg dimensions of the Dereivka horses. The German zoologist G. Nobis also agreed. During the late 1960s and 1970s horse domestication at Dereivka was widely accepted.21
For Marija Gimbutas of UCLA, the domesticated horses at Dereivka were part of the evidence which proved that horse-riding, Indo-European–speaking “Kurgan-culture” pastoralists had migrated in several waves out of the steppes between 4200 and 3200 BCE, destroying the world of egalitarian peace and beauty that she imagined for the Eneolithic cultures of Old Europe. But the idea of Indo-European migrations sweeping westward out of the steppes was not accepted by most Western archaeologists, who were increasingly suspicious of any migration-based explanation for culture change. During the 1980s Gimbutas’s scenario of massive “Kurgan-culture” invasions into eastern and central Europe was largely discredited, notably by the German archaeologist A. Häusler. Jim Mallory’s masterful 1989 review of Indo-European archaeology retained Gimbutas’s steppe homeland and her three waves as periods of increased movement in and around the steppes, but he was much less optimistic about linking specific archaeological cultures with specific migrations by specific Indo-European branches. Others, myself included, criticized both Gimbutas’s archaeology and Bibikova’s interpretation of the Dereivka horses. In 1990 Marsha Levine seemed to nail the coffin shut on the horse-riding, Kurgan-culture invasion hypothesis when she declared the horse age and sex ratios at Dereivka to be consistent with a wild, hunted population.22
Brown and I visited the Institute of Zoology in Kiev in 1989, the year after Levine, learning of her trip only after we arrived. With the cheerful help of Natalya Belan, a senior zoologist, we made molds of dozens of horse P2s from many archaeological sites in Ukraine. We examined one P2 from Early Eneolithic Varfolomievka in the Caspian Depression (no wear), one from the Tripolye A settlement of Luka Vrublevetskaya (no wear), several from Mesolithic and Paleolithic sites in Ukraine (no wear), many from Scythian and Roman-era graves (a lot of bit wear, some of it extreme), and, from Dereivka, the P2s of the cult stallion and four other horses. As soon as we saw the Dereivka cult stallion we knew it had bit wear. Its P2s had bevels of 3.5 mm and 4 mm, and the enamel on the first cusp was deeply abraded. Given its stratigraphic position at the base of a Late Eneolithic cultural level almost 1 m deep, dated by ten radiocarbon dates to 4200–3700 BCE, the cult stallion should have been about two thousand years older than the previously known oldest evidence for horseback riding. Only four other P2s still survived in the Dereivka collection: two deciduous teeth from horses less than 2.5 years old (not measurable), and two others from adult horses but with no bit wear. So our case rested on a single horse. But it was very clear wear—surprisingly similar to modern metal bit wear. In 1991 we published articles in Scientific American and in the British journal Antiquity announcing the discovery of bit wear at Dereivka. Levine’s conclusion that the Dereivka horses were wild had been published just the year before. Briefly we were too elated to worry about the argument that would follow.23
It began when A. Häusler challenged us at a conference in Berlin in 1992. He did not think the Dereivka stallion was Eneolithic or cultic; he deemed it a Medieval garbage deposit, denying there was evidence for a horse cult anywhere in the steppes during the Eneolithic. That the wear looked like metal bit wear was part of the problem, since a metal bit was improbable in the Eneolithic. Häusler’s target was bigger than bit wear or even horse domestication: he had dedicated much of his career to refuting Gimbutas’s “Kurgan-culture” migrations and the entire notion of a steppe Indo-European homeland.24 The horses at Dereivka were just a small piece in a larger controversy. But criticisms like his forced us to obtain a direct date on the skull itself.
Telegin first sent us a bone sample from the same excavation square and level as the stallion. It yielded a date between 90 BCE and 70 CE (OxA 6577), our first indication of a problem. He obtained another anomalous radiocarbon date, ca. 3000 BCE, on a piece of bone that, like our first sample, seems not to have been from the stallion itself (Ki 5488). Finally, he sent us one of the bit-worn P2s from the cult stallion. The Oxford radiocarbon laboratory obtained a date of 410–200 BCE from this tooth (OxA 7185). Simultaneously the Kiev radiocarbon laboratory obtained a date of 790–520 BCE on a piece of bone from the skull (Ki 6962). Together these two samples suggest a date between 800 and 200 BCE.
The stallion-and-dog deposit at Dereivka was of the Scythian era. No wonder it had metal bit wear—so did many other Scythian horse teeth. It had been placed in a pit dug into the Eneolithic settlement between 800 and 200 BCE. The archaeologists who excavated this part of the site in 1964 did not see the intrusive pit. In 2000, nine years after our initial publication in Antiquity, we published another Antiquity article retracting the early date for bit wear at Dereivka. We were disappointed, but by then Dereivka was no longer the only prehistoric site in the steppes with bit wear.25
Figure 10.9 Horse-related sites of Eneolithic or older age in the western and central Eurasian steppes. The steppe ecological zone is enclosed in dashed lines.
(1) Moliukhor Bugor; (2) Dereivka; (3) Mariupol; (4) Matveev Kurgan; (5) Girzhevo; (6) Kair Shak; (7) Dzhangar; (8) Orlovka; (9) Varfolomievka; (10) Khvalynsk; (11) S’yezzhe; (12) Tersek; (13) Botai
The oldest horse P2s showing wear facets of 3 mm and more are from the Botai and Tersek cultures of northern Kazakhstan (figure 10.9). Excavated through the 1980s by Victor Zaibert, Botai was a settlement of specialized hunters who rode horses to hunt horses, a peculiar kind of economy that existed only between 3700 and 3000 BCE, and only in the steppes of northern Kazakhstan. Sites of the Botai type, east of the Ishim River, and of the related Tersek type, west of the Ishim, contain 65–99.9% horse bones. Botai had more than 150 house-pits (figure 10.10) and 300,000 animal bones, 99.9% of them horse. A partial list of the other species represented at Botai (primarily by isolated teeth and phalanges) includes a very large bovid, probably bison but perhaps aurochs, as well as elk, red deer, roe deer, boar, bear, beaver, saiga antelope, and gazelle. Horses, not the easiest prey for people on foot, were overwhelmingly preferred over these animals.26
Figure 10.10 A concentration of horse bones in an excavated house pit at the Botai settlement in north-central Kazakhstan, dated about 3700–3000 BCE. Archaeozoologist Lubomir Peske takes measurements during an international conference held in Kazakhstan in 1995, “Early Horsekeepers of the Eurasian Steppe 4500–1500 BC.” Photo by Asko Parpola.
We visited Zaibert’s lab in Petropavlovsk, Kazakhstan, in 1992, again unaware that Marsha Levine had arrived the year before. Among the forty-two P2s we examined from Botai, nineteen were acceptable for study (many had heavily damaged surfaces, and others were from horses younger than three years old). Five of these nineteen teeth, representing at least three different horses, had significant bevel measurements: two 3 mm, one 3.5 mm, one 4 mm, and one 6 mm. Wear facets on undamaged portions of the Botai P2s were polished smooth, the same kind of polish created by “soft” bits in our experiment. The five teeth were found in different places across the settlement—they did not come from a single intrusive pit. The proportion of P2s exhibiting bit wear at Botai was 12% of the entire sample of P2s provided, or 26% of the nineteen measurable P2s. Either number was just too high to explain by appealing to a rare natural malocclusion (figure 10.11). We also examined the horse P2s from a Tersek site, Kozhai 1, dated to the same period, 3700–3000 BCE. At Kozhai 1 horses accounted for 66.1% of seventy thousand identified animal bones (others were saiga antelope at 21.8%, onager at 9.4%, and bison, perhaps including some very large domesticated cattle, at 2.1%). We found a 3 mm wear facet on two P2s of the twelve we examined from Kozhai 1. Most of the P2s at Botai and Kozhai 1 did not exhibit bit wear, but a small percentage (12–26%) did, consistent with the interpretation that the Botai-Tersek people were mounted horse hunters.27
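A back-of-envelope binomial calculation shows why: even granting (generously, given our control sample) a 1% natural rate of ≥3 mm bevels in never-bitted mature P2s, the Botai pattern would be wildly improbable by chance. The sketch below makes that arithmetic explicit; the independence assumption and the 1% rate are ours.

    from math import comb

    # P(at least k "natural" bevels of >= 3 mm among n mature P2s), assuming
    # independence and a natural rate p; p = 0.01 is generous, since no tooth
    # in our 105-tooth never-bitted control sample reached 3 mm.
    def prob_at_least(k, n, p):
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Five of the nineteen measurable Botai P2s had bevels of 3 mm or more.
    print(f"P(>=5 of 19 at p = 0.01) = {prob_at_least(5, 19, 0.01):.1e}")
    # About one in a million under that assumption: malocclusion alone cannot
    # plausibly account for the Botai sample.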
Figure 10.11 Three horse P2s with bit wear from the Botai settlement. The photos show extensive postmortem damage to the occlusal surfaces. The undamaged middle tooth showed smooth enamel surfaces but had a significant wear facet, like that of a horse ridden with a “soft” bit of rope or leather.
Botai attracted the attention of everyone interested in early horse domestication. Two field excavations by Western archaeologists (Marsha Levine and Sandra Olsen) have occurred at Botai or Botai-culture sites. The original excavator, Victor Zaibert, the Kazakh zoologist L.A. Makarova, and the American archaeozoologist Sandra Olsen of the Carnegie Museum of Natural History in Pittsburgh all concluded that at least some of the Botai horses were domesticated. In opposition, the archaeozoologists N. M. Ermolova, Marsha Levine, and the German team Norbert Benecke and Angela von den Driesch concluded that all the Botai horses were wild.28 Levine found some pathologies in the Botai vertebrae but attributed them to age. Benecke and von den Driesch showed that the Botai horses exhibited a narrow range of variability in size, like Paleolithic wild populations. The ages and sexes of the Botai horses were typical of a wild population, with a 1:1 ratio between the sexes, including all age groups, even colts and pregnant mares with gestating fetuses. Everyone agrees that whole herds of wild horses were killed by the Botai people, using herd-driving hunting techniques that had never been used before in the Kazakh steppes, certainly not on this scale. Were the hunters riding or on foot? Native American hunters on foot drove bison herds over cliffs before the introduction of horses to the Americas by Europeans, so herd driving was possible without riding.
Sandra Olsen of the Carnegie Museum concluded that at least some Botai horses were used for transport, because whole horse carcasses were butchered regularly over the course of several centuries in the settlement at Botai.29 How would pedestrian hunters drag eight-hundred-pound carcasses to the settlement, not just once or twice but as a regular practice that continued for centuries? Pedestrian hunters who used herd-driving hunting methods in the European Paleolithic at Solutré (where Olsen had worked earlier) and in the North American Plains butchered large animals where they died at the kill site. But the Botai settlement is located on the open, south-facing slope of a broad ridge top in a steppe environment—wild horses could not have been trapped in the settlement. Either some horses were tamed and could be led into the settlement or horses were used to drag whole carcasses of killed animals into the settlement, perhaps on sleds. Olsen’s interpretation was supported by soil analysis from a house pit at Botai (Olsen’s excavation 32) that revealed a distinct layer of soil filled with horse dung. This “must have been the result of redeposition of material from stabling layers,” according to the soil scientists who examined it.30 This dung-rich soil was removed from a horse stable or corral. The stabling of horses at Botai obviously suggests domestication.
One more argument for horseback riding is that the slaughter of wild populations with a 1:1 sex ratio could only be achieved by sweeping up both stallion-with-harem bands and bachelor bands, and these two kinds of social groups normally live far apart in the wild. If stallion-with-harem bands were driven into traps, the female:male ratio would be more than 2:1. The only way to capture both bachelor bands and harem bands in herd drives is to actively search and sweep up all the wild horses in a very large region. That would be impossible on foot.
Finally, the beginning of horseback riding provides a good explanation for the economic and cultural changes that appeared with the Botai-Tersek cultures. Before 3700 BCE foragers in the northern Kazakh steppes lived in small groups at temporary lakeside camps such as Vinogradovka XIV in Kokchetav district and Tel’manskie in Tselinograd district. Their remains are assigned to the Atbasar Neolithic.31 They hunted horses but also a variety of other game: short-horned bison, saiga antelope, gazelle, and red deer. The details of their foraging economy are unclear, as their camp sites were small and ephemeral and have yielded relatively few animal bones. Around 3700–3500 BCE they shifted to specialized horse hunting, started to use herd-driving hunting methods, and began to aggregate in large settlements—a new hunting strategy and a new settlement pattern. The number of animal bones deposited at each settlement rose to tens or even hundreds of thousands. Their stone tools changed from microlithic tool kits to large bifacial blades. They began to make large polished stone weights with central perforations, probably for manufacturing multi-stranded rawhide ropes (weights are hung from each strand as the strands are twisted together). Rawhide thong manufacture was one of the principal activities Olsen identified at Botai based on bone tool microwear. For the first time the foragers of the northern Kazakh steppes demonstrated the ability to drive and trap whole herds of horses and transport their carcasses into new, large communal settlements. No explanation other than the adoption of horseback riding has been offered for these changes.
The case for horse management and riding at Botai and Kozhai 1 is based on the presence of bit wear on seven Botai-Tersek horse P2s from two different sites, carcass transport and butchering practices, the discovery of horse-dung–filled stable soils, a 1:1 sex ratio, and changes in economy and settlement pattern consistent with the beginning of riding. The case against riding is based on the low variability in leg thickness and the absence of riding-related pathologies in a small sample of horse vertebrae, possibly from wild hunted horses, which probably made up 75–90% of the horse bones at Botai. We are reasonably certain that horses were bitted and ridden in northern Kazakhstan beginning about 3700–3500 BCE.
Horseback riding probably did not begin in northern Kazakhstan. The Botai-Tersek people were mounted foragers. A few domesticated cattle (?) bones have been found in some Tersek sites, but there were none in Botai sites, farther east; and neither culture had sheep.32 It is likely that the Botai-Tersek people acquired the idea of domesticated animal management from their western neighbors, who had been managing domesticated cattle and sheep, and probably horses, for a thousand years before 3700–3500 BCE.
The evidence for riding at Botai is not isolated. Perhaps the most interesting parallel from beyond the steppes is a case of severe wear, with a bevel much deeper than 3 mm, on the mesial P2 of a five-year-old stallion jaw excavated from Late Chalcolithic levels at Mokhrablur in Armenia, dated 4000–3500 BCE. This looks like another case of early bit wear, perhaps even older than Botai, but we have not examined it for confirmation.33 Also, after about 3500 BCE horses began to appear in greater numbers, or appeared regularly for the first time, outside the Pontic-Caspian steppes. Between 3500 and 3000 BCE horses began to show up regularly in settlements of the Maikop and Early Transcaucasian Culture (ETC) in the Caucasus, and also for the first time in the lower and middle Danube valley in settlements of the Cernavoda III and Baden-Boleraz cultures, as at Cernavoda and Kétegyháza. Around 3000 BCE horse bones rose to about 10–20% of the bones in Bernburg sites in central Germany and to more than 20% of the bones at the Cham site of Galgenberg in Bavaria. The Galgenberg horses included a small native type and a larger type probably imported from the steppes. This general increase in the importance of horses from Kazakhstan to the Caucasus, the Danube valley, and Germany after 3500 BCE suggests a significant change in the relationship between humans and horses. Botai and Tersek show what that change was: people had started to ride.34
Over the long term it would have been very difficult to manage horse herds without riding them. Anywhere that we see a sustained, long-term dependence on domesticated horses, riding is implied for herd management alone. Riding began in the Pontic-Caspian steppes before 3700 BCE, or before the Botai-Tersek culture appeared in the Kazakh steppes. It may well have started before 4200 BCE. It spread outside the Pontic-Caspian steppes between 3700 and 3000 BCE, as shown by increases in horse bones in southeastern Europe, central Europe, the Caucasus, and northern Kazakhstan.
A person on foot can herd about two hundred sheep with a good herding dog. On horseback, with the same dog, a single person can herd about five hundred.35 Riding greatly increased the efficiency, and therefore the scale and productivity, of herding in the Eurasian grasslands. More cattle and sheep could be owned and controlled by riders than by pedestrian herders, which permitted a greater accumulation of animal wealth. Larger herds, of course, required larger pastures, and the desire for larger pastures would have caused a general renegotiation of tribal frontiers, a series of boundary conflicts. Victory in tribal warfare depended largely on forging alliances and mobilizing larger forces than one’s enemy, and so intensified warfare stimulated efforts to build alliances through feasts and the redistribution of wealth. Gifts were effective both in building alliances before conflicts and in sealing agreements after them. An increase in boundary conflicts would thus have encouraged more long-distance trade to acquire prestigious goods, as well as elaborate feasts and public ceremonies to forge alliances. This early phase of conflict, caused partly by herding on horseback, might be visible archaeologically in the horizon of polished stone mace-heads and body decorations (copper, gold, boar’s-tusk, and shell ornaments) that spread across the western steppes with the earliest herding economies about 5000–4200 BCE.36
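The leverage riding gave a herding household is, again, simple arithmetic. A minimal sketch (in Python; the 200 and 500 figures come from the text above, while the stocking rate is a purely hypothetical round number) shows how a 2.5-fold jump in herding capacity translates directly into pressure on pasture boundaries:

```python
# Back-of-the-envelope sketch of the herding-scale argument.
# Only the 200 and 500 sheep-per-herder figures come from the text;
# the stocking rate below is an assumed, illustrative value.

SHEEP_PER_HERDER_ON_FOOT = 200
SHEEP_PER_HERDER_MOUNTED = 500
HECTARES_PER_SHEEP = 1.0  # hypothetical steppe stocking rate

def pasture_needed(herders, sheep_per_herder,
                   ha_per_sheep=HECTARES_PER_SHEEP):
    """Total pasture (in hectares) a group of herders must control."""
    return herders * sheep_per_herder * ha_per_sheep

# The same five herders, with the same dogs:
print(pasture_needed(5, SHEEP_PER_HERDER_ON_FOOT))  # 1000.0 ha on foot
print(pasture_needed(5, SHEEP_PER_HERDER_MOUNTED))  # 2500.0 ha mounted
```

Whatever the true stocking rate, the ratio is fixed: mounted herders need two and a half times the pasture of pedestrian herders, which is exactly the kind of pressure on tribal frontiers described here.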
Horses were valuable and easily stolen, and riding increased the efficiency of stealing cattle. When American Indians in the North American Plains first began to ride, chronic horse-stealing raids soured relationships even between tribes that had been friendly. Riding also was an excellent way to retreat quickly; often the most dangerous part of tribal raiding on foot was the running retreat after a raid. Eneolithic war parties might have left their horses under guard and attacked on foot, as many American Indians did in the early decades of horse warfare in the Plains. But even if horses were used for nothing more than transportation to and from the raid, the rapidity and reach of mounted raiders would have changed raiding tactics, status-seeking behaviors, alliance-building, displays of wealth, and settlement patterns. Thus riding cannot be cleanly separated from warfare.37
Many experts have suggested that horses were not ridden in warfare until after about 1500–1000 BCE, but they failed to differentiate between mounted raiding, which probably is very old, and cavalry, which was invented in the Iron Age after about 1000 BCE.38 Eneolithic tribal herders probably rode horses in inter-clan raids before 4000 BCE, but they were not like the Huns sweeping out of the steppes on armies of shaggy horses. What is intriguing about the Huns and their more ancient cousins, the Scythians, is that they formed armies. During the Iron Age the Scythians, essentially tribal in most other aspects of their political organization, became organized in their military operations like the formal armies of urban states. That required a change in ideology—how a warrior thought about himself, his role, and his responsibilities—as well as in the technology of mounted warfare—how weapons were used from horseback. Probably the change in weapons came first.
Mounted archery probably was not yet very effective before the Iron Age, for three reasons. First, the bows reconstructed from their traces in steppe Bronze Age graves were from over 1 m up to 1.5 m (almost five feet) long, which would clearly have made them clumsy to use from horseback. Second, the arrowheads were chipped from flint or made from bone in widely varying sizes and weights, implying a nonstandardized, individualized array of arrow lengths and weights. Finally, the bases of most arrowheads were made to fit into a hollow or split shaft, which weakened the arrow or required a separate hollow foreshaft for the attachment of the point. The more powerful the bow, and the harder the impact on striking the target, the more likely an arrow whose shaft had already been split to secure the point was to split apart. Stemmed and triangular flint points, common before the Iron Age, were made either to be inserted into a separate foreshaft with a hollow socket of reed or wood (stemmed points) or to be set into a split shaft (triangular points). The long bows, irregular arrow sizes, and less-than-optimal attachments between points and shafts together reduced the military effectiveness of early mounted archery. Before the Iron Age mounted raiders could harass tribal war bands, disrupt harvests in farming villages, or steal cattle, but that is not the same as defeating a disciplined army. Tribal raiding by small groups of riders in eastern Europe posed no threat to the walled cities of Mesopotamia, and so was ignored by the kings and generals of the Near East and the eastern Mediterranean.39
The invention of the short, recurved, composite bow (the “Cupid’s bow”) around 1000 BCE made it possible for riders to carry a powerful bow short enough to swing over the horse’s rear. For the first time arrows could be shot behind the rider with penetrating power. This maneuver, later known as the “Parthian shot,” was immortalized as the iconic image of the steppe archer. Cast bronze socketed arrowheads of standard weights and sizes also appeared in the Early Iron Age. A socketed arrowhead did not require a split-shaft mount, so arrows with socketed heads did not split despite the power of the bow; nor did they need a separate foreshaft, so arrows could be simpler and more streamlined. Reusable moulds were invented so that smiths could produce hundreds of socketed arrowheads of standard weight and size. Archers now had a much wider field of fire—to the rear, the front, and the left—and could carry dozens of standardized arrows. An army of mounted archers could now fill the sky with arrows that struck with killing power.40
But organizing an army of mounted archers was not a simple matter. The technical advances in bows, arrows, and casting were meaningless without a matching change in mentality, in the identity of the fighter, from a heroic single warrior to a nameless soldier. An ideological model of fighting appropriate for a state had to be grafted onto the mentality of tribal horseback riders. Pre-Iron-Age warfare in the Eurasian steppes, from what we can glean from sources like the Iliad and the Rig Veda, probably emphasized personal glory and heroism. Tribal warfare generally was conducted by forces that never drilled as a unit, often could choose to ignore their leaders, and valued personal bravery above following orders.41 In contrast, the tactics and ideology of state warfare depended on large disciplined units of anonymous soldiers who obeyed a general. These tactics, and the soldier mentality that went with them, were not applied to riders before 1000 BCE, partly because the short bows and standardized arrows that would make mounted archery truly threatening had not yet been invented. As mounted archers gained in firepower, someone on the edge of the civilized world began to organize them into armies. That seems to have occurred about 1000–900 BCE. Cavalry soon swept chariotry from the battlefield, and a new era in warfare began. But it would be grossly inappropriate to apply that later model of mounted warfare to the Eneolithic.
Riding began in the region identified as the Proto-Indo-European homeland. To understand how riding affected the spread of Indo-European languages we have to pick up the thread of the archaeological narrative that ended in chapter 9.