Battle

On April 9, 2005, 2.6 million American cable-TV viewers were treated to a fifteen-minute spectacle of blood and carnage: the finale of Spike TV’s new reality show The Ultimate Fighter. Two young contestants—a Chicago personal trainer called Stephan “The American Psycho” Bonnar, and an ex-policeman from Georgia, Forrest Griffin—entered the cage to trade punches, kicks, and knee-strikes in an all-out brawl that left the face of the winner, Griffin, looking like a promo shot from Stephen King’s Carrie. Forrest Griffin would go on to a highly successful career in no-holds-barred martial arts (he later knocked out Brazilian fighter Edson Paredao in their “Heat Fighting Championship” bout after Paredao had fractured Griffin’s arm with a torturous “joint lock”), and Bonnar scored a fight contract too, but the real winner was the supposedly brutal new sport of ultimate fighting. Within three years the Ultimate Fighting Championship, or UFC, would displace the National Hockey League championship as the fourth-most popular sports event in the United States, and attain a regular audience of over 4.5 million. The UFC today is a rolled-gold, blood-spattered American success story, and its president, Dana White, credits it all to the impact of that Griffin vs. Bonnar fight.

What makes the story truly remarkable is that just ten years earlier the sport looked dead and buried. The original UFC had started in 1993 as a boy’s-own fantasy fight club pitting martial artists from different schools against one another to see who would win a real-life, no-rules street fight. But the apparently brutal mismatches that resulted—as when 176-pound Brazilian Jiu-Jitsu fighter Royce Gracie battled 250-pound American wrestler Dan Severn—led U.S. Senator John McCain to campaign successfully for the competition to be banned in 37 U.S. states (though the smaller Gracie, in fact, won that fight). The competition was even dropped from its cable-TV lifeline for almost five years. Under Dana White’s patient tutelage, however, it managed to rehabilitate itself. By introducing weight divisions, gloves, and rules banning strikes to the eyes, groin, and back of the head, the UFC was eventually able to obtain licenses from almost every American state’s governing athletic commission. UFC fights now take place in 39 American states, and have not only returned to cable TV but have also spawned several imitators on major free-to-air networks.

What is the secret of the UFC’s phenomenal appeal? Fans of Mixed Martial Arts, or MMA as ultimate-fighting enthusiasts prefer to call it, insist it is the superb athleticism of the combatants. That MMA fighters are extremely fit is undoubtedly true: I myself once watched a 265-pound giant Tongan UFC gladiator, Australian-born Soa “The Hulk” Palelei, train for a super-heavyweight “King of the Cage” title fight, including running speed ladders that would have left many a sprinter gasping in his wake. (Some UFC training is, however, quite unconventional: at one point I heard a strange thump and looked up to see Soa, complete with green mohawk and tribal tattoos, smashing a sledgehammer repeatedly into a giant tractor tire.) Yet the UFC’s marketing tells a different tale. UFC fights are given apocalyptic names—“Judgement Day,” “Nemesis,” “Shootout,” “The Uprising,” and so on—which hark back to the competition’s original tagline: “There Are No Rules!” Such names clearly aim to invoke an inbuilt male love of uninhibited violence. Anyone who doubts that there is such an inbuilt love should read testimony from those English soccer hooligans who say things such as: “I get so much pleasure when I’m having aggro [a fight] that I nearly wet my pants,” and “For me fighting is fun. I feel a great emotion when I hear the other guy scream in pain.”1 The genius of the UFC is to cater to this instinctual delight by providing a legalized fantasy spectacle of men pitted against one another in caged fights to the possible death.

Except that…nobody ever actually dies. Or even (Forrest Griffin notwithstanding) gets particularly injured.

I first stumbled across this intriguing little tidbit on that same research visit to Soa Palelei’s “King of the Cage” event. Soa’s fight had a solid undercard of seven warm-up bouts, ranging from featherweight to the heavyweights just below him, and I managed to sneak into the fighters’ rooms and interview every one, beaten or victorious, as they walked, limped, or were dragged in afterward. They were incredibly gracious to talk to me, considering the hammering some had just taken and the craziness of my questions, but the consistent refrain I received was that not one had ever been seriously injured in any MMA fight. Not even the guy I found mashing his purple, welted face into a handful of ice had anything to report—“Nah, mate, this shiner’s the worst I’ve ever had, and even it’ll be a memory by Sunday.” Considering the carnage I’d been expecting, this was a puzzle, but I thought maybe it was just because these were younger fighters, too green to have been really hammered yet. So I slipped into the heavyweights’ rooms to interview Brian Ebersole, a veteran cage fighter with more than fifty light-heavyweight bouts under his belt. Surely there would be tales of horror and maiming somewhere among those? I found Ebersole, whom I’d met earlier at the fight weigh-in, flat on his back, preparing for his fight by meditating (to keep his heart rate down) and smearing his face with Vaseline.

“Hey, man, what’s with the Vas?” I asked.

“Protects the face from cuts,” he grunted. “The punches just slide off, and it keeps the edge of the glove from tearing your skin, too.” (MMA fighters wear lightly padded, fingerless gloves that allow their hands freedom to grapple and choke.)

Aha! Finally, I was getting somewhere. Incredibly, however, Ebersole also claimed his fifty-plus cage fights had left him completely unscarred.

“I’ve had a couple of sprains, but no breaks to speak of,” he shrugged.

“Not even a broken nose?” I pleaded. By this stage I was getting desperate.

Ebersole just laughed. “It usually cops some hits and bleeds a bit,” he said, turning so I could see its straight profile. “But this baby’s never even been bent, and I can guarantee you it won’t be tonight either.”

Sure enough, half an hour and three brutal, flailing rounds later, Ebersole returned to the rooms victorious—his face spattered with his own bright, arterial nasal blood, but the nose itself still proudly unbroken.

A review of the medical research quickly confirmed this puzzling finding: ultimate fighting really is a ridiculously safe form of combat. A 2007 study of competitive-fighting injuries as reported by the emergency departments of 100 American hospitals, for example, found that martial arts resulted in far fewer injuries than either wrestling or boxing.2 (Competitive basketball, incredibly, has an injury rate seven times that of martial arts.) A mere 1 percent of those martial arts injuries were serious enough to require hospitalization. True, other studies have found more comparable rates—such as a 2008 survey of the 635 official MMA fights that took place in Nevada between 2002 and 2007, which found a rate of 23.6 injuries per 100 fights, compared to roughly 25 per 100 bouts in boxing3—but this disguises the fact that boxing injuries tend to be far more severe than those suffered in the ultimate-fighting cage. The 2007 American emergency-room study found that while almost 90 percent of boxing injuries were to the head and upper body (the most dangerous injury sites), less than 50 percent of martial arts injuries were to these areas. This seems to tally with the results of a 2006 Johns Hopkins Medical School study that found the knockout and concussion rates in MMA fights to be about half those of boxing bouts. This may sound, to some ears, a little strange: why should boxing with gloves lead to more head injuries than the bare-knuckle brawling of ultimate fighting? The secret lies in our misunderstanding of what gloves are for—they don’t so much protect the punchee’s head as the puncher’s hand, allowing him to strike that much harder. Former UFC gladiator Ken Shamrock confirms this, saying:

When fighting bare-knuckle, if you slam your fist again and again into the head or face of your opponent, all you will do is to fracture your hand. Trust me here. I have done this more than once.4

By way of further proof, the UFC’s introduction of light gloves in 1997 was immediately followed by an increase in the (admittedly still low) rate of fights ending in knockouts.

It is probably this low incidence of knockouts, coupled with strict rules that lead to early referee intervention, that accounts for the virtually zero mortality rate in modern ultimate fighting. The UFC itself has never had a death in the octagon (the UFC’s ring), although boxing over the same period worldwide has seen more than eighty.5 True, there have been three fatalities in non-UFC MMA competitions, but each occurred in unusual circumstances. American MMA fighter Douglas Dedge, for example, who died from brain injuries after a no-holds-barred fight in Ukraine in 1998, seems to have had a pre-existing condition, possibly a skull fracture, that caused doctors to strongly advise him against fighting.6 Similarly, a Korean fighter called Lee died in an event called “Gimme 5” in 2004, but from a heart attack rather than any injuries. The one confirmed death in an officially licensed bout, that of Texan Sam Vasquez in 2007 by subdural brain hemorrhage, was found to have most probably been caused by a collision with a cage post rather than strikes to the head (the Texan Medical Advisory Committee later recommended the adoption of standardized post padding). Another probable factor in the UFC’s so-far-nonexistent death rate is the use of choking and strangulation techniques borrowed from judo. Though these sound, and look, brutal (the triangle choke, for example, involves strangling the opponent by crushing his neck in the crook of one leg, thereby squeezing his carotid arteries and shutting down his brain’s blood supply), they rarely result in injuries other than temporary unconsciousness. A 1998 study in the Italian Journal of Neuropsychology, for example, found that long-time judo practitioners showed almost no later loss of brain function, whereas as many as 87 percent of professional boxers did.7

Clearly, in terms of sustaining serious injuries, modern no-holds-barred fighting is about as ultimate as sword fighting with cardboard sabers. Yet has mano-a-mano fighting always been this distressingly wussy? What about in days gone by—were deaths and injuries more common in ancient ultimate fighting?

Curiously, there was such a sport in the Western tradition, and it even went by a very similar name. In 648 BCE a no-holds-barred combat sport called the Pankration (“All Powerful” or “Anything Goes”) was introduced to the Greek Olympic Games, eventually becoming so popular that it was turned into the games’ finale. The Pankration was a brutal striking and grappling fight with no time limits, no weight divisions, and just two rules: no eye gouging and no biting (the Spartans, true to form, permitted even these). Everything else was allowed—breaking limbs, strangulation, fish-hooking (inserting fingers into orifices and ripping), and the wrenching and breaking of fingers and toes. (A Greek vase dated to 520 BCE, for example, clearly shows one pankratiast kicking another in the testicles.) A careful reading of Greek literature shows that the “ultimate” tag was, in this case, completely justified: ancient pankratiasts really did often pay the ultimate price for entering the arena.

A volume of Olympian Odes by the sixth-century BCE Greek poet Pindar, for example, records that “very many [pankratiasts] died in the contests.”8 A more specific reference is found in the later works of Philostratus, who records a letter from a trainer to his pankratiast pupil’s mother, telling her that “if you should hear your son has died, believe it.”9 The great first-century Greco-Jewish philosopher Philo agreed, writing that “many times [wrestlers and pankratiasts] endured to the death.”10 Philo even cites the amazing tale of two pankratiasts who attacked each other so ferociously that both died simultaneously. This is possibly an exaggeration, since Philo doesn’t claim to have witnessed the fight himself, yet there is one undeniably genuine instance of a pankratiast dying just as he secured victory: the case of Arrichion, a three-time victor in the Olympic Pankration, who won his third crown by surviving strangulation just long enough to dislocate or break his opponent’s ankle (the sources here are unclear) but then died. The judges crowned his corpse the winner. Given that all these references were to deaths that took place at the actual Olympics (which, though famous, were just one of a multitude of Greek games at which the Pankration was fought), it seems likely that the mortality rate in ancient Greek ultimate fighting was very high indeed.

Perhaps the final proof of its lethality, though, is the feared pankratiast Dioxippus, who in 336 BCE won by default since nobody dared even to get into the arena with him. Clearly, would-be competitors understood that losing to this champion meant not just defeat, but probably death.

Incredibly, however, Pankration was not even the most lethal ancient Greek Olympic combat sport. That honor, as in modern times, goes to boxing. One competitor in both Pankration and boxing, Kleitomachos of Thebes, for example, asked Olympic officials to hold the boxing after the Pankration, since he had a greater chance of being wounded or killed in the former than the latter.11 Once again, the ancient Greek literature confirms his concerns were only too well founded. Four definite references to deaths in Greek athletic boxing have come down to us (the true number was certainly far higher), some of them stunningly gruesome. One work by Greek writer and geographer Pausanias, for instance, describes the victory (but then disqualification) in the 496 or 492 BCE Olympic Games of the boxer Cleomedes, who killed his opponent, Iccus, by driving his hand into his stomach and disemboweling him.12 This sounds incredible, but ancient Greek boxers were, in fact, allowed a variety of lethal, bare-handed chops, slaps, and strikes forbidden to modern fighters. There is also evidence from Asian martial arts that such a feat is possible: tae kwon do masters maintain that a “spear-hand” thrust, followed by savage grasping and tearing at the impact point, can rip skin and muscle.13 Another possibility is that Cleomedes penetrated Iccus’s chest cavity, which is much more easily ruptured. But perhaps the final proof of Cleomedes’ gory feat is the fact that the very same thing happened again a century later (this time at the Nemean Games) when a boxer called Damoxenus, in a penalty punch-off after a grueling, day-long struggle, killed his opponent, Creugas, with another disemboweling spear-hand to the gut.14 The deceased Creugas, like Iccus before him, was posthumously crowned victorious, but this, too, demonstrates how lethal Greek boxing was—Cleomedes and Damoxenus were stripped of their crowns not for killing but for foul blows.

Those who did survive the boxing or Pankration seem, moreover, to have frequently suffered severe injuries. Although no skeletons of confirmed ancient boxers or pankratiasts have yet been found, the literary record makes clear that broken bones were very common among them. Apart from the ankle that Arrichion possibly fractured, we also have records of a man called Sostratus who won three Olympic Pankration competitions by simply grabbing his opponents’ fingers and bending them back until they either broke or the pain became unbearable.15 This, in the modern UFC, is called small-joint manipulation, and is outlawed due to the crippling injuries it causes (wrist fractures, apparently, are a frequent result). The Greco-Roman father of modern medicine, Galen, writes disparagingly of pankratiasts whose eyes have been knocked out. Vase paintings dating back to the sixth century BCE show that despite the ban on gouging, fighters could punch with thumbs and fingers extended. Boxing injuries seem to have been even more catastrophic—so bad that a second-century manual on dream interpretation, the Oneirocritica, lists dreams of boxing as bad omens foreshadowing serious bodily harm. The face of a famous first-century BCE bronze statue of a boxer found in Rome bears a broken nose, cauliflower ears, and numerous gaping cuts—the Greeks jokingly called these last “ant tracks.” The statue’s hands show where such cuts came from: Greek boxers wore sharp leather thongs wrapped around their knuckles, not to protect their opponent’s head, but to damage it. (The facial wounds that resulted were so severe that one ancient Greek boxer apparently failed to inherit his dead father’s estate because he no longer resembled his own portrait enough to prove his identity.) Even these cruel instruments, however, were nothing compared to the barbaric refinements that the Romans introduced into boxing. Their gloves, the infamous caestus (also known as “limb-breakers”), featured projecting metal spikes, lumps of lead sewn into them, and jutting metal plates with serrated, saw-like edges. Fights using these must have caused almost as many violent injuries as those suffered in straight-out gladiatorial contests.

Given the nonexistent mortality rate of modern ultimate fighting, it seems positively cruel to compare it with actual Roman gladiatorial death fights—truly the ultimate in ultimate fighting. There’s also the fact that gladiators, unlike UFC fighters, used weapons. Yet boasts by UFC competitors of their willingness to die in the octagon make at least one comparison fair: how does their supposed readiness to face death compare to that of the Roman gladiators? Clearly, for a start, their follow-through doesn’t quite measure up to their hype. UFC star Ken Shamrock, for example, once vowed at a UFC fight to “get my respect or die” (he presumably got his respect, since he is alive and fighting at the time of writing). Then there is Aleksander Emelianenko, an expert in the Russian combat sport of Sambo and likewise still alive, who once told an interviewer he was ready to fight “…on foot or on horseback. With maces or poleaxes. To first blood or to death.” Such vows sound impressive, until you compare them to the sacramentum gladiatorum, the sacred oath in which Roman gladiators swore uri, vinciri, verberari, ferroque necari patior: “I will endure being burnt, bound with chains, beaten, and put to death by the sword.”16 What’s more, the gladiators really did follow through. Even low estimates of gladiatorial mortality rates, taken from the least bloodthirsty periods of the empire’s history, put their chances of dying in the arena at one in every nine appearances. Nor were these quick, simple deaths in the heat of combat. If given the crowd’s shouted order IUGULA! (“lance him through”), the defeated gladiator was expected to kneel, clasp his opponent’s thigh, and offer his neck as the victor drove a sword into it or cut his throat. He was also required to maintain a stoic silence, neither screaming nor begging for mercy. Nor did his travails end there. If he somehow managed to hang onto life through this treatment he was dragged away through the “Gate of Death” by the libitinarii (“funeral men”) and then killed with a blow to the temple by a hammer-wielding servant playing the part of Dis Pater, the Roman god of the underworld. (A second-century gladiators’ cemetery recently excavated in Ephesus, Turkey, shows that 15 percent of gladiators received these blows.17)

Equally lethal hammers were employed in contests in prehistoric Australia, again by men demonstrating a readiness to die far exceeding that of modern UFC champions. A survey of ninety-four skulls of prehistoric Australian Aboriginal men held by the Adelaide Museum in South Australia, for example, showed fifty-four to have severe fractures from strikes by knobkerries, or fighting clubs.18 Remarkably, these probably came from the brutal dispute-resolution process first recorded by nineteenth-century anthropologist John Fraser, in which Aboriginal men took turns to kneel and receive a blow to the head, the loser being the first to die or otherwise become incapacitated. It is unclear exactly what the death rate was in these fights, but it must have been high. (As an interesting aside, this cultural practice, according to paleoanthropologist Peter Brown, may also have left its stamp on the skeletal form of modern Aboriginal people: they have the most robust skulls of any living Homo sapiens—possibly due to this selective pressure.)

Clearly, then, we modern males have mouths far bigger than our hearts—at least where fighting and dying are concerned. But what about the sheer love of the fight itself? Are we really the hot-headed, bare-knuckled brawlers, ready to fight at the drop of a hat, that we’re so often cracked up to be? Some media reports would have it so. English newspapers, for example, often describe the weekend streets of London as charnel houses of bloody, booze-fueled brawling. Some statistics, certainly, bear this out—violent incidents on streets and in hotels rose from 39 percent of all UK violence to 49 percent between 1996 and 2004.19 A 2006 study by the University of London, similarly, found that one in eight young men admitted having indulged in some form of recreational violence in the past year, and a five-year survey of fifty-eight British hospital emergency wards found that an average of 0.75 percent of the male population presented annually with injuries from assaults or other violence.20

These figures do seem to indicate an impressive level of aggression, but how does this aggression compare to the drunken brawling of earlier cultures? An exact comparison might seem impossible, since most such cultures have long since vanished, yet there is, surprisingly, one study that does allow a close evaluation. In the 1960s and 1970s anthropologist Mac Marshall conducted a study of male alcohol abuse and violence in the Truk group of the Micronesian Caroline Islands. Trukese culture was still heavily traditional at the time of Marshall’s arrival, partly due to the intense aggression of the islands’ men, who had long kept colonization at bay (Truk was known to ancient mariners as “dread Hogoleu”). Marshall described a violent drinking culture that set aside “battleground” areas in almost every village for drunken weekend brawling.21 Young men would strut these “battlegrounds” issuing high-pitched war cries, giving swinging kung-fu kicks (Bruce Lee had quickly been adopted as a warrior role model by aggressive young Trukese men), and generally seeking opportunities for violence. Clan loyalties meant that the inevitable fights quickly became all-in brawls involving multiple armed participants. While he gives no direct statistics, Marshall implies that participation in these brawls far exceeded the one-in-eight ratio of young British males cited above. Similarly, Marshall does not calculate injury and death rates, but the Trukese fondness for fearsome homemade weaponry (such as nunchaku made from steel pipes) coupled with their disregard of injury (Trukese warriors often deliberately sliced their own arms open to show their bravery to enemies) suggests that far more than 0.75 percent of the islands’ men acquired wounds severe enough for hospitalization in any given year—or indeed, on any given weekend.


City of fight

To the modern traveler Venice is a city of high culture: the “Queen of the Adriatic,” the “City of Light.” Particularly charming are its beautiful stone bridges such as the “Bridge of Sighs”—the covered limestone walkway from which criminals were given their last sight of the medieval city before they were thrown into the dungeon. Yet few know that these same bridges were once the scene of brutal mass pugni “fistfights” in which thousands of men from the city’s two main factions, the Castellani and Nicolotti, beat, stabbed, and drowned one another for fun. From 1369 to 1710 CE, great mobs of fishermen, arsenal workers, porters, and tanners held battagliole sui ponti, “little wars on the bridges,” for possession of the tiny stone arches that marked the boundaries of each faction’s territory. Such wars usually began with scores of mostre, or individual fistfights, in which champions such as Magnomorti, “Eats the Dead,” Zuzzateste, “Sucker of Heads,” and Tre Riose de Cul, “Three Asshole Roses,” i.e., “Three Farts,” fought to bloody their opponent’s face or throw him into the canal. Such brutal fist fests usually failed to satisfy the bloodlust of the tens of thousands of onlookers, though, leading them to take matters into their own hands by pelting the combatants with roof tiles and rushing onto the bridges with fists, sticks, and daggers flailing. As ferocious as these fights were, the pugni were an improvement on the earlier guerre di canne, in which fighters charged each other en masse with staves of cane sharpened and hardened by repeated dipping in boiling oil (fighters also wore specially designed armor and helmets). The battagliole sui ponti only began to decline in the 1650s, when the loss of their best fighters to the war with the Turks led the Castellani faction to suffer repeated defeats. The coup de grace came in 1705, when the brawlers proved so unwilling to desist from their recreational violence that they wouldn’t even save the church of San Girolamo from burning down. The city’s secretive governing body, the Council of Ten, didn’t see the joke and shut the pugni down five years later.


Even the Trukese, however, probably couldn’t match the aggression of another group of big-drinking brawlers—the pre-modern Irish. The Victorian-era boyos’ fondness for recreational violence was simply mind-boggling. Of the 1,932 homicides reported to police between 1866 and 1892, for example, 41 percent were from brawling for fun.22 At roughly 35 deaths per year (and no doubt countless injuries) this might seem unbelievably high, were it not for two factors. First, the Irish didn’t fight bare-handed, but with lethal stick weapons such as the lead-filled, knobbed, blackthorn-wood shillelagh—a cross between a walking stick and a long croquet mallet.23 Second, they didn’t fight alone, but in massive gangs, sometimes of hundreds or even thousands, known as factions. These armed hordes might form on the flimsiest of excuses: Limerick, for example, was the battleground of the “Three-Year-Olds” and “Four-Year-Olds,” who took opposing sides in a 30-year debate over the age of a calf. Equally absurd were the pretexts used to get a stick fight going—Irish author William Carleton describes challengers strutting before opposing factions at county fairs (a frequent venue for faction fights) swinging their shillelaghs and shouting, “Ram’s horns! Who dares say anything’s crookeder than ram’s horns?” or, “Black’s the white of my eye. Who dares say black’s not the white of my eye?” The fact that these were consensual fights, engaged in purely for fun, is shown by the refusal of Irish courts to convict those who killed in faction fights—only 8 percent ever received prison sentences of more than 2 years.

In the small wars of the arena and the street, then, we modern males obviously would have been judged unfit even for the reserves. But what about in real wars? How does the savagery of modern combat compare with that of ancient war? Again, media reports often give the impression that we modern, war-mongering males simply blow our ancient counterparts away. In 2006, for example, newspaper and TV outlets worldwide reported that the 2003 invasion of Iraq had officially become the most destructive war in America’s history, based on casualty estimates from a Johns Hopkins Bloomberg School of Public Health study published in The Lancet that same year.24 That study estimated 654,965 Iraqi civilians and combatants had perished as a result of coalition military activities in Iraq between March 2003 and July 2006—an overall death rate of 2.5 percent for those forty months, or 0.79 percent annually. (Meaning, basically, that almost 1 person in 100 died from military violence every year.) Several commentators pointed out that this appalling statistic represents almost double the percentage population loss of the United States in the American Civil War, making the second Gulf War the most violent in U.S. history.25 While The Lancet study’s figure is contentious, it seems fair to take it as a baseline for comparison.26 Such carnage, after all, seems only too believable in light of the devastating advances in the destructive power of modern weapons.
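
The annualization behind these figures is simple to check. The short Python sketch below reproduces it, assuming a pre-war Iraqi population of roughly 25 million; that population figure is chosen here purely for illustration and is not given in the text or attributed to The Lancet study.

    # Back-of-the-envelope check of the annualized death rate (illustrative only).
    # The population figure below is an assumption, not taken from the text.
    excess_deaths = 654_965        # The Lancet estimate, March 2003 to July 2006
    population = 25_000_000        # assumed pre-war Iraqi population (hypothetical)
    months = 40                    # March 2003 to July 2006

    overall_rate = excess_deaths / population    # about 0.026, i.e. roughly 2.5-2.6 percent
    annual_rate = overall_rate * 12 / months     # about 0.0079, i.e. roughly 0.79 percent per year

    print(f"overall: {overall_rate:.1%}, annualized: {annual_rate:.2%}")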

Remarkably, however, these concerns turn out to be simple reruns of arguments that have consumed every generation before us. Members of America’s “Greatest Generation” (those born in the first quarter of the twentieth century), for example, often proudly declare World War II, which they fought and won, the most devastating war in world history. In terms of percentage of population lost, however, this is not strictly true. Just 3 percent of the world’s 1938 population died in World War II—approximately 0.5 percent per year. This pales in comparison to the population loss suffered by Germany in the early seventeenth century due to the Thirty Years War, which may have reached 30 percent, or an average 1 percent of population per year.

The tendency to overestimate current catastrophe seems, in fact, to be a simple resurfacing of the old “Golden Age” concept first introduced by the Greek poet Hesiod, who wrote of an “Age of Gold” when men lived forever in perfect peace—vastly different to the degenerate Greeks of his day, who lived in a violent “Age of Iron.” Another way of putting it is that the atrocities each generation experiences directly burn far brighter in its memory than those it merely reads about. Though naïve, this is perhaps forgivable, considering exactly the same mistake was made by scientific anthropologists for most of the last century. Archaeologist Lawrence Keeley, in his book War Before Civilization, describes the mistaken notion of a “pacified past” that so blinkered twentieth-century archaeologists that they described finds of ruined Neolithic forts—walled with palisades and littered with the flint arrowheads of their attackers—as “symbolic” enclosures (the frequent scatterings of broken human bone were described as possible funeral by-products). In fact, as Keeley’s own research shows, the idea that prehistoric males were pacifists couldn’t be more wrong.

His survey of annual death rates from war among twenty-three prehistoric societies around the world, for instance, reveals an average annual mortality rate of 0.56 percent.27 This already seems appallingly close to The Lancet study’s figures, yet several statistical quirks indicate prehistoric casualty rates were often higher. First, 0.56 percent is an average rate—at least five of Keeley’s societies considerably exceed The Lancet study’s rate.28 The second quirk is that many of those groups that Keeley records as having a lower rate than wartime Iraq often maintained their rate for decades. Keeley describes two New Guinean societies, the Mae Enga and the Tauade, who averaged an annual death rate of 0.32 percent for over fifty years. Given that, at the time of writing, the casualty rate in Iraq is falling dramatically, it seems highly possible the peak casualty rate in the second Gulf War will be confined to the five years between 2003 and 2008. If so, it might take just another five years for Iraq’s annual war mortality rate to equal that of the Mae Enga and Tauade. Within another five (fifteen years in total) it might well fall to half. Even faced with the devastating power of U.S. cluster-bomb artillery shells and four-thousand-round-per-minute mini-guns, modern Iraqis might well have statistically better odds than the average ancient hunter–gatherer did of escaping death through military violence.
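
The averaging logic behind this projection can be made explicit. The sketch below simply spreads an assumed five-year peak at the 0.79 percent annual rate over progressively longer windows; the exact figures are illustrative of the reasoning, not the author’s own calculations.

    # Illustrative averaging of an assumed five-year casualty peak over longer windows.
    PEAK_RATE = 0.79   # percent per year, the Lancet-derived figure discussed above
    PEAK_YEARS = 5     # assumed duration of the peak (2003-2008), near zero afterward

    def average_rate(total_years: int) -> float:
        """Average annual war-death rate once the five-year peak is spread over total_years."""
        return PEAK_RATE * PEAK_YEARS / total_years

    for years in (5, 10, 15):
        print(f"averaged over {years} years: {average_rate(years):.2f}% per year")
    # prints roughly 0.79%, 0.40%, and 0.26% -- falling toward, then below, the 0.32%
    # maintained by the Mae Enga and Tauade for over fifty years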

Why was prehistoric warfare so lethal? One reason is the almost complete absence of prisoners of war. Keeley was unable to find more than a handful of prehistoric societies that took defeated warriors captive—the exceptions being those such as the Iroquois, who waged war specifically to assimilate prisoners, and the Meru herders of Kenya, who might ransom them for cattle. Most, however, killed prisoners outright. If they did save them, temporarily, it was usually for later torture, sacrifice, or trophy taking (such as in the case of the Colombian Cauca Valley chief who proudly showed Spanish explorers his collection of four hundred smoke-dried corpses of his victims, all arranged in gruesome poses with weapons).29 The usual aim of prehistoric warfare was simple annihilation—of the warrior himself and, sometimes, his entire social unit. Keeley reports, for example, that the subarctic Kutchin frequently sought to exterminate whole villages of their adversaries, the Mackenzie Eskimo, sadistically leaving just one male, “The Survivor,” alive to spread word of the massacre. Sometimes defeated warriors even faced the ultimate annihilation—they were eaten. Contrary to the belief that “culinary” cannibalism (eating human flesh for food, rather than ritual) was unheard of in prehistoric societies, eating the loser was clearly a major motivation for war among some groups. The return of a war party with bakolo—dead prisoners for eating—in Fiji, for instance, was a cause for wild celebration and feasting using specially carved “cannibal forks.” Anthropologist Robert Carneiro, reporting the case of one Fijian chief, Ra Undreundre, who “buried” over nine hundred of his enemies in his stomach, estimates that almost 100 percent of war dead in Fiji became food.30 English missionary Alfred Nesbitt Brown, similarly, described early nineteenth-century Maori war parties singing, on their way to the slaughter, of “how sweet the flesh of the enemy would taste.”31 Ethnographer Elsdon Best, likewise, confirmed that Maori war parties literally lived off their enemies, describing a procession he saw in which twenty female captives bore baskets heavy with the flesh of their murdered clansmen.


On the wings of eagles

Viking warriors didn’t just kick a man when he was down, they hacked him. Warriors unlucky enough to be bested by a ninth-century Scandinavian swordsman might find a fate worse than death awaited them: human sacrifice via the gruesome “Blood Eagle Rite.” Ancient Viking poetry gives us the history of this horrific ordeal. Ragnars Saga (“Hairy-Breeks’s Tale”) records that the English king of Northumbria, Aella, had the image of an eagle carved into his back with a sword point by the great Danish Viking Ivarr. Other sources also have the sadistic Ivarr pouring salt into the wound. A later epic poem, the Þáttr af Ragnars sonum, says that Aella’s spine was slashed, his ribcage ripped open and his still-breathing lungs pulled out to simulate an eagle’s wings. According to later sagas, other victims of the Blood Eagle included King Haraldr Harfagri of Norway, King Maelgualai of Ireland, and even King Edmund of England. Unfortunately, though, scholars have since pointed out that the supposed ritual may be a simple mistranslation of the original poet’s reference to the eagles that perched on Aella’s back and consumed his corpse. Still, whether dismembered by surgical broadsword or left for bird food, Aella’s fate shouts one message loud and clear: Ivarr, like any other Viking warrior, was best avoided.


Another reason for the high mortality rate of prehistoric war was the surprising effectiveness of primitive weaponry. Numerous authors have testified to the impressive speed and precision of the ancient Turko-Mongol composite bow. Unlike the slow and feeble early Western musket, the bow shot 10 projectiles per minute to ranges of over 550 yards (a stone monument found in Siberia records that in the 1220s Genghis Khan’s nephew, Yesüngge, hit a target there from 586 yards away using a composite bow).32 Yet even supposedly primitive bows could be devastatingly lethal. The simple flint arrowheads used throughout the prehistoric world, for example, had ragged, tearing edges sharper than modern steel. Keeley reports that in prehistoric North America these basic arrowheads were so deadly, and so commonly used, that up to 40 percent of all deaths were caused by them. Prehistoric warriors also took fiendishly inventive measures to increase killing power. Many groups barbed their arrowheads to make them difficult to remove, and deliberately weakened their shaft attachments so they would break off in the wound. The aggressive Mae Enga achieved the same effect by capping theirs with hollow cassowary claws, which would likewise be left in the body to fester. Numerous groups tipped their arrows with poisons, such as the muriju plant sap of the Kenyan Giriama (which could stop an elephant’s heart in hours) or the snake venom of the ancient Sarmatians. More fiendish again was the use of microbial poisons to cause blood poisoning. Shoshone Indians, for example, buried sheep intestines filled with blood, left them to rot, and then dug them up and smeared the septic ooze on their war arrows. New Guinean groups daubed theirs with grease or human excrement, or wrapped them in orchid fibers. These devices meant the projectile didn’t have to kill instantly; it could destroy enemies later through septicemia. Even such derided weapons as the primitive sling were, in reality, viciously effective. Ancient literary sources record that the Roman army (which used slingers as auxiliaries) only recruited those who could hit a target at 200 yards. Enemies greatly feared these whizzing projectiles which, unlike arrows and spears, couldn’t be seen and avoided. Though sling stones rarely killed outright, they stunned even armored victims sufficiently that they could then be dispatched with club or spear (or, as in prehistoric Tahiti, with daggers made from a stingray’s tail).

Yet another reason for the high mortality rate of prehistoric warfare was the failure to distinguish between soldiers and civilians. Any modern Western air force that conducts a bombing raid faces a severe grilling in the press if civilians are harmed. Yet numerous historical accounts confirm that in ancient times it wasn’t just defeated warriors who were slaughtered after a lost battle—everyone was. Victories often turned into annihilating rampages through the losers’ territory. Douglas Oliver, an anthropologist of the Pacific Islands, reports that after eighteenth-century Tahitian battles it was common to see “infants…transfixed to their mothers, or pierced through the head and strung on cords…[and] women…disembowelled and derisively displayed.”33 So many were killed in these rampages that the losers’ territory apparently often stank of death for weeks. Keeley, similarly, records archaeological evidence from a mass grave at Crow Creek in South Dakota, dated to circa 1325 CE, which shows that over five hundred men, women, and children (60 percent of the population) were massacred there. Only the very young women were spared, probably for incorporation into the victors’ tribe.

Even in those societies that did recognize a distinction between civilian and military, ancient warfare was an extraordinarily lethal business. Authors Richard Gabriel and Karen Metz calculate that defeated soldiers in ancient Sumerian, Assyrian, Egyptian, Greek, and Roman armies had an average 37.7 percent chance of dying on the battlefield.34 (Death rates among victorious armies, at an average 5.5 percent, were much lower because the greatest slaughter always occurred when the losing side broke and ran.) Losses in some battles were truly catastrophic, as in the famous battle at Cannae, where the Romans lost seventy thousand men, or 95 percent of their engaged force. How does this compare to modern soldiering? Gabriel and Metz report that the death rate for American soldiers in twentieth-century wars averages 23–24 percent (with a dip to 14 percent during the Korean War due to the introduction of body armor). Interestingly, applying their method of analysis to the Iraq War gives a very similar death rate of 28.77 percent of American combat troops (at the time of writing). Though this is already about a quarter lower than the death rate of ancient soldiers, it doesn’t sound dramatically less dangerous. There is, however, a secret buried in Gabriel and Metz’s approach—it drastically overestimates the death rate of modern soldiers by ignoring the modern military’s “teeth-to-tail” ratio. This is the ratio of fighting soldiers to support soldiers, which in the modern U.S. army is roughly 1:11. Gabriel and Metz confine their survey to those modern soldiers who actually fight, but in ancient armies every soldier fought (their teeth-to-tail ratio was almost 1:1). A modern soldier’s comparative chance of dying in battle should, therefore, really be divided by eleven. That gives us about a 2.62 percent chance of death in battle for American soldiers in Iraq—a far cry from the terrifying odds faced by ancient Eurasian warriors. It is even further from those faced by Tahitian soldiers fighting at sea in canoes: the Dutch explorer Moerenhout wrote that no Tahitian naval battle ever ended with less than 75 percent casualties, even on the victors’ side.35
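
The teeth-to-tail adjustment is straightforward arithmetic. The sketch below follows the author’s own step of dividing by eleven, using only the figures quoted in the text.

    # Sketch of the Gabriel-and-Metz-style comparison, using the figures quoted above.
    ancient_loser_rate = 37.7     # percent: average death rate of defeated ancient armies
    iraq_combat_rate = 28.77      # percent: US combat troops in Iraq, as quoted
    teeth_to_tail = 11            # roughly one fighting soldier per eleven support soldiers

    # Spread the combat-troop rate across the whole force, as the author does:
    whole_army_rate = iraq_combat_rate / teeth_to_tail   # about 2.62 percent

    print(f"whole-army rate: {whole_army_rate:.2f}% versus ancient {ancient_loser_rate}%")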

Ancient battle was also far more terrifying than its modern counterpart because of its immediacy. Ancient soldiers slashed, stabbed, and clubbed each other from mere inches away, unlike modern combatants, who may be miles distant. The presence of so many thousands of men—all screaming, slaughtering, bleeding, and dying—undoubtedly made ancient battlefields a true vision of hell. At Cannae, for example, the surrounded Romans endured four hours of bloodbath horror as Hannibal’s Carthaginians crowded them in so tight they couldn’t lift their weapons, then butchered them with sword and spear. The Roman historian Livy records that after the battle several dead Romans were found to have dug holes and buried their own faces in the dirt—to escape their terror through self-suffocation. There is also a special horror to the maiming injuries ancient soldiers faced, particularly from swords. Livy reports that the Greeks were appalled to see the slashed flesh, hacked limbs, and split skulls suffered by their dead countrymen after defeat at the battle of Cynoscephalae—the first time they had fought brutal Roman swordsmen. Nor were the horrifying projectile injuries of modern war entirely absent. The Jewish historian Josephus records some of the gruesome wounds dealt out by Roman ballista (or catapults): one Jewish soldier decapitated by a stray ballista stone, and a pregnant woman whose fetus was smashed from her womb and flung one hundred yards by another. Even where such technology was absent, however, primitive warriors might still face truly horrific injuries. Champion fighters among those savage Tahitians, for instance, commonly clubbed their opponents completely flat, then cut a neck hole in their flattened corpses and wore them as grotesque ponchos!

Clearly, a very high level of bravery and aggression was needed to face these brutal trials. Would modern soldiers have been up to the job? Again, sadly, the answer is probably not. The famed U.S. Army combat historian Samuel “Slam” Marshall wrote in his World War II classic Men Against Fire that on average just 15 percent of American troops actually shot their weapons at the enemy, even when they themselves were under fire. Marshall put this down to the inherent nonaggressiveness of the American soldier and his consequent reluctance to take enemy life. An American officer who fought in the first Gulf War, Captain John Eisenhauer, confirmed this, stating that all his battalion’s soldiers had failed to fire on attacking Iraqis—except for those artillery gunners who could do so through a long-distance thermal sight. Studies subsequent to Marshall’s support this: the rate of fire of modern soldiers goes up in direct proportion to their distance from the enemy.36 So pronounced is the problem that the U.S. Army has adopted special training methods—such as pop-up firing ranges that create a game-like atmosphere—to raise the fire rate. Amusingly, today’s U.S. Army proudly trumpets a 90 percent success rate from these measures—meaning 90 percent of its soldiers now actually fire the weapons it is their duty to wield.

One reason modern soldiers get away with such slackness is the speed of modern projectiles. Who, after all, can tell who fired any particular bullet among hundreds traveling at 933 yards per second? Tribal bowmen, on the other hand, had no such luxury: under the intimate gaze of their fighting brothers, they had no choice but to fight—and to win or die.

Although most, it seems, were so naturally aggressive they hardly needed the encouragement.

There is still one place in the world where the extraordinary belligerence of prehistoric tribesmen can be directly observed. In 1981 the crew of a Panamanian freighter, the Primrose, experienced it first-hand. When the Primrose ran aground one night in treacherous seas on a reef in the Bay of Bengal, her captain was relieved to see, come morning, that the ship had fetched up a mere few hundred yards from Sentinel Island, a lonely outpost of the isolated Andaman Islands. He shouldn’t have been. Just two days later he was forced to send an urgent message to the Indian Navy requesting an immediate airdrop of firearms for protection against a horde of tribesmen who had spent the day showering his freighter with arrows, and who were now, ominously, building canoes to bring their murderous fire even closer. The attacks eventually grew so severe that the Primrose’s crew had to be rescued by helicopter from the besieged freighter’s deck. Twenty-five years later, in 2006, two Indian fishermen were not so lucky; they were murdered by arrow fire under the horrified gaze of their colleagues after drifting too close to Sentinel Island (their bodies have never been recovered).

All these unfortunate seafarers had, unbeknown to them, stumbled upon an island of super-aggressive tribesmen whose reputation for violence stretches back to Marco Polo and beyond. One Arabic text from 851 CE said of the Andamanese that they:

…eat men alive. They are black with woolly hair, and in their eyes and countenance there is something quite frightful…they go naked and have no boats. If they had they would devour all who passed near them. Sometimes [with] ships that are wind-bound…in such cases the crew sometimes fall into the hands of the [natives], and most of them are massacred.37

British colonialists in the nineteenth century had their own confirmation of this, with attacks like those on the Proserpine, relentlessly assaulted by the Andamanese as it searched for the shipwrecked Emily, which had itself been attacked earlier. Holding the natives off with cannon shot, the search party found just one set of remains: “the corpse of the second officer [who] had been murdered and his corpse badly mangled, the top of his skull removed with a blunt saw-like instrument.”38 So aggressive are the Andamanese, even today, that they remain, effectively, the last uncontacted people on earth.

Incredibly, hostile societies such as the Andamanese even contain certain warriors so aggressive that their own side avoids them, too. In the Andaman Islands these hotheads, known as tarendseks, are widely loathed due to their habit of running amok and killing their own (such as the tarendsek recorded as murdering two of his tribe’s children simply because they disturbed his sleep). Among New Guinean Baruya tribesmen, similarly, berserk warriors known as aoulattas brave the enemy’s arrows and advance alone to smash those enemies’ skulls with their keuleukas (ancestral stone clubs), but likewise terrorize their own community with random murders.39 The prime examples of such super-aggressive warriors were, of course, the Viking berserkers. These ferocious fighters terrorized friend and foe alike in the Scandinavian world between the ninth and eleventh centuries CE. Disdaining armor, berserkers plunged into battle wearing simple wolf or bear skins (possibly even nothing at all), smiting and tearing enemies with unsurpassed fury. Contemporaries witnessed them howling like beasts and frenziedly biting their shields (one berserker apparently tore out an enemy’s jugular with his teeth). The key to the berserkers’ aggression was the berserkergang, the trancelike fury they entered, which apparently made them impervious to death and injury. Though used to devastating effect as shock troops by several early Norwegian kings, berserkers also exacted a heavy toll on their own people. Under King Eirik Bloodaxe in the eleventh century, for example, berserkers were outlawed due to their habit of challenging rich men to holmganga, the death duel, then slaying them and confiscating their property (and wives: berserkers were also noted rapists). Despite attempts to attribute the berserkergang to the drinking of psychoactive substances such as wine spiced with the bog myrtle plant, it appears the wild men’s frothing fury really was a simple case of hyper-aggressiveness, possibly genetic in origin.


Beating the chest

The ferocious Yanomami Indians of the Brazilian and Venezuelan Amazon don’t have to go far for a fight—just as far as the nearest Yanomami man. So often do Yanomami men fight that they have developed a five-stage system of aggressive brawling, starting with the chest-pounding duel. In this ritual, one combatant stands stock-still while his opponent adjusts the man’s arms and chest for maximum vulnerability, takes a run-up, then smashes a fist into his left pectoral muscle, right over the heart. The victim must withstand four or five of these blows, which raise painful and bloody bruises, to earn the right to retaliate—the winner being whoever doesn’t collapse or step away. If chest fighting doesn’t solve the dispute, combatants escalate things to a side-slapping duel, where open hands are swung at maximum velocity into the vulnerable spot between the opponent’s ribs and pelvis—a blow that frequently results in unconsciousness. The next step is a club fight, where fighters take turns cracking each other over the skull with heavy wooden staves; most Yanomami men tonsure the top of their heads to display their proud scars from this ordeal. If that doesn’t help, they might then step it up a notch to an axe fight using the blunt edge of their stone axes—though this is so often lethal they might as well just use the sharp side. If the opponents still aren’t sick of fighting (which is often the case), the combat will then turn really serious: bows will come out, along with six-foot arrows tipped with lethal curare poison. Ouch!


Such fighting spirit is clearly lacking in modern soldiers, but so are other martial qualities. A quick scan of historical literature shows that modern grunts are seriously short on strength and endurance, too. The U.S. Army, for example, proudly highlights its physical fitness standards—infantry recruits are expected to be able to run 12 miles in 4 hours by the end of basic training. Yet this is couch-potato stuff. Members of the Yuan Dynasty’s Imperial Guard in ancient China had to run 56 miles in 4 hours for their fitness test.40 Alexander the Great’s Macedonians, similarly, ran between 36 and 52 miles a day, for 11 days straight, in their pursuit of the defeated Persian king, Darius. The most leisurely pace of Roman armies, likewise, was 18 miles per day, but they often covered far more. In 207 BCE, for instance, the Consul Claudius Nero marched a Roman legion 310 miles in 6 days, at a rate of over 50 miles per day, to meet and defeat Hannibal’s brother, Hasdrubal.41 This represents a dogtrot of 6 miles per hour, or about half the speed of modern Olympic marathon gold medallists—day after day after day. What’s more, marathon runners wear only light clothes, but Roman legionaries marched in full armor and carried extensive baggage. These days the U.S. Army, and even the Marines, limit every soldier’s load to a third of his bodyweight, which at the average American recruit’s 153 pounds works out to an approximate 50-pound load. Based on this rule, Nero’s soldiers, who weighed an average 145 pounds, should have carried just over 48 pounds for their ultra marathon; they actually carried up to 100 (two-thirds of their bodyweight).42 Other warriors, unburdened by such staggering loads, ranged even farther, and faster. Shaka Zulu’s impis, for example, commonly ran over 50 miles a day when on campaign. One war leader of the East African Ruga-Ruga, Mirambo (“Heaps of Corpses”), was similarly once recorded as running 16 miles to attack a village, conquering it, and then running 30 miles to assault another.43
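
The marching and load figures can be checked with a few lines of arithmetic; the sketch below simply applies the one-third-of-bodyweight rule and the distances quoted in the text.

    # Checking the forced-march pace and the one-third-of-bodyweight load rule.
    march_miles, march_days = 310, 6
    daily_miles = march_miles / march_days       # about 52 miles per day for Nero's legion

    modern_recruit_weight = 153   # pounds, average American recruit
    roman_soldier_weight = 145    # pounds, average of Nero's legionaries
    roman_actual_load = 100       # pounds, upper estimate of what they carried

    modern_load = modern_recruit_weight / 3      # about 51 lb -- the "approximate 50-pound load"
    roman_rule_load = roman_soldier_weight / 3   # about 48 lb under the same rule
    roman_load_fraction = roman_actual_load / roman_soldier_weight   # about 0.69, i.e. two-thirds

    print(f"{daily_miles:.0f} miles/day; loads: {modern_load:.0f} lb vs {roman_rule_load:.0f} lb rule, "
          f"actual {roman_load_fraction:.0%} of bodyweight")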


Adidas army

Former Australian Army physical trainer Captain David Sanders has no time for grizzled old soldiers claiming things were tougher in their day. “Infantry training in the Aussie Army today is just like it always was—tough as nails,” he insists. “Route marches are just as hard, drills are just as tough, and sergeant-majors are the same as they always were—total bastards.” If anything, Sanders reckons, soldiers in the new millennium are getting better, because now (as opposed to the long post-Vietnam lull) they actually fight. There is one area in which he will admit modern soldiers fall short, however. “From the eighties onward we started seeing a lot of injuries like shin splints and stress fractures in the lower leg bones,” he muses. The reason? “In the old days kids used to walk around barefoot or in hard-soled leather shoes. It made their bones hard. From the eighties on they started wearing runners, so they all grew up with bones like chicken drumsticks.”


There was a reason ancient and primitive soldiers were so much fitter than slothful modern grunts: training. To make his soldiers fit, Shaka took them on grueling patrols ranging hundreds of miles—barefoot. When some complained of injury from “Devil’s Thorns”—long spines sharp enough to puncture tires—Shaka made them stamp for hours on a parade ground scattered with the thorns, killing any man who failed to dance. Wu dynasty soldiers in sixth-century BCE China trained with eighty-mile runs, without a break, wearing full armor and weaponry. Roman legionary recruits, similarly, trained by marching twenty-four miles in five hours, carrying their full armor and pack weight of up to one hundred pounds. They also sometimes constructed and tore down a complete legionary camp—a procedure which took three hours—three times in one day. The only modern soldiers who even come close to this are the elite and Special Forces units. U.S. Rangers, for example, undertake sixteen-mile training runs in four-and-a-half hours, wearing a forty-pound pack. Soldiers graduating from the U.S. Special Operations Command’s assessment program, the SFAS, likewise do training runs of eighteen miles in four-and-three-quarter hours, wearing a pack weighing fifty pounds. This is commendable, of course, but it just damns our ordinary soldiers even further—physically at least, every peasant grunt in the Roman army was a match for modern elite Special Forces.

Nor do we have a monopoly on actual Special Forces. A whole entertainment industry has grown up praising the virtues of modern special-operations soldiers: their physical toughness, lethality, bravery, and ability to operate in enemy environments. Yet, one thousand years ago, for example, the Middle East was stalked by a sect of shadow warriors who made Delta Force look like boy scouts at a jamboree. These were the fida’is (“those who lay down their lives”): the secret agents of the heretical Assassin sect of the Shia Muslims. From the eleventh to the thirteenth centuries CE these fanatical warriors terrorized the orthodox Muslim world by secretly infiltrating the retinues of high officials and suddenly murdering them in brutal and highly public fashion. Among their numerous victims were a Prime Minister of Persia and the Crusader King of Jerusalem, Conrad (whom they assassinated while disguised as Christian monks). Even the mighty Muslim warrior Saladin suffered so many attempts on his life from fida’is that he began sleeping in a specially constructed wooden tower. A report from a fourteenth-century Crusader priest, Bocardus, records the general awe in which the infiltration skills of the fida’is were held:

…the Assassins, who are to be cursed and fled…are thirsty for human blood…by imitating the gestures, garments, languages, customs, and acts of various nations…thus, hidden in sheep’s clothing, they suffer death as soon as they are recognized…I cannot show how to recognize them by their customs or any other signs, for in these things they are unknown…44

So practiced in the dark arts of secret war were the fida’is that orthodox Muslim rulers never quite knew when their most trusted retainers might turn out to be Assassins. Sharaf al-Mulk, the Prime Minister of the Khwarezmian Empire (modern Iran), for example, was appalled to discover in 1227 CE that no fewer than five fida’is had secretly infiltrated both his stable and his office of heralds. So terrified was he that others might still be lurking undiscovered that he paid the Assassins’ religious leader 50,000 dinars in blood money when the Sultan forced him to burn those five alive.

Even the fida’is Assassins, however, couldn’t match the secret skills of the archetypal ancient special-operations soldiers: the ninja. These silent warriors were the scourge of Japan’s aristocracy between the fifteenth and seventeenth centuries, when their stealthy way of war frequently defeated the aristocrats’ samurai protectors. Ninja, or shinobi as they were more commonly called in medieval Japan, came from secret ninjutsu training schools in the wild Iga and Koga provinces. These schools had their origin in the wanderings of a defeated twelfth-century samurai, Daisuke, who met up with the Chinese warrior–monk Kain Doshi, himself a refugee from the collapsing Tang Dynasty in China, in the Iga Mountains. The military doctrine that emerged from Daisuke’s training at Kain Doshi’s feet emphasized deception, suppleness, speed, and surprise. Shinobi assassins (for assassins they usually were) used similar tactics to the Shia fida’is, disguising themselves as wandering Buddhist monks, washerwomen, or even itinerant puppeteers to get close to their targets. Their weaponry far outstripped the favored dagger of the fida’is, however. In addition to their fearsome unarmed combat techniques (which eventually gave rise to karate), and their shortened samurai swords, shinobi killers might carry powdered sand and pepper to blind their enemies. Their mouths might spit needles. Their hands and feet might feature tekken, banded metal claws that served equally well to climb a castle wall, catch a sword blow, or stab an opponent. Tucked into their belt might be the fearsome shinobi gama, a long chain flung at victims to immobilize them so they could be hacked to death with the razor-sharp sickle at its end. Shinobi also employed a bewildering array of climbing and mobility devices, including hooked ropes and collapsible climbing poles, and wooden flotation shoes that allowed them to cross moats. So murderous did shinobi activity become that some aristocrats found their only defense was to construct elaborate “ninja-proof” houses. One such was the famous Nijō Castle in Kyoto, which featured a “nightingale” floor whose specially sprung floorboards “sang” whenever a would-be assassin visited in the dead of night. Yet even these drastic measures failed to stop some shinobi killers. A ninja called Ishikawa Goemon, for example, is said to have penetrated the castle of the famous aristocrat Nobunaga and dripped poison down a thread into his mouth as he slept. Nobunaga, however, survived this attempt and sent his own shinobi to murder his rival, Kenshin—which the shinobi reportedly did by concealing himself in Kenshin’s lavatory sewage pit for several days until the chance came to kill the nobleman with a spear thrust to the anus.45

Without taking anything away from our many brave and dedicated modern special-ops soldiers, such feats make our bumbling search for Osama bin Laden look like a page straight out of Where’s Waldo?

Mention of bin Laden seems appropriate here, for his name is often cited by those who claim that another aggressive activity of Homo masculinus modernus, terrorism, has reached new, unparalleled heights of destruction. One post-9/11 academic work on terror, for example, pointed out that the 2,974 casualties in the September 11, 2001, attacks constituted an almost forty-fold increase in casualties over the 76 terrorist bombings recorded between 1950 and 2000.46 Yet the existence of the Shia Assassins—the historical terrorists par excellence—shows that terrorism was not unknown in the ancient world. Sun Tzu, the famous sixth-century BCE Chinese military strategist, even coined a proverb summarizing the primary aim of political terror: “To kill one and frighten ten thousand.” While there are obvious difficulties in comparing terrorism across vastly different times and cultures (not least because of different killing technologies), al Qaeda and bin Laden have themselves given us two standards by which to measure. Al Qaeda jihadists boast that they will induce the surrender of the West through morale-shattering, spectacular, mass-casualty attacks. We are thus entitled to ask how well they have succeeded on two fronts: the number of casualties they have inflicted, and their achievement of those strategic goals. More importantly, for this book at least, we can also ask how their efforts on both fronts compare to those of ancient terrorists.

With which historical terrorists should al Qaeda be compared? Perhaps the best fit would be the medieval Mongols. This north-Asian tribe of ferocious horsemen was, like al Qaeda, an ethnically based group that aimed to forge a universal empire. The Mongols also, again like al Qaeda, employed explicit terrorism in that quest. The main difference between the two is how phenomenally successful, by comparison, the Mongols were. (Successful, in this case, does not necessarily mean admirable.) Under their ferocious leader, Genghis Khan, the Mongol tribe, which numbered at most 850,000 people in 1260 CE, went on to control a Eurasian empire of over 100 million souls. In fact, the latter number would have been even higher if the Mongols hadn’t killed so many, often slaughtering the entire population of cities they conquered. At Merv in Turkmenistan, for example (the largest city in the world at that time), Genghis Khan’s son Tolui killed every single inhabitant except a handful of artisans, whom he enslaved. Exact figures are unclear (estimates range from 400,000 to 1.3 million), but this was clearly an incredible feat of extermination, considering it all had to be done by hand and took five days (Tolui apparently assigned 300 to 400 victims to each Mongol warrior for decapitation). The same fate befell Iran’s Nishapur (whose citizens had unwisely killed Genghis’s son-in-law, Tokuchar), where separate pyramids of men’s, women’s, and children’s skulls were piled up outside the city walls.47 (This atrocity was, unbelievably, exceeded by the later Turkish conqueror Timur, who constructed a tower of living victims, each cemented in place, after his conquest of the city of Sebsewar.) The Mongols were so thorough that they often returned to such cities days later to kill any refugees who had managed to avoid the first massacre. It was by means such as this that the Mongols killed somewhere between 30 and 60 million people over the 90-year period of their major conquests. Al Qaeda and its affiliates, by comparison, succeeded in killing 14,602 people worldwide in 2005 (the rate has since dropped).48 Multiplied by 90 years, even this high figure would result in 1,314,180 casualties, considerably less than the Mongols.
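
The comparison rests on a single extrapolation, so it is worth laying the arithmetic out. Here is a minimal sketch in Python using only the figures quoted above; the variable names and the ratio at the end are my own framing.

```python
# Arithmetic behind the comparison, using only the figures quoted in the text.
AL_QAEDA_DEATHS_2005 = 14_602          # worldwide deaths attributed to al Qaeda and affiliates in 2005
MONGOL_CONQUEST_YEARS = 90             # duration of the major Mongol conquests
MONGOL_DEATHS_LOW, MONGOL_DEATHS_HIGH = 30_000_000, 60_000_000

# Extrapolate al Qaeda's 2005 toll across the whole Mongol period.
al_qaeda_90yr = AL_QAEDA_DEATHS_2005 * MONGOL_CONQUEST_YEARS
print(f"al Qaeda, 2005 rate sustained for 90 years: {al_qaeda_90yr:,}")   # 1,314,180

# Even against the low Mongol estimate, the gap is stark.
print(f"Mongol toll is {MONGOL_DEATHS_LOW / al_qaeda_90yr:.0f} to "
      f"{MONGOL_DEATHS_HIGH / al_qaeda_90yr:.0f} times larger")
```

Even granting al Qaeda its worst recorded year for the full ninety-year span of the Mongol conquests, the Mongol toll still comes out somewhere between roughly twenty and forty-five times higher.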

Then there is the question of aims. Al Qaeda’s strategic aim has demonstrably failed, resulting, in fact, in a ferocious renewal of the Western will to fight—to wit, the war on terror. Mongol terrorism, by contrast, was devastatingly effective. Several ancient sources record the paralyzing effect Mongol atrocities had on future victims, frequently leading whole cities to surrender without a fight. Arab historian Ibn al-Athir, to quote one, wrote:

Stories have been related to me…which the hearer can scarcely credit, as to the terror of them [the Mongols]…so that it is said a single one of them would enter a village…wherein were many people, and would continue to slay them one after the other, none daring to stretch forth his hand…I have heard that one of them took a man captive but had not any weapon wherewith to kill him; and he said to his prisoner, “Lay your head on the ground and do not move” and he did so and the Tartar went and fetched his sword and slew him therewith.49

Without making light of the evil that modern Islamic jihadists have inflicted on the world, comparisons like these make it plain that Osama bin Laden wouldn’t have even made noyan (“captain”) in the army of Genghis Khan.

Fortunately, this means most of us in the Western world will never suffer terrorist violence. Instead, any lethal violence we do suffer will probably be individual, criminally motivated homicide. This type of violence is universally a male domain: U.S. Bureau of Justice Statistics figures for 2005 show that men committed 88 percent of murders in the United States, and were the victims in 74.9 percent of them. Despite the marked decline in American homicide rates over the past fifteen years (total murders fell from 24,526 in 1993 to 16,692 in 2005), we often assume aggressive male homicide is a disease peculiar to modern life. Yet how accurate is this? Consider gang violence, for example. In 1996, the Los Angeles County Gang Information Bureau estimated the county’s gang population at 150,000 members; that same year the number of gang members killed in intergang violence was just 803, making for an annual rate of 0.53 percent. This is considerably below the average death rate from violence recorded by archeologist Lawrence Keeley for prehistoric societies. Even the baddest-assed Crip from the meanest street of South-Central L.A., it seems, stands less chance of being capped than an ancient or tribal male did of being shanked with a flint, bronze, or sharpened bone blade.
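
For readers who want to check the gang-violence figure, here is a minimal sketch in Python built from the two numbers quoted above. Keeley's prehistoric benchmark is not given numerically in this paragraph, so it appears only as a comment rather than a hard-coded value.

```python
# Annual death rate from intergang violence among L.A. County gang members, 1996,
# computed from the two figures quoted in the text.
GANG_MEMBERS_1996 = 150_000      # L.A. County Gang Information Bureau estimate
INTERGANG_DEATHS_1996 = 803      # gang members killed in intergang violence that year

annual_rate = INTERGANG_DEATHS_1996 / GANG_MEMBERS_1996
print(f"Annual death rate: {annual_rate:.2%}")   # prints 0.54%; the text rounds to 0.53 percent

# Keeley's average death rate from violence for prehistoric societies is not quoted
# numerically here, but the text states it is considerably higher than this figure.
```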

Clearly, very few modern males really have the fight stuff. So why then do we aspire to it? Why have young American males taken up training for MMA, or ultimate fighting, in such numbers that the U.S. Army now runs a championship tournament out of Fort Benning simply to cash in and swing recruits its way? The simple answer is: it’s written in our genes. However feeble we are now, modern-male bodies still bear the physical stamp of the fighters we once were. Consider, for example, our sexual dimorphism: the degree to which the bodies of male Homo sapiens differ from those of female Homo sapiens. Men are, on average, 9 percent taller than women worldwide. They also weigh 20 percent more, with much of the difference attributable to that 50 percent advantage in upper-body muscle that men hold over women. As in most mammal species that show a sex-based difference in size, there is a simple reason for this: fighting.50 The evidence of our bodies, therefore, is that male-on-male violence has been a part of our lineage for a very long time. Several other features of the male physique also seem to be adaptations to fighting. True, the fact that men have a 30 percent greater aerobic capacity than women could just as easily be an adaptation to hunting as to mano-a-mano combat. It is hard, though, to see the fact that men’s blood carries higher levels of coagulation factors such as thrombin and vitamin K (which promote wound healing and reduce pain sensitivity) as anything other than an adaptation to those times when two ancient male hominins decided there was nothing for it but to take it outside and let their fists do the talking.

Then there is the emotional machinery we have inherited. Many a woman has shaken her head at the tendency of her man to react with extreme aggression and violence to seemingly trivial insults. On the surface, of course, she’s right: a study of male-on-male homicide in Victoria, Australia, found that a large proportion were confrontational killings arising from insignificant slights to the aggressor’s “honor,” such as jostling or staring.51 It’s easy to write this off as male stupidity, but the truth seems to be that this short fuse is actually hardwired into the male brain—and for good reason. More than one anthropologist has pointed out that it is only the threat of massively disproportionate violence in response to minor infringements that guarantees social order among men in tribal societies, which have no central government to dispense justice. Only through a balance of terror can every man’s drive to deceive and take advantage of his fellows be deterred. Thus, as in any system of mutually assured destruction, a tribal male who tries to opt out by refusing to fight to the death at the drop of a headdress will soon find himself losing out to his more aggressive brothers.

But what, exactly, can he lose? What, in other words, is he fighting for?

The short answer, as any evolutionary psychologist (or bartender, for that matter) can tell you, is: women. The struggle to transmit genes to the next generation can be, for tribal males, a brutal, winner-takes-all contest well worth fighting and dying for. (Females, in contrast, suffer no such pressure—almost every fertile tribal woman is virtually guaranteed to have children, providing she stays alive.) True, some masculine über-violence is clearly aimed simply at personal, rather than reproductive, survival. A survey of homicides among ancient Scottish and Icelandic Vikings, for example, shows that victims’ families were almost seven times less likely to attempt a revenge killing if the murderer was known to be a dangerous berserker. But this itself could be looked at as a reproductive strategy—another study found berserkers also fathered significantly more children than other Viking warriors.

Again, the secret sexual drivers of male-on-male violence can probably be more clearly studied in our closest relatives, the chimps. One survey of a chimp troop in Tanzania, for instance, revealed that the three top males only ever fought when their surrounding females were in season. The reproductive consequences of these fights were also striking: the sexual share of the alpha male, Kasonta, plummeted from 85.66 percent of matings to just 12.99 percent when he was overthrown by the beta male, Sobonga. (Interestingly, just as in the case of de Waal’s chimp, Yeroen, the next-best place to be, after alpha, was the gamma, or third, position. Kamenafu, the weaker gamma chimp who played Kasonta and Sobonga off against one another, was at one stage able to sneak in 51.4 percent of the sexual encounters as the price of his support.) Even the violence of male chimps against other troops of males, which is far more lethal than their intragroup fighting, seems to have a sexual function. This is somewhat obscured by the fact that male chimps on war patrol also attack foreign females, unless they are in season, but a study by the Jane Goodall Institute’s Center for Primate Studies found that aggressive troops of males, by extending their ranges and increasing food availability, boosted both the reproductive rates of their resident females and their own number of sexual contacts with them.

Are the reproductive consequences of male violence as positive in Homo sapiens? The terrifying Mongols, once more, give us devastating proof that they are. It was their Khan, Genghis, after all, who said, “Man’s greatest joy is to slay his enemy, plunder his riches, ride his steeds, see the tears of his loved ones, and embrace his women.” The evidence, too, is that Genghis was not backward in embracing his embracing opportunities: a 2003 genetic research project on men living in the lands of the former Mongol empire found 8 percent of them carry identical Y-chromosomes—since Y-chromosomes are passed from father to son essentially unchanged, this means sixteen million Eurasian men are direct descendants of Genghis and his close male relatives! On a smaller scale, many a Greek and Roman soldier emulated these feats in the inevitable rape orgies that followed the conquest of an enemy city (Nestor, in the Iliad, urges the Greek troops, “let there be no scramble to get home, then, till every man of you has slept with a Trojan wife”). The reproductive pay-off of male violence is also explicit among the headhunting cultures of pre-colonial Borneo, in which a man was not allowed to marry until he had taken a head (and might then present it as his bridal gift).
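
The Y-chromosome statistic allows a quick internal consistency check: if 8 percent of the region's men corresponds to sixteen million carriers, the two figures imply a particular male population for the former empire's lands. A minimal sketch in Python, using only those two quoted numbers (the back-calculation itself is my own):

```python
# Consistency check on the Y-chromosome figures quoted in the text.
SHARE_OF_REGIONAL_MEN = 0.08      # 8 percent of men in the lands of the former Mongol empire
CARRIERS = 16_000_000             # sixteen million male descendants

implied_male_population = CARRIERS / SHARE_OF_REGIONAL_MEN
print(f"Implied male population of the region: {implied_male_population:,.0f}")
# Implied male population of the region: 200,000,000
```

An implied male population of around 200 million across that territory is at least in the right range, so the two figures hang together.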

A frequent criticism of the sexually driven theory of male violence has, however, been that no evidence exists of specific genes for aggression. In fact, though, this is no longer true. In the early 1990s scientists discovered that male mice carrying mutations in a gene sequence called MAOA displayed excessive levels of circulating neurotransmitters, such as serotonin, in their brains, resulting in extreme aggression. The same mutation was soon also identified in humans—first in the males of one particular Dutch family who all displayed similarly high levels of impulsive aggression.52 A later follow-up study on antisocial children found that the gene variant was, in fact, reasonably widespread, and could be used to predict whether abused children would go on to develop aggressive, antisocial personality disorders.53 Interestingly, a 2000 research project then found high levels of this genetic complex in macaque monkeys, too: the most widespread genus of primates worldwide after our own.54 It seems a remarkable coincidence, as the authors pointed out, that the two primates with the highest levels of this aggressive “warrior gene” have been the most successful of all ape and monkey colonizers.

This doesn’t, of course, prove that all male aggression is controlled by this particular gene. It does, however, show a mechanism by which aggression can be, and probably is, regulated by natural selection.

That being the case, what is the evolutionary significance of the decline in the fighting ability of modern males, as documented here? Has there been a genetic change to the fighting heart of Homo masculinus modernus? So far, I think, probably not. That modern men no longer go toe-to-toe in revenge-driven death matches owes more to our having surrendered the right to take bloody revenge to the state, which now punishes our enemies for us (or indeed, us for them, depending on how far we give in to our instinctual aggression). This cultural change has, however, upended the selective landscape. As in all matters BRAWN and BRAVADO, our BATTLE instincts are more likely to eliminate us from the gene pool these days than to have us sweep its reproductive stakes. Does that mean we have doomed Homo masculinus modernus to an ever-feebler future? Will hotheaded young duelists, instead of firing at ten paces, start bitch-slapping one another over matters of mortal honor—retiring to the nearest hospital at the first sign of a broken nail? A long-term experiment in breeding silver foxes at Novosibirsk in Siberia seems to indicate yes: researchers there were able to breed heritable aggression completely out of their foxes within forty years.55

Yet several things may save modern males from this fate. Our increasing ability to tailor drug treatments to specific genetic conditions, for example, will probably allow those human males who bear the MAOA mutation to regulate their brain serotonin levels, thereby saving them (and their genes) from the potentially fatal consequences of their impulsive aggression. Then there is the awkward yet incontrovertible fact—so distressing to those fathers whose teenage daughters fall head-over-heels for sociopathic, wife-beater-in-waiting young punks—that women are somewhat sexually attracted to aggression in men. A 1987 study of female university students, for example, found that almost all rated dominant males (who employed aggression as a strategy in achieving their dominance) as significantly more attractive than non-dominant men (even though the women also expressed strong distaste for the aggression itself).56 But we don’t need to just take the word of these obscure eggheads and their love-struck human lab rats for it. No less an authority than Tony Soprano, the TV mafioso, confirmed it by reproaching his wife, Carmela, for her hypocrisy in claiming she didn’t care that he had just lost a fight to underling Bobby, reminding her of their young days:

You were there, in the crowd in the parking lot that night at Pizza World when I took Dominic Tedesco. I didn’t even know your name, but I remember our eyes met. And you were blown away.57

With female mate preferences like these in operation, it is unlikely that male aggression will disappear from the gene pool any time soon. Then there are the surprising ways in which pathological aggression can be turned to both the individual’s and society’s advantage in the modern world. Several studies have found, perhaps unsurprisingly, that criminal bombers exhibit physiological characteristics in common with psychopaths, among them pathologically low heart rates (indicating very high thresholds for response to stress). More surprising is that so, too, do the most successful bomb-disposal experts.58 No studies of the relative reproductive rates of criminal bombers and bomb-disposal experts have, as far as I know, been done, but this does at least add one more piece of evidence that male aggression in the modern world, and the qualities associated with it (in this case, an extraordinarily, even pathologically, steady hand), need not be a one-way ticket to genetic oblivion.

Another way, of course, that we modern men can turn what aggression we still have into chick-pulling, gene-propagating success is through sports. Many sociologists, in fact, claim the reason we males have become so docile is that all our violence now goes into that form of ritualized combat. Some even insist the rise of sport explains the civilizing process of the past two hundred years. It’s an intriguing theory, but considering the degree of violence and aggression we’ve witnessed in our forebears, it would seem to require modern male sports to be better, faster, stronger, and harder than sports in history ever were. Is there any evidence that they are?

Hmm. I’ve got a bad feeling about this.