I was seventeen years old when I discovered I was the great⁴⁸-grandson of Charlemagne—King of the Franks and Holy Roman Emperor. Where I grew up, it’s not unusual to find out such things. The culture of Charleston, South Carolina, is built around the pride associated with a handful of family histories. Like most of my friends downtown, I spent my youth, almost without knowing it, in a state of perpetual genealogical questing. Might I be the descendant of a signer of the Declaration? Robert E. Lee’s messenger? I bugged my mom and aunts and uncles. Who am I really? Might my childhood friends turn out to be third cousins? In Charleston, that last one’s almost too easy.
My mother grew exhausted with my pestering and sent me to see Mary Pringle, an ancient cousin and amateur genealogist. Primed with curiosity, I arrived at cousin Mary’s elegant antebellum home on a hot summer day. After some iced tea and pleasantries, I was presented with a large, unwieldy sheet of paper, bearing a set of concentric circles. In the center, Mary wrote my name, and in the immediate outer circle, divided in half, she wrote the names of my parents. In the next circle, divided now into fourths, she wrote the names of my four grandparents. We filled it out as far as we could in every direction, and in that area where her family and mine converged—her life’s work—a seemingly unbounded wedge flew backward to Scotland and England, until my ancestors were hobnobbing with William Shakespeare and Mary Queen of Scots. “This line,” Mary said, pointing to one of the ancient British earls we could claim, “leads in a direct line all the way to Charlemagne.”
This revelation was too much past to absorb and too much pride to possess. I wanted to ask her what the Holy Roman Emperor had left me in his will. But Mary’s tone was solemn, nearly religious:
“Understand that you are the direct descendant of King Charlemagne,” she murmured. The room felt still, as the rest of the universe slowly wheeled about on its gyre—around me, just like on the paper.
I left Mary Pringle’s house feeling pretty, well, rooted. It’s an important experience for most people—knowing where they come from. And being heir to Charlemagne would serve me just fine on the young gentleman’s party circuit. Over the next few years, I became as cunning at hefting this lumbering chunk of self-esteem into passing conversation as a Harvard grad slyly alluding to attending “school in Cambridge.”
Roots are important to us—us being all Americans—because they are the source of so much of our national anxiety of not quite belonging. Has any passenger manifest been more fretted over than the Mayflower’s? The only use of the Internet by Americans that’s competitive with porn, according to several studies, is genealogy. The most significant television miniseries, Roots, spawned a wave of pride among African-Americans (and arguably even that hyphenated name) and is partly responsible for the ongoing effort to drain the word “white” of its racist intimations by recasting it as “Irish-American,” “Scottish-American,” “Italian-American,” and the like. For everyone—including Native Americans who itchily remind the rest of the nation that they might also be called First Americans—there is a deep anxiety about rootedness and its claims. When Bill Frist was elevated to majority leader of the Senate in 2003, he had just self-published a book. Its title cries out as much with this anxiety as it does with pride: Good People Beget Good People: A Genealogy of the Frist Family.
The truth is, this anxiety can never really be quelled. About three years after I had tea with Mary Pringle, I was in a college calculus class when the teacher made a point about factoring large numbers. He decided to dramatize it by giving an example from the real world, explaining how redundancy affected genealogy in a process called “pedigree collapse.” He noted that if you run your line back to A.D. 800, the number of direct ancestors you would have, on average, is 562,949,953,421,312. That’s half a quadrillion people, which is more than five thousand times the total number of humans (106 billion) who have ever lived.
How, he asked, could this be? Well, when one goes back in time, the number of ancestors expands geometrically: 2 parents, 4 grandparents, 8 great-grandparents. But soon enough, one’s ancestors assume duplicate places on the family tree. Otherwise, the law of geometric progression creates all kinds of crowding problems. The number of ancestors one has by A.D. 1300 is just over 268 million people, or roughly the total population of humans on the planet at that time. Beyond that year, of course, the whole thing starts to collapse inward, and then it rapidly implodes through super-redundancy into the smaller populations that existed then.
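The doubling is easy to check for yourself. Here is a minimal sketch of that arithmetic, assuming a generation length of roughly twenty-five years; the ancestor counts are the ones the teacher cited, while the generation counts (about 49 back to A.D. 800, about 28 back to A.D. 1300) are my own back-of-the-envelope assumptions.

```python
# Sketch of the "pedigree collapse" setup: count ancestor slots assuming
# nobody ever occupies two places on the family tree (the fiction that
# eventually breaks down).

def naive_ancestor_count(generations_back: int) -> int:
    """Ancestor slots double every generation: 2 parents, 4 grandparents, ..."""
    return 2 ** generations_back

# Roughly 49 generations separate someone alive today from A.D. 800:
print(f"{naive_ancestor_count(49):,}")  # 562,949,953,421,312 -- about half a quadrillion

# Roughly 28 generations back lands near A.D. 1300:
print(f"{naive_ancestor_count(28):,}")  # 268,435,456 -- close to the world population then
```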
The upshot, the teacher explained, is that nearly everyone currently living anywhere on the planet can claim (and he paused for emphasis) “to be the direct descendant of Charlemagne.”
The room felt still, if not absurd, as the rest of the universe slowly creaked about me on its gyre, laughing. “The mathematical distinction,” the teacher added, “would be to not have Charlemagne as a direct ancestor.”
When we use the term “rank amateur,” the meaning we’re aiming at is this level of silliness. Rank amateurism makes delightful, if often insufferable, sense. I’m talking of the kind of insupportable rant your insane uncle unloads at Thanksgiving Day dinner—until someone pierces the bubble of absurdity.
This kind of amateurism is like so many urban legends. There is a surface logic, which is really an anxiety about some current topic (organ transplants, normalizing homosexuality, black people) that becomes a belief centered on some apocryphal anecdote about a missing kidney, gerbilling, or naming a son Nosmo from a half-obscured No Smoking sign. Then one day, someone—often cruelly—reveals that no actual account of any of these stories really exists, suggesting that you were a schmuck/homophobe/racist for believing anything that stupid and there you are. An amateur at knowing.
If “amateur” in Europe still generally means an earnest and uncredentialed aspirant—like, say, an amateur astronomer—then one of the fugitive meanings unleashed when the word jumped the pond is this one: complete dolt.
The intellectual slapstick of the rank amateur can practically be broken down into neat categories, because there are a few ongoing scandals that make it easy to chart the rambling topography of getting lost, amateur-style. Few of them are as ripe as the anthropological pursuit of the answer to any of the “first” questions—such as Who was the First Primate, Human, European, American? Amateur anthropologists tend to flock to these unknowable, ultimately meaningless mysteries.
As a result, the well-worn trails of the lost provide a kind of map of amateur waywardness. The quixotic way of the amateur is marked by a number of stations, like the road to Golgotha. Maybe the first one is that no matter how often you learn this lesson, it’s easy to fall for the next charlatan’s argument coming down the road.
Not long after swearing off the inanity of long-distance genealogy, I read in the New Yorker and Newsweek and all the top magazines that some serious amateurs had discovered that my great⁶⁴⁰-grandfather was the first man to set foot on the continent of North America some sixteen thousand years ago. Who wouldn’t want those bragging rights?
On a cool leafy hillside above a trickling Cross Creek in remote Pennsylvania, the sun crept through the trees, primordial. Nestled into the slope above, an open rock-shelter seemed just the place where any self-respecting Homo sapiens might set down his basketry and spear and light a fire.
Today, there’s a parking lot at the hill’s base and a set of sturdy stairs that lead to a wooden enclosure built by James Adovasio. He’s the Mercyhurst College archaeologist who’s been excavating this controversial site since the mid-1970s. Adovasio was guiding a rare tour for a dozen or so amateurs. A brawny variation of Martin Scorsese, he arrived in full archaeological drag: sleeveless flak jacket, boots, work pants, mystical belt buckle.
Inside the shelter today, there’s an office, electricity, good lighting, and a suspended boardwalk so that visitors and workers don’t stomp over all the evidence. Enormous squared-out holes plunge down into dense earth where tiny round markers dangle like pinned earrings in the stone. It was here that Adovasio found his controversial evidence, stone tools that carbon-date to 16,000 ± 150 years ago.
This date puts the tools’ owners here four millennia before the end of the last ice age, which is around the time the first humans were traditionally believed to have arrived. Adovasio asked us to notice a pencil-thin black line in the stone. No one could really see it. So Adovasio splashed water on it, and the line darkened into little more than a pencil swipe across the rock.
“This is a fire pit,” he declared. All of us moved closer to the rail to squint and then decided, as with so much prehistoric archaeology, that we’d just take his expert word for it. He described the scene that once occurred here. Folks sat around the fire and cooked deer and squirrel while snacking on hackberries and nuts. Maybe they battered some rocks into spear points or wove some grasses into primitive baskets. In the chilly rock-shelter, it was easy to look around and imagine this ancient gathering. Typically, the prehistoric picture show that plays on the cave wall of our minds involves cavemen pursuing mastodons with spears. Instead, here we were in their kitchen, where people sat around the fire, eating and talking. Away from the picturesque hunt. Quiet time, culture time, story time.
Now, 16,000 ± 150 years later, we were once again gathered here for story time. But Adovasio was not alone in trying to tell this story. Helping him, sort of, was the fat guy in front of me. He was just one of the crowd, like me, but he had spent much of the tour loudly explaining—allegedly to his long-suffering girlfriend, but really to the confederacy of dunces that was the rest of us—just how much he knew about this place. He wore a fanny pack the size of a car tire cinched above pastel shorts, robin’s egg blue socks, and black tennis shoes. His XXL T-shirt declared: KLINGON ASSAULT GROUP.
He had already sneeringly uttered the phrase “politically incorrect” several times to signal that he was no victim of conventional wisdom but a man of daring opinions. He had let everyone in the place know that he very well intended to ask Adovasio the tough questions. So now the time had come: “Professor Adovasio, does working here in the rock-shelter in western Pennsylvania keep you safe from resentments with Native Americans?” He made an interrogative honking noise.
“No,” Adovasio insisted, “Native Americans have an intense interest in this site.” Adovasio segued quickly into a shaggy dog story about a certain Indian gentleman who was nothing but supportive. I looked at the dozen or so of us, all white folks in our forties and fifties, and none of us seemed a bit mystified about why Native Americans might be resentful. Perhaps that was why Adovasio didn’t feel obliged to really address this issue. His work, after all, suggests that the Native Americans were late getting here and that before Asians crossed the Bering Strait to settle North America around the commonly agreed time of thirteen thousand years ago, there were other people—from somewhere else—already here. He also knew that even fresher evidence now suggests that these earliest people, and hence the true First Americans, were, in the scientific jargon, “Caucasoid.” That is, white people who looked just like the Klingon ± 200 lbs.
Our continent’s creation story about the Asian hunter-gatherer crossing the Bering Strait is only about a century old and owes its origin to a black cowboy named George McJunkin. He had escaped slavery and fled out west. He taught himself book learning and herded cattle while pondering the world about him. An amateur scientist, McJunkin was said to ride a horse fixed with a big rifle scabbard in which he holstered his telescope.
McJunkin was well read enough to know that some old bones he found in Clovis, New Mexico, in 1908, were the remains of extinct animals. Twenty-five years later, experts investigating McJunkin’s discoveries found embedded in some of the bones of these ancient bison a flat, rounded arrowhead with a bit of fluting at the base to assist fastening it to a spear. It would eventually become known as the Clovis point—the oldest spearhead ever found on the continent.
What makes the Clovis point so special is that it is found in massive numbers all across the continent and reliably enough at a level where organic material radiocarbon dates to roughly twelve thousand years ago. How massive? Take Bell County, Texas. The area north of Austin—known as the Gault site—must have been a well-known pit stop among the Clovis tribes. The place has yielded more than a million stone artifacts, more than half of them from the Clovis era.
“The whole idea of archaeology is that there must be enough redundancy in the record,” Richard Burger, a professor of anthropology at Yale, told me. Why? Because there is no other way to prove the case in archaeology, no other path to certainty.
“Archaeologists can’t do experiments,” Burger said. “Unlike lab science, we can’t mix carbon and sulfur and conclude that such and such happens. So we have something else that approaches that. We take advantage of redundancy so that the evidence repeats itself in broad patterns. With Clovis, this happens with confidence.”
In the last two decades, though, the confident tellers of Clovis Man’s story have been challenged by academic renegades devoted to identifying a new “First American.” There are at least four major sites (and some minor ones) in the Americas that claim to have found man-made objects dating tens of thousands of years before Clovis time. These theorists argue that while Clovis Man might still have crossed the Bering Strait thirteen thousand years ago, there is evidence that somebody else was already here. Given their natural caution, academics generally stop right there.
In the meantime, though, less credentialed theorists have stepped forward to identify the pre-Clovis somebody. This new theory holds that white people settled this continent first and that Native Americans are just another crowd of later visitors, like Leif Eriksson’s Vikings and Christopher Columbus’s Spaniards. Most importantly, the way this theory has leached out of cautious academia and into the pop culture as wild-eyed fact suggests that America’s neurosis about race has taken up a new and potentially toxic location—deep in the heart of our continent’s creation myth. This discovery has happened not on the front page of the newspaper but in the rumor mill at the edge of archaeology, on the covers of pop science magazines, and in the whispers of self-employed anthropologists and unmoored amateurs.
For any new story to get told, there has to be an opening, a sudden tectonic jarring of all the conventional wisdom of a discipline, a kind of tilt between an old worldview and a new one. And that’s where we are now in the subdiscipline of ancient American archaeology, poised between two views held (as always) by mossbacked conservative traditionalists on the one side and young agitated revolutionaries on the other.
The voice of skepticism and orthodoxy is best embodied by Professor C. Vance Haynes of the University of Arizona. He comes by his skepticism honestly. He once bought into a claim quite similar to Adovasio’s, back in the 1950s at a site called Tule Springs in Nevada. He, too, thought the Clovis line had been breached. He was convinced by extensive evidence of “hearths” filled with charcoal and animal bones, revealing a human encampment dating back twenty-eight thousand years. But later, when Haynes conducted precise tests of the charcoal, he realized that it was merely organic matter turning into coal. All of it was wrong. “You begin to see how easy it is to misinterpret things,” Haynes said.
Very easy. When you look at the evidence and the fights around it, you can understand why. First, arrowheads are cool things. Every little kid who has ever dug one up knows this. Arrowheads are symmetrical and beautiful objects. Their flutes, their chipped edges, their flared tails have all been studied, categorized, and given handsome names, dozens of them. The Madison point dates from A.D. 1400, the Whitlock much further back, at 400 B.C. Keep going with Haywood points at 5000 B.C. and deeper still to Cascade points at 8000 B.C. (or so) and finally to the oldest, the Clovis point. To hold in your hand a weapon that is five hundred, a thousand, five thousand years old is humbling and, just … neato.
The style of these points, as you travel back in time, becomes noticeably less arrowheady. Instead of tooled edges, it is clear that they were flintknapped—i.e., beaten at with another stone. A beveled edge might be replaced by a straight blade. The barbs near the base get more rounded. Those graceful fishtails disappear, and then you get a simple stone point with a groove banged out at the bottom, the telltale primitivity signifying Clovis time. Beyond that, it is hard to tell whether the evidence is or is not man-made. In fact, archaeology has a term for naturally occurring objects that appear to be artifacts: geofacts. One archaeologist told me that in the old days, they’d dynamite a cave ceiling and then let naïve students in. When the students returned to class excited by their find of “ancient arrowheads,” the teacher would then school them in the ambiguities of geofacts.
So it’s easy to understand how much it pains the young cubs of contemporary archaeology when they have to listen to their older colleagues, the establishment, say that the entire array of pre-Clovis evidence is a pile of geofacts.
And not all that big a pile either. While all the evidence of Clovis Man would pack a railroad car, according to Vance Haynes, all the good physical evidence of pre-Clovis Man “would fit in a foot locker.”
The dates are a mixed bag, ranging from as far back as fifty thousand years ago to more recent sixteen-thousand-year dates. There are no broad patterns, there are no similarities, and there is no redundancy. So when you look at the individual artifacts themselves, they can be pretty underwhelming.
Add to that the messy business of obtaining dates. Rocks cannot be carbon-dated. The organic material they are found nested in can be, though. But that material can be easily contaminated by rain, by burrowing animals, by time. Plus, radiocarbon dating sounds precise, and the idea of it—that carbon-14 atoms throw off electrons at a metronomically consistent geological pace—is more exact than the reality. Almost since the discovery of radiocarbon dating, scientists have been noting phenomena that cause variations in the regularity of carbon’s internal clock—sunspots, stray comets, the 1945 atomic bombs—such that they require applying a “correction factor.” Thus, for any ancient evidence to be confirmed, the punk rockers of archaeology have to look for affirmation from their elders, the Lawrence Welk orchestra. Worse, the old fogeys, like Vance Haynes and others, are essentially being asked to confirm a theory that overturns their entire life’s work. This combination of murky evidence and professional oedipalism can mean only one thing: academic food fight.
So in prehistoric archaeology there’s a lot of dialogue between the conservative traditionalists and the rebel theorists that, boiled down, typically goes like this:
UPSTART ARCHAEOLOGIST: This is a primitive stone tool that’s sixteen thousand years old.
EMINENCE GRISE: No, it’s not.
UPSTART ARCHAEOLOGIST: Fuck you.
Actually, that’s not much of an exaggeration. In Adovasio’s book The First Americans, he quotes a friend who said, “ ‘If they don’t believe the evidence, fuck ’em’—definitely not scientific discourse but not ill considered, either!”
From its opening line—“ ‘Damn,’ I said.”—Adovasio’s book quivers with the fury of a scolded teenager. His own site, the Meadowcroft Rockshelter I visited in southwest Pennsylvania, has been roundly dismissed by elders who note the existence of nearby “coal seams” (yet another factor that throws off C-14 dating) and groundwater seepage. C. Vance Haynes is among those who have wrinkled their noses at Meadowcroft. On page 216 of his book, Adovasio dismisses him as the “grinch of North American archaeology.” Anyone who has questioned Adovasio’s own site at Meadowcroft is a “gnat.” Every page is dipped in upstart snark.
And no love is lost among the rebels themselves. When a Parisian archaeologist discovered an amazing site called Pedra Furada in Brazil, the initial reports were breathtaking. Besides numerous pieces of pre-Clovis evidence, there were cave paintings said to be even older than the images at Lascaux in France or Altamira in Spain. The Pedra Furada drawings depict hot Pleistocene Era group sex. Brazil’s tourist bureau developed plans to capitalize on the find. Then Adovasio himself came down as part of an expert panel which, sorrowfully, declared it all wrong. Adovasio wrote that he saw nothing but “almost surely broken rocks that had fallen into the rockshelter,” i.e., geofacts. He dismissed the find of “ancient fireplaces” as “nothing more than material blown in from nearby forest fires.”
And so it goes. The entire subfield of pre-Clovis is a tiny shark tank where attacks are constant and the chum that keeps bobbing away from them is the unquestionable piece of evidence that convinces all the skeptics that somebody else was here before Clovis.
Of course, the language of this brawl is academic and Latinate, mostly fought with the manly sesquipedalianisms of science jargon. Here, tree rings are “dendrochronological samples.” A rock is a “lithic,” and a rock that’s clearly been flaked by human hands is “an indubitable lithic artifact.” Bits of stone chipped off to make a tool are “percussion flakes.”
These are the lyrics of the trade, played in the key of high-science formality. And it’s with such swaggering sesquipedalianismo that an entire career of work can be cattily dismissed: “My review has raised doubts about the provenience of virtually every ‘compelling,’ unambiguous artifact,” wrote the archaeologist Stuart Fiedel in 1999 of the most promising pre-Clovis site ever.
The fight over this site—Monte Verde in Chile—is the most notorious in the field. The archaeologist whose work was trashed is Tom Dillehay of the University of Kentucky. He claimed to find fantastic evidence of a pre-Clovis community, a series of huts, one of which was some kind of primitive drugstore, since it held traces of pharmaceutical herbs. He found a tent post still staked in the ground with knotted twine made from Juncus, a reedy marsh plant, or, in the jargon, “the indisputably anthropogenic knotted Juncus.”
In 1997, a team of specialists, including Vance Haynes, visited the site, examined all of Dillehay’s cool evidence, and unanimously approved it. The pre-Clovis line was officially breached. Tom Dillehay was the man. But not for long. Haynes began to waver. Then this Stuart Fiedel, a private-sector archaeologist, wrote a withering dismissal of every single piece of evidence presented.
In his book, Adovasio (who sided with Dillehay on this one) suggested that Fiedel “reserve some space in the State Home for the Terminally Bewildered.” Adovasio whacked Fiedel as “a previously little known archaeologist now working for a private salvage archaeology firm” who “has no field experience in Paleo-Indian sites or complex late Pleistocene or Holocene sites” and “has published one rarely used prehistory textbook but otherwise has no apparent credentials.”
Archaeology’s tribal relations are run by a caste system that goes like this. The Brahmins are the credentialed, tenured professors at known colleges. They publish in peer-reviewed journals. Beneath them are private-sector archaeologists, also known as salvage archaeologists. They might publish in popular journals like National Geographic. But their day work is something different altogether. They determine for, say, a mall developer whether there are any “significant” remains on a piece of real estate slated to become a food court.
“The people in that world earn a good living,” said Richard Burger. “But it is seen as being outside of the intellectual debate because they are so busy writing reports, they don’t really publish in peer-review journals. Their work is sometimes called ‘gray literature.’ ” Below the salvagers are the rank amateurs and hobbyists who often spend a weekend out at some site hoping to find a Clovis point or two to sell on eBay or keep in their special cigar box back home. Below them all are scum, the scurrilous amateur critics who write books such as this one.
Archaeology’s caste system is another facet of the discipline that makes it more amateurish a science than, say, particle physics. How many weekend astrophysicists could write up a report challenging Stephen Hawking that would be widely accepted as truth? When new evidence in, say, particle physics opens up a Kuhnian melee, the folks who rush into the breach tend to be … particle physicists. (There are exceptions, but take the point.) In prehistoric archaeology, though, with its rather elastic sense of membership ranging from well-credentialed academics like Adovasio to salvage archaeologists to slightly bonkers theorists to ranting neo-Nazis—well, all of them can rush right in. And do.
What underlies the mudslinging use of bloated Latinisms as well as the compulsion to make a show of tidy whisk brooms and Euclidean grids is the sense, maybe even fear, that archaeology is not a science at all. There’s a lot of play in the radiocarbon-dating, all the evidence is in dispute, and sure, maybe the elders’ caution can easily be dismissed as a Freudian conflict of interest. But maybe not.
All of this means that the pre-Clovis evidence requires a lot of interpretation, a fact that makes it very easy for personal desire and anxiety to seep like groundwater into that drawer full of cobbles and lithics. As one defender of Dillehay confessed in his own report: “I wondered if, by being too close to these stones for too long, I was building an interpretive sand castle.”
But the sand castle’s been built. From the few lithics, others have begun to tell a new American creation story—about just who pre-Clovis man was, where he came from, how he lived and died.
The sudden appearance of this yarn explains why prehistoric archaeology really isn’t as much a science as an object lesson in just how amateurism can get so amateurish. The story that is getting told is a form of improvisational narrative—tribal storytelling. These stories have less to do with what’s obvious from the evidence than what some of us, anxious rootless Americans, deeply long to hear. It’s time to look closely at the story that’s getting told right now about the earliest inhabitants of this continent. I have a little experience in this field. I know how to jerry-rig a narrative using only a couple of wayward factoids to make it sound just right. It’s something I was born to do. It’s in my royal blood. I am the direct descendant of King Charlemagne.
For most of the 1990s, the sotto voce chatter about pre-Clovis man and his possible identity was little more than politically naughty buzz out on the edge of archaeology. Insiders talked about spear points, DNA, cordage, and some disputed bones, but it wasn’t a story as much as it was narrative tinder, very dry, waiting for a spark.
Which finally flew, one hot summer afternoon in 1996 on the banks of the Columbia River. Some kids were trying to sneak into a hydroplane race, and as they stomped through the muck of a bank, one of them saw a few old bones and then pieces of a skull. The find was quickly passed on to a local forensic expert, a salvage archaeologist who worked out of a converted rec room in his house. He would become the rhapsode for these bones. Divinely, his name was James Chatters.
Chatters released the radiocarbon dating that put the bones back to 7600 B.C. He also described a Cascade point embedded in the hip. This style of Paleo-Indian arrowhead is a long, thin design that would fit right in with the skeleton’s age. So far, so good. Standard ancient skeleton. But then Chatters said something odd, almost nonsensical to those unacquainted with the amateur anthropology rumor mill. He said he didn’t believe this skeleton belonged to a Paleo-Indian at all but rather to “a trapper/explorer who’d had difficulties with ‘stone-age’ peoples during his travels.”
In other words, this skeleton was not merely a non-Indian but a non-Indian who had survived well into Clovis time and then been killed by Paleo-Indians. Suddenly, this skeleton was a victim and the find was a crime scene.
Even as the media tried to make sense of this peculiar story, the Indians demanded the bones, charging that they had to be of Native American heritage. This was how these stories typically unfolded.
In 1990, President George H. W. Bush signed the Native American Graves Protection and Repatriation Act. NAGPRA sought to make amends for the grave robbing and bizarre antics of the previous decades. In the nineteenth century, for instance, the Smithsonian wanted Indian skulls to mount on display. So, quite often after a battle, Indian corpses were decapitated and the heads packed in boxes and shipped back to Washington to be “studied.” The money was good enough that sometimes Indians would see white men—the emissaries of European civilization—loitering around a burial ground, waiting for the mourners to leave so they could dig up Grandma and cut off her head. A few centuries’ worth of desecration of the Indian body is something mainstream history still avoids mentioning. It’s hard for non–Native Americans today to understand the lingering resentment. Try this on: Toward the end of the Civil War in Denver, a group of marauding white men interrupted theatergoers with a mid-show display of fresh Indian scalps—not merely from heads but from women’s vaginas as well. The civilized audience whooped with approval.
As Native Americans asserted themselves, beginning in the 1970s, their movement led inexorably to a congressional act aimed at returning all stolen skeletons to the appropriate tribes. Some estimates put the number of skeletons held in museums at holocaust levels: 200,000.
NAGPRA decreed that all Indian human remains that could be culturally identified were to be returned to the appropriate tribe. Dozens of fights with museums erupted, and still go on to this day. Consider the University of California at Berkeley—a place one might expect to be ostentatiously in favor of returning Native American skeletons. But that would be wrong. Beneath the Hearst Gymnasium swimming pool is a charnel house of “thousands of remains” that Berkeley stubbornly holds on to, fighting all legal efforts to have them restored and reburied.
But these are all fairly recent skeletons—i.e., the last few centuries. Jump back a millennium or two and the skeletons tend not to have been stolen in conflict but discovered at various archaeological sites. These are also being reclaimed and reburied under the NAGPRA law.
In the past, a number of unusual skeletons have been discovered that date long after Clovis but are rumored to possess characteristics of another race of people—different from the Asians who crossed the Bering Strait. These skeletons have long fed the pre-Clovis rumor mill as evidence of some other group of early North Americans who survived the Clovis-era arrival of Paleo-Indians before slowly disappearing. As if to fuel this mill, as soon as such bones are discovered, modern Indians seize them and rebury them. In the gossipy back alley of amateur American archaeology, they are notorious. They are mourned. Let us name them: the 10,800-year-old Buhl Woman found in Idaho in 1989, the 7,800-year-old Hourglass Cave skeleton found in Colorado, the 7,800-year-old Pelican Rapids Woman skull and the 8,700-year-old Browns Valley Man, both found in Minnesota—all reburied.
So, the Native Americans in Washington State immediately suspected this familiar talk about Kennewick’s origin was merely a political tactic to end-run the straightforward requirements of NAGPRA. The scientists, especially when it concerned these ancient skeletons that they wanted desperately to study, counter-suspected political correctness run amok. The conspiracy theorists, not surprisingly, saw a conspiracy. Then, something else happened. The issue quickly got caught up in energy politics.
At the time of the discovery of Kennewick, the Umatilla Indians were working with the Clinton administration to dispose of some outdated chemical weapons (WMD, as we say nowadays). The federal government wanted tribal approval on this difficult matter. And, by the late 1990s, the Umatilla were also a federally recognized tribe and had a casino, which meant they had political and financial clout and couldn’t easily be kicked around. So, when they screamed for the bones, the Clinton administration jumped. Chatters and the group of scientists who gathered around him calling for an open inquiry into the skeleton were stunned by what happened.
Bruce Babbitt, the then secretary of the interior, ordered that the US Army Corps of Engineers seize the bones. In the meantime, to “stabilize” the site where the bones were found, the Army choppered in five hundred tons of riprap and buried the bank. The archaeological site was protected by being destroyed.
To the scientists, this wasn’t just politics; it was medieval obscurantism. This was the equivalent of forcing Galileo to recant and locking him in his room. Which is how the entire drama would play out in the courtroom—a fight between Native Americans trying to respect their elders and secular scientists defending their right to open inquiry.
But this time, there was one difference, one word that totally changed the story in the pop-culture telling outside the courtroom. When Chatters first examined Kennewick in his rec room, he looked at the skull and then deployed a single word to the media to describe what he saw: “Caucasoid-like.”
The narrative tinder suddenly exploded in flames, and from the fire arose a new and wild story: A Caucasoid man, who was among the First Americans, was murdered by genocidal newcomers, Mongoloid invaders coming across Beringia after the last ice age.
Throughout the theories and quarrels of this prehistory, there is a strange kind of recapitulation going on. Every theory propagated about the European conquest of the Indians after Christopher Columbus seems to have its doppelganger in the pre-Clovis era. Just as American Indians were the victims of genocide in the colonial period, so it seems were the early Caucasoids at the hands of paleo-Indians. Some theories say that the early Caucasoids were wiped out by germs, a recapitulation of the account of Indians and smallpox-infected blankets, which has become a near parable in American history. In this way, the scientists could even claim that the Indians’ attempt to take control of the Kennewick skeleton was simply the evil twin of nineteenth-century grave robbing, and haven’t we all had enough of that?
To bring the entire fight to a level of absurdity that marks it as a truly American event, the Asatru Folk Assembly—a neo-Norse movement that claims to represent the “native European” religion—also claimed Kennewick’s bones. The neo-Norsemen argued that they were the nearest tribe related to Kennewick Man and that under the law of NAGPRA they should be given the bones for reburial. The court did not give them Kennewick but did allow them to perform funeral rites over his bones. And so a year after that hydroplane race, big hairy blond men wearing horns and garish furs performed the Norse burial ceremony in Washington State for their mourned errant ancestor.
Does race exist? Of course it does. We see it every day. Guy steals a purse, and the cop asks, What did he look like? And we all easily say: He was a six-foot-tall black guy, or a five-and-a-half-foot-tall Asian man, or a white guy with long red hair. As a set of broad descriptions of how people look, race exists.
If you were to look at me, you would easily categorize me as Caucasian. I’m the ruddy sort that burns quickly, with reddish hair now shading into white. Most people hazarding a guess might say Scots-Irish, which is what I have always said. Just to be sure, I once submitted my DNA to see what the incontrovertible scientific evidence might show. The result was surprising: I carry the DNA marker found in great abundance among the Fulani Tribe of contemporary Nigeria.
Sure, maybe the marker is about as significant as my Charlemagne genes. On the other hand, that very Nigerian coast is the tribal location where many slaves were captured and held in the notorious slave castles until traders’ galleys could transport them to American ports. The main harbor that received more slaves than any other on the eastern seaboard was Charleston, South Carolina. My mother’s family has lived there nearly three hundred years. Maybe I have a Thomas Jefferson problem.
Since I had my blood work done, my nephew Chance Algar started poking around the distant family tree, and one Christmas he called me over to his computer. He showed me my great-great-grandfather—the man for whom I am named—in the 1860 census. But that man does not turn up on the census rolls in 1870. Presumably, he died for the Confederate cause. His widow, Mary, along with their four children, moved back in with her parents. In 1870 this older couple is marked on the census as “colored,” as are the neighbors on their street. But Mary and her children are marked white, their designation in the previous census. In all likelihood, Mary and her parents were of mixed race. But Mary could pass for white. Had the census-taker penciled in a single letter “c” on that form, how different the genetic trajectory of my then-toddler great-grandfather might have been. But there it is: the probable origin of my genetic marker. I am one-sixteenth African-American, or in the pseudo-scientific jargon of that time, a mustefino or a hexadecaroon.
Yet it’s no longer apparent in the way I now look. I am Caucasian as surely as my Fulani cousins are black: because race is a set of visual cues, mainly skin shade but also nose shape, eyelid folds, cheekbone prominence, etc. We hold these vague blueprints of race in our heads because, as primates, one of the great tools of consciousness we possess is observing patterns in nature. It’s no surprise that we’d train this talent on ourselves.
The notion of race as an unchanging constant through time is an accepted truth as old as the Bible. When Noah’s Flood receded, the three boys Japheth, Shem, and Ham went out into the world to engender, respectively, white people, Semites, and all others. This doesn’t quite shake out into the later notions of white, black, and yellow, but you get the idea. The terms are still with us. The early word “Shemitic” settled down to become “Semitic.” And, among amateur chroniclers writing in the ponderous style of the town historian, it’s not hard to find references to the “Hamitic race” as a way of saying “black folks.” Japheth never became a common adjective, perhaps because of those unwieldy consonants. More likely, it’s because whites appointed themselves the Adamic task of naming the other races. It was not until the Age of Reason that scientists tried to figure out empirically what race meant and how it came to be. The signal year was 1776, with the publication of a book, On the Natural Variety of Mankind, by the German biologist Johann Friedrich Blumenbach.
In his day, Blumenbach’s theory had a certain symmetry that made it the very model of good science. These days, his theory seems insane. He argued that Native Americans were the transitional race that eventually led to Asians. (Don’t try to work out the geography of this, it will make your head explode.) And another group—which Blumenbach simply conjured from a faraway people, the “Malaysians”—evolved over time to become Africans. (Again, if you’re puzzling out the geography, watch your head.)
At the center of all this change was the white race, which was constant. Blumenbach believed darkness was a sign of change from the original. All of mankind had fallen from perfection, but the darker you were, the farther you had fallen. As a result, the best way to locate the original Garden of Eden, according to Blumenbach, was to follow the trail of human … beauty. The hotter the women, the hunkier the men, the closer you were to what was left of God’s first Paradise. Here’s Blumenbach, explaining the etymology of the new word he hoped to coin:
I have taken the name of this variety from Mount Caucasus, both because its neighborhood, and especially its southern slope, produces the most beautiful race of men, I mean the Georgian.
Blumenbach’s theory is totally forgotten today by everybody (except maybe Georgians), but this single word and the oceans of misconceptions that have sprung from it are probably owed to a single trip Blumenbach made to the area, where a local girl gave him a lusty wink. The word itself is lovely. Say it: Caucasian. The word flows off the tongue like a stream trickling out of Eden. Its soothing and genteel murmur poses quite a patrician contrast to the field-labor grunts of the hard g’s in Negroid and Mongoloid. Caucasian. When you say it, the exotic isolation of those mountains intimates a biblical narrative. You can almost see it when you say it: the early white forebears walking away from paradise to trek to Europe and begin the difficult task of creating Western Civilization.
Ever since Blumenbach launched this word two and a half centuries ago, the effort to pin down the exact and scientific meaning of race has never ceased and has never settled into any undisputed categories. Even today, the US Census is little more than an explosion of ethnic agony that arrives every ten years like constitutional clockwork.
The number of races has expanded and contracted wildly between Blumenbach and now, depending on the mood of the culture. The basic three have gone through scores of revisions, growing as high as Ernst Haeckel’s thirty-four different races in 1873 or Paul Topinard’s nineteen in 1885 or Stanley Garn’s nine in 1971. Today, we nervously ask if you’re white, African-American, Native American, Asian, Hispanic, or of Hawaiian or Pacific Islander descent.
But it wasn’t that long ago that the question would have turned upon races only our great-grandfathers would recognize. Let us mourn their passing: the Armenoids, the Assyroids, the Veddoids, the Orientalids, Australoids, the Dalo-Nordic, the Fälish, the Alpines, the Dinarics, the Fenno-Nordic, the Osteuropids, the Lapponoids, the Osterdals, the Cappadocians, the Danubians, the Ladogans, the Trondelagens, and the Pile Dwellers.
In the meantime, science has made its discoveries. The mystery of race has been solved. For the longest time, the answer was stymied by a contradiction. Surely skin tone had something to do with colder climates creating paler shades, but then why weren’t Siberians as pale as Swedes, and why were Eskimos as dark as equatorial islanders? The answer was announced, but it’s so tedious hardly anyone noticed.
Skin pigmentation changed to regulate the amount of Vitamin D3 manufactured just beneath the skin in response to sunlight. This is the theory of Professor Nina Jablonski, a paleoanthropologist with the California Academy of Sciences. So when the first dark inhabitants moved into Scandinavia, they confronted scant local resources—and almost no external sources of Vitamin D3. Their kind quickly selected for paler children whose light skin would manufacture enough Vitamin D3 to keep them healthy. Meanwhile, Eskimos arrived in the Arctic dark-skinned. The local cuisine of seal and whale is rich in Vitamin D3, so the skin was never summoned into action. Evolution has one big rule: If there’s no pressure on the system to change, then it doesn’t bother. So Eskimos remained dark.
When we look at the different races, according to Jablonski’s theory, what we’re actually seeing is not “superiority” or “good people” or “race.” All that we are seeing, the only thing we are seeing when we look at skin color, according to the science, is a meandering trail of Vitamin D3 adaptation rates.
Science prefers to confirm its newest findings with the newest tools. Fingerprinting is no longer the gold standard of evidence now that DNA testing is the absolute solid proof of guilt or innocence. In anthropology, the cutting-edge techniques come with gleaming names—Optically Stimulated Luminescence, Electron Spin Resonance Dating, and Accelerator Mass Spectrometry. These are the devices that are confirming pre-Clovis dates in ways that make radiocarbon dating look like counting tree rings. By the time we figure out how they are flawed, of course, our prejudices will be so well muddled among the tentative facts that they will be as inextricable as ink from milk.
According to the revolutionaries heralding pre-Clovis, it hardly matters since so much other modern proof is appearing. New lab tests reveal that Native Americans apparently have a signature strand of very old DNA known as Haplogroup X. The only other large population on the earth carrying this genetic marker is Europeans. The suggestion is that there must have been intermarriage before Columbus, possibly before the last ice age. Moreover, now that the Iron Curtain has fallen, archaeologists have been able to do more digging in Siberia, where they expected to find Clovis points or something like them. But they haven’t. This absence, as well as the presence of Haplogroup X, has led some people to theorize that while Clovis man might have crossed over thirteen thousand years ago, at the end of the last ice age, he would have encountered someone already here—someone possessed of the X gene as well as the Clovis toolkit.
Who might these people have been and where might they have come from? One prominent theorist with an answer is America’s chief archaeologist at the Smithsonian Institution. A big bearded bear of a man, Dennis Stanford could pass as a Norse king from some other time. Stanford has struggled with the mystery of why Clovis points don’t show up in Siberia. He notes that they resemble the early work of Solutrean culture. The Solutreans were prehistoric people who lived in modern-day France and Spain some eighteen thousand years ago. They are perhaps most famous for being the possible artists who painted the horses of Lascaux and their own hands on the walls of the Altamira Cave. Stanford argues that their toolkit, which included stone points, looks like a predecessor to the Clovis style.
“There must be fifty or sixty points of comparison,” he has said.
He believes that these proto-Europeans must have been intelligent enough to make watercraft. Hugging the coast of the glacial crescent of the northern Atlantic, they followed what is called the “kelp highway”—brimming with food—and sailed away to a new land.
Other scientists are providing even more evidence that seems to corroborate these general ideas. Several anthropologists have daringly revived the argument that examining skull shapes can reveal ethnicity. The approach was pioneered by Douglas Owsley, also now at the Smithsonian, and his partner Richard Jantz at the University of Tennessee, two scientists who have put together collections of measurements, described by Newsweek as a database of “2,000 or so profiles” that “consists of some 90 skull measurements, such as distance between the eyes, that indicate ancestry.” They have developed software that allows them to input a bone’s measurements and the output is “ethnicity.”
Among their fans and followers, there is talk of some of the peculiar skeletons found over the years. An ancient body, known as Wizards Beach Man, found at Pyramid Lake, Nevada, in 1978, was determined to be possibly of “Norse” extraction and to have “no close resemblance to modern Native Americans.” Another skeleton, known as Spirit Cave Man, was found in Nevada in 1940. His bones date to 7450 B.C., and when his skull measurements were run through the database, out spat a finding of “Archaic Caucasoid.”
Once again, there’s Blumenbach’s word. Only this time it’s got that “-oid” ending. What is the difference between Caucasoid and Caucasian?
“Caucasoid sounds more scientific,” said University of North Carolina anthropologist Jonathan Marks, laughing. Otherwise it has no more meaning or significance than Blumenbach’s original. Caucasoid is a magnificent piece of pure Star Trekery, a word meant to sound all clinical and precise, even nerdy. But the word is a rhetorical Trojan Horse. Its surface meaning suggests something scientific, respectable, and learned, when in fact what we really hear are the connotations lurking inside, long-suppressed intimations of superiority, exceptionalism, and beauty.
The court fight over Kennewick Man was resolved in favor of the scientists—in part because this is America and who can be against “open inquiry”? In the popular market of ideas, though, the courts also legitimized the story of the Caucasian man who came to this continent as the Authentic First American and whose bones survived the millennia to report the truth. It is the story that has gotten told this last decade about this hundred-century-old man that is arresting in its perversity. It begins with his name. Does anything sound more European, positively British, than Kennewick? Native Americans had dubbed him the “Ancient One.” But it didn’t take. The mass media, which follows the meandering will of the popular mob, could sense where this story was trending, and so they ran with “Kennewick.” Isn’t that a suburb of Essex or the other airport in London? Perhaps not so ironically, the origin goes back to a local Indian chief named Konewack, whose Native American name was anglicized to Kennewick after the railroad workers moved in. As if it were a trend, the very year after Kennewick, more ancient bones were found on Prince of Wales Island. This skeleton was quickly declared to be “Prince of Wales Man,” making it seem like the Stone Age forebears of the Saxon kings fancied the Pacific Northwest as a dandy vacation spot.
In the few years after Sir Kennewick’s discovery, his life was described and depicted in all the leading magazines. One writer on the subject, Sasha Nemecek, confessed that when she looked at the evidence “the misty images of primitive explorers evaporate” and now “I suddenly picture a single artisan spending hours, perhaps days, crafting these stone tools” whose “workmanship is exquisite, even to my untrained eye.” For the article, an artist rendered images of what Kennewick and his ilk looked like. He’s an average-height white man with round eyes and some sexily tousled long brown hair. He is wearing a pair of long slacks sewn with a fetching seam straight down the leg to his ankles and a big animal coat (with long sleeves)—an outfit that looks like it comes from Ralph Lauren’s Flintstone collection. If it weren’t for the spear in his hand, you might mistake him for an English professor at Bennington, but in fact he’s the “First American.”
And his bride has the complex toolkit of her time, not to mention a nice Ann Tayloroid dress and a haircut that presages Jennifer Aniston by nearly ten millennia. She has thoughtfully shaved her legs for the artist, the better to see her lovely Caucasian skin.
Where did these pictures appear? Scientific American.
Kennewick was instantaneously described with words that launched him millennia ahead of his primitive enemies, the Paleo-Indians. He was, as Chatters had said from the beginning, probably an individual “trapper/explorer”—two words that, together, suggest that Kennewick was practically an advance scout for Lewis and Clark. And these words imply degrees of complex rational thinking, especially when set beside a horde of “Stone Age peoples.” Other articles painted beautiful scenes of Kennewick as the “strongest hunter in his band.” Paleo-Indians were still mucking around in “tribes,” while Kennewick traveled with a “band,” which “usually consisted of immediate and extended family members with several bands ranging over the same general territory.”
Family? Absolutely, that’s why Kennewick lived “a good life. He had a mate, and two of their four children still were alive. He still could hunt, though he relied on his dog to bring the game down. And young men in his band still asked his advice, though lately, his sister’s mate was showing signs of impatience, always wanting to do things at his time, in his way.” Food was important. “To keep up his strength, he and his band dined on rich, lean roasts and steaks.” Kennewick is naturally on the Atkins diet. No type II diabetes from obesity for Sir Kennewick.
“It’s natural that people of that time would trade with distant bands,” the article goes on to say, although, as a caution, it adds, “there’s not a lot of physical evidence to prove it.” But proof isn’t really necessary when you have adverbs such as “likely” to work over: “To protect his feet, Kennewick Man likely trudged the hills and valleys of Eastern Washington and Oregon in sandals made of sagebrush bark.”
Within the first few years of his reappearance, Kennewick Man received major, lengthy profiles in nearly every major magazine in North America, from Newsweek, the Economist, and Natural History to the New Yorker, Maclean’s, and Discover. In all of these articles, sourcing is a good place to invisibly move the story in one direction or another. For the confirmation of Kennewick’s skull shape, the articles most often cited two people. One was Catherine J. MacMillan, an acquaintance of Chatters and a fellow private-sector archaeologist. Most articles avoid naming the salvage archaeology company she runs, perhaps because it lacks a certain gravitas: the Bone-Apart Agency. The other source is Grover Krantz, who was a professor at Washington State University. He, too, has an unmentioned gravitas problem, as a pioneer of “Sasquatchery” and a man often described as the “only true scientist to throw his hat in the ring”—the ring being the hunt for Bigfoot. Krantz was the physical anthropologist who suggested that believers kill a Sasquatch and bring in the corpse in order to prove his existence. Krantz died in 2002 and donated his skeleton to the Smithsonian to show he had no fear of having the flesh boiled off his bones so that they could be mounted for display.
Kennewick Man “seems to have been a tall, good-looking man, slender and well-proportioned.” Blumenbach’s notion of superiority as beauty is never really behind us. Also: “Some nearby sites contain large numbers of fine bone needles, indicating that a lot of delicate sewing was going on.” Here’s a classic case of journalistic nudging. Fine bone needles found in nearby sites. Could they have belonged to the marauding hordes of Paleo-Indians? Instead here’s the shove: “Kennewick Man may have worn tailored clothing.” Dig the “may.” But also swish that other word around on your connotative palate. “Tailored.” Feel the force, tugging us in a certain direction. Then: “For a person at that time to live so long in relatively good health indicates that he was clever or lucky, or both, or had family and close friends around him.” Good health, clever, family, close friends—a veritable claret of connotative complexity.
And these are the elegant accounts which struggle to keep the story contained inside the scientists’ own cautious terms. From there, the implications of Kennewick quickly became insinuated in current fashions of political opinion. Here’s the National Review writing about “the growing suspicion among physical anthropologists, archaeologists, and even geneticists that some of the first people who settled in the New World were Europeans.” Note how a tentative resemblance of skull shape, “Caucasoid-like”—always hedged by the scientists—has quickly settled into declarative certainty: “were Europeans.” The politically obvious conclusion is also clear, as the writer continues: “An important part of American Indian identity relies on the belief that, in some fundamental way, they were here first. They are indigenous, they are Native, and they make an important moral claim on the national conscience for this very reason. Yet if some population came before them—perhaps a group their own ancestors wiped out through war and disease, in an eerily reversed foreshadowing of the contact Columbus introduced—then a vital piece of their mythologizing suffers a serious blow. This revised history drastically undercuts the posturing occasioned by the 500th anniversary of Columbus’s 1492 voyage.” Once you step away from the magazines and books, the story leaches into the poisonous domain of online discussions, where one can easily find comments like this one from shmogie1 on the alt.soc.history board: “Kennewick man is older than any known N/A [Native American] remains, and appears to be much more European than N/A, so your people stole the land from my European ancestors who were here first.” And then you finally drift on down to the neo-Nazis, always good for a pithy quote. The extermination of the American Indian, said former Klan Dragon and part-time Nazi Louis Beam, was just “white people” getting revenge.
Most of these accounts conclude with a hushed imagining of how that spear point got into the hip. Because Kennewick Man’s skeleton was found in a riverbed, the writers surmise that he died there, as in this sentence: “He may have perished alone on a fishing trip, far from his family.” Cue Samuel Barber’s Adagio for Strings.
In these stories, the Indians are typically ignored or they simply move about as a supernumerary horde summoned onstage to throw the Cascade point. They have no friends or family. Might they have also been clever and scored well on their SATs? It’s never mentioned, or maybe there just weren’t enough bone needles to draw such a conclusion. But the hedge in Kennewick’s favor is constant, each detail slightly pushed toward revealing a man who was smart and carried complex tools and had damn sophisticated taste in clothing. He hunted and ate well and had good bone structure. He was surrounded by friends and family, i.e., that intimacy of culture that would lead to the abandonment of nomadism, the invention of agriculture and society, the stable foundation that would lead us inexorably toward Western civilization. Which, in turn, would bring Kennewick’s Caucasoid-like descendents back to America to find him and tell his story.
The story of a European presence here in America prior to the Indians would be a truly novel tale if it hadn’t already been told so many times. The number of theories holding that Native Americans were either latecomers or actually Europeans who’d set aside their history for loincloths is impressive. We—and by we, I mean white people—have been getting Indians wrong from the git-go. Remember where the word “Indian” comes from, for starters. But even the earliest depictions of Indians simply used European bodies and faces with a few feathers added.
Western Europeans were stunned that the New World had so many people already in it. How could these primitives have gotten here first? They must be … us! Theories abounded. Some British savants thought Indians were covert Welsh families who’d slipped over on their rafts, the cunning little demons. Others theorized that the Indians were the lost people of Atlantis. There was a whole host of arguments that Indians were Jews. During the colonial era, the chief rabbi of Holland, Menashe ben Israel, theorized that all Native Americans were descended from the Lost Tribes of Israel, and the theory was confirmed by a 1650 book entitled Jews in America or the Probabilities That the Americans Be of That Race. Mormons continue to believe this account of Native American origin, holding that the sons of Lehi sailed to the Americas around 600 b.c. and forgot their knowledge of the Torah. They reverted to a state of savagery and their descendents scattered among the plains and throughout the two continents. Thus, all Indians are essentially: Jews Gone Wild.
Most Americans rarely saw images out of books such as the rabbi’s. Rather, the most common mass media image was found on the coins in your pocket—and for most of American history, those Indians looked astonishingly European, if not Roman imperial.
In 1914, on a ten-dollar gold coin, it was still possible to see in the face of the Indian the wavy blond Nordic princess of that dream. Then, at last, the famous buffalo nickel appeared on the eve of World War I, the first Indian image that looked like a Native American. It’s unlikely Chatters knew the full depth of this history when he asked a local artist to take the Kennewick skull and reconstruct the face.
Well, if only that were what he did. Chatters didn’t just hand the skull to someone and ask him to reconstruct the face. Instead, he had an epiphany, as he explained once, right at home: “I turned on the TV, and there was Patrick Stewart—Captain Picard, of Star Trek—and I said, ‘My God, there he is! Kennewick Man.’ ” And not long thereafter, the artist’s bust appeared on magazine covers across the country as a piece of sculpture that so resembled Picard you found yourself looking to see if he was holding a phaser.
Forensic reconstruction is a very iffy “science.” The problem is that the features we look to for identification are fleshy ones—ears, nose, and eyes—and those are the most difficult to infer from a skull. By the way, I and every other writer call Kennewick’s head a “skull.” The implication is that it was found whole. But in fact, it was found in parts. Chatters pieced it together. When other government experts put the pieces together, they built a skull whose dimensional differences from Chatters’s version were deemed statistically significant. Again, at every stage of this story, the bits and pieces—in this case, literally—get pushed toward the Caucasoid-like conclusion.
Reconstruction is more art than science or, with its stated success rate of roughly 50 percent, about as good a predictor as a coin toss. Consider what Chatters did: By making Kennewick perfectly resemble one of the most famous pop-culture Brits of our time, he let the visual cues confirm his finding without ever having to once again repeat the term “Caucasoid-like.” Add to that the fact that leaving the clay gray-colored made it easier for the brain to fill in the skin tone. Chatters has revealed that he suggested to the artist that he not include the “epicanthic fold” of the Asian eye since leaving that out would be “neutral.” Plus, by leaving the sculpture bald, the artist produced a kind of featureless mannequin whom anyone can dress up with the hairstyle, eye color, and skin tone drawn from our deepest racial closet.
Kennewick’s skull is always described as “narrow, with a prominent nose, an upper jaw that juts out slightly and a long narrow braincase.” This description is often phrased this way: dolichocranic and slightly prognathous, marked by a lack of an inferior zygomatic projection.
Such sesquipedalianismo. Yet here’s the problem with looking at those vague features and declaring them “Caucasoid”: We don’t really know what people’s skulls looked like ten thousand years ago. We only have a few, like the pre-Clovis points, so it’s reckless to draw any conclusions. Skull shapes, like skin color, can change much more quickly than we think, especially if there’s been traumatic environmental change.
Franz Boas, the legendary anthropologist from the turn of the last century, is most famous for debunking a lot of skull science in his time by proving that the skulls of immigrant children from all parts of the world more closely resemble one another than they do their parents’. Rapid dietary shifts can cause major structural changes in skeletons—just ask the average Japanese citizen who has shot up four and a half inches in height in a single generation, or the average American man who has packed on an extra twenty-five pounds since 1960. The truth is that there exists no coherent history of skull shapes back through time, so to say a ten-thousand-year-old skull resembles a modern white guy skull is to compare apples with raisins.
In time, Chatters tried to calm the storm over his own remarks. He had repeatedly said things like this: Kennewick Man “could also pass for my father-in-law, who happens to be Scandinavian.” Then one day he was suddenly insisting: “Nobody’s talking about white here.” His contradictions are maddening. At one point, Chatters explained: “I referred to the remains as Caucasoid-like.… I did not state, nor did I intend to imply, once the skeleton’s age became known, that he was a member of a European group.” Afterward, he offered writer Elaine Dewar a coy aside: “I say you can say European. Who can prove you wrong?”
He insisted that he meant that the skull simply didn’t resemble the classic “Mongoloid” features of Asia. He said Kennewick could have been Polynesian or even ancient Japanese. It turns out that those vague Caucasoid features are also found in the Ainu people of prehistoric Japan, as well as other places outside Europe.
Don’t be confused here. The scientists themselves who fling around words like “Caucasoid” are the very ones who also admit that the “Caucasian” skull is found everywhere. That’s right. Everywhere. This Caucasian skull shape—they will admit—is found all over the planet. For example, another ancient skull always brought up alongside Kennewick’s is a female skull found in Brazil. Nicknamed Luzia, the skull was analyzed in a report that found a resemblance to skulls seen among early Australians, bones found in China’s Zhoukoudian Upper Cave, and a set of African remains known as Taforalt 9.
So we’ve narrowed the source of this Caucasian skull to Australia, China, and Africa. Huh? Another study, of an ancient skeleton known as Spirit Cave Man, narrowed down his skull shape origin to: Asian/Pacific, the Zulu of Africa, the Ainu of Japan, the Norse, or the Zalavar of Hungary. Just to add to the confusion, in 2009, an ancient skeleton, one thousand years old and identified as “Incan,” was dug up in … Norway.
What conclusion can be drawn from finding Caucasian skulls in Asia? Or finding African skulls in Brazil? Or finding Polynesian skulls at the continental divide? Is it that these “groups” traveled a lot thousands of years ago, or that skull shapes change radically and quickly over time? Of course, it’s the latter, and some anthropologists have known it for some time. In the early twentieth century, Harvard anthropologist Earnest Hooton documented the wide variety of skull shapes he found among ancient Native Americans. (That’s right. This early scientist who spoke the uncomfortable truth about race was named Earnest Hooton.) “He studied Native American skulls from pre-contact all the way to the eighteenth century and he sorted them into cranial racial categories,” said Jonathan Marks. “He called them ‘pseudo-Australoid’ and ‘pseudo-Negroid’ and ‘pseudo-Mediterranean’ because they had those features. He was smart enough not to say, Well, I guess these people encountered a stray Australian aborigine on his way to Colorado. Clearly he recognized that there was considerably more diversity in early Indian skulls than he was used to seeing. And that’s the point in all this: Once you’ve started to racialize those variations, you’ve already given your answer.”
What this suggests may not be that Africans and Mongoloids and Europeans were storming the American shores ten thousand years ago as much as that in any one group at any one time, you will find all kinds of anomalies. In fact, among many nineteenth-century images of Native Americans, you can easily find lots of paintings and even photographs of Indians whose skull shape is precisely that of Captain Picard with his “Caucasoid” and dolichocranic good looks. The famous image of Chief Whirling Thunder (Google it) looks more eerily like Jean-Luc Picard than Kennewick does.
The Center for the Study of the First Americans, at Texas A&M University, is a clearinghouse for pro-Caucasian theories of early America. The center publishes a manly newsletter, Mammoth Trumpet. There one can find a set of arguments that inspire a kind of sorrow and pity. The director, Dr. Robson Bonnichsen (who, like so many of these academics, has the look of a Norse king with a big bushy beard), commonly says things like this: “We’re getting some hints from people working with genetic data that these earliest populations might have some shared genetic characteristics with latter-day European populations.” Maybe he doesn’t know that he’s the direct heir to King Charlemagne?
What makes the claim all the more paltry is that once you start reading about the European connection to pre-Clovis man here in America, you can’t help but notice that the same essential story is getting told in other, separate fields—such as the story of when the first European evolved or when early ape creatures crossed over the line leading to humans. All of them make claims that have the contours of the same fight—the revolutionaries challenging the traditionalists, all of them finding a way to shoehorn Europeans into a story, with hints of superiority and beauty. In so many of these fights, you can find the same kinds of amateurs making the same mistakes, arriving always at the same conclusion: that European development and civilization are somehow separate from the proletarian evolution of the rest of the human race.
For instance, the current theory about the beginning of mankind—the Out of Africa theory—states that an early pre-human, Homo erectus, evolved into Homo sapiens, who then left Africa some one hundred thousand years ago and eventually evolved into the modern peoples of the world. But there is a small contingent of rebel theorists—the “multiregionalists”—who hold that it was Homo erectus who spread out to various locations, in each of which he developed into a particular transitional hominid. In Asia: Peking Man. In Southeast Asia: Java Man. And in Europe: Neanderthal Man. Each of these specimens would eventually evolve simultaneously into Homo sapiens.
According to the rebels, there was some gene mixing at the margins of these separately developing species, to preserve the general hominid ability to interbreed. It’s a serviceable theory that manages to keep all mankind barely in the same species while creating an intellectual space for racial differences and European uniqueness. It is the “separate but equal” theory of physical anthropology.
As theories go, however, multiregionalism can be pretty slippery. For the longest time DNA tests revealed that Neanderthal Man made no direct genetic contribution to modern man. But recently researchers in Europe—several of them sporting Viking beards—discovered that there was a genetic connection. Multiregionalists labor mightily to keep Neanderthal Man in the picture, arguing that there had to have been some sex among the different humans and that the evidence is with us. One of the arguments is that my big nose (as well as those great beaks on Jews and Arabs) is telling evidence of Neanderthal genes. That’s the theory of Dr. Colin Groves, whose very dolichocranic skull sports the requisite Nordic beard.
Remember how Neanderthal Man used to look—the ruthless brute of comic books, the near knuckle-dragger who exited around the time of the Last Glacial Maximum? Well, he’s had a complete makeover. Frank Frazetta’s neckless, club-swinging primates are now key players in the unique European formation of modern-day Caucasians, so they’ve put down their bludgeons and picked up some complex tools. One DNA study suggested that Neanderthal Man had red hair, practically Scandinavian. He’s gotten a haircut, Botoxed the beetling brow, and in the museums the Neanderthal models have replaced their murderous scowl with a pensive, more Rodinesque expression. One current display in Europe includes a cute Neanderthal boy with big eyes and the Broadway musical hair of a Dickensian street urchin. Like his dad, he’s had his Neander-snout bobbed into something that could easily develop into a tastefully trim WASP nose, and he holds a questioning expression that instinctively makes you quietly pray that he gets that ring to the volcano or it’ll spell doom for Middle Earth.
There are so many of these Euro-centric theories where the key moment of development that “makes us human” somehow occurred in France or Germany that I began to collect them, like baseball cards. Those cave paintings in Lascaux and Altamira are often held up as the threshold event revealing “abstract” thought, which made us truly human.
My personal favorite, this week, takes us all the way back to the apes. A primate specialist in Toronto named David Begun holds that he has found “the last common ancestor to the great apes”—i.e., the notorious missing link. Where?
In Europe. His theory is that African apes crossed into Europe, picked up those civilizing traits that would eventually lead to humanness, and then slipped back to the Dark Continent just under the deadline for their Out of Africa journey. A few years back, another ape, excavated near Barcelona, Spain, was heralded as further proof of Begun’s theory. The researchers remained tight-lipped about what it all meant, but popular outlets found ways to get the point across, such as this sentence in a CBS News report: “The researchers sidestepped a controversy raging through the field by not claiming their find moves great ape evolution—and the emergence of humans—from Africa to Europe.”
Every once in a while, one of the science magazines will ask mechanical engineers to consider the not-so-intelligent design of the human body. This chestnut of an article usually tells us that with a little more webbing in the toes we’d be great swimmers or that with slightly fleshier Yoda ears we’d be able to hear whispers a block away. These pieces usually make some reference to the naughty observation that only a third-rate architect would run “a toxic waste line through a recreation area.” We also learn that the spine is really a bit of jerry-rigging for an upright primate. A tube of flexible cartilage would result in far fewer deaths and better protection for the core piping of the central nervous system. Add to the list that our knee joints aren’t much to speak of, that our vision is actually fairly limited, and that our hips are oddly narrow (making childbirth nightmarish). If our shanks were longer, we wouldn’t be writhing hobblers when we ran. Instead we’d resemble gazelles or, more likely, ostriches. Still, we’d be awesomely fast. But it never worked out that way. The body—with its Rice Krispies knees, stumpy tibia, and dainty ears—has merely gotten us by all these eons. So, here we are, like it or not, with a chassis that’s … just fine, as is. We wish we didn’t have to spend half our culture’s resources on patching its grotty design, but what can you do already?
Our brain is like this too. Its perceptions are just as flawed in understanding the world as our body is in navigating it. Our species has a hard time wrapping its brain around this fact because we’ve devoted the last five thousand or so years of written history to convincing ourselves, through intense repetition, that we are the very best there is, the Xerox copy of an omnipotent deity. We are made in His image because nearly every Scripture and Holy Writ on earth includes that claim.
We can’t shake the idea that the brain is special, the center of self and the storehouse of being. We want it to be constant, Gibraltarian, immortal. But just as we have come to terms with the idea that the human body is not the fixed, instantaneous creation of a celestial architect (or that the universe is not a massively static amphitheater), the same can also be said of our glorious brain.
Evolutionary science has shown us that the brain is a patched-up jalopy—an improvised, makeshift do-over. The brain is reliable enough to get you to the next place, but when you finally look under the hood, it’s a little shocking to find a collection of barely successful workarounds and ad-hoc fixes using what appear to be the cerebral equivalent of baling twine, chewing gum, duct tape, and haywire. The reptilian stub got a big makeover during the mammalian period, with an overlay of the new limbic portion of the brain, and then later the more humany stuff known as the neocortex got plopped on top of that—each a series of fixes and patches of the previous networks. For instance, two parts of the brain that evolved with Homo habilis (literally, “handy man,” because he was the first tool-making hominid) were the Broca’s and Wernicke’s areas, both crucial in putting things together to make tools. These parts of the brain evolved about two million years ago. Then, about one hundred thousand years ago, scientists now theorize, these tool-making portions of the brain were hot-wired and hijacked to form the centers of the complex human language you speak every day. That’s how evolution works; parts get re-adapted for new uses (“exapted,” in the jargon), or useless bits lying around (“spandrels,” in that same jargon) suddenly get appropriated to new uses. Ancient chewing bones in the jaws of reptiles got taken over a good while back and now serve as the scaffolding of the middle ear—the incus, malleus, and stapes—that allows us to hear.
Evolution did not set us on a trajectory toward the perfect brain, the best possible brain, or even, arguably, a decent brain. Rather, we got the amateur version, the unendingly fiddled-with version, a flawed instrument just good enough to get us through to reproductive age. After that achievement, evolution occurs (or not) without a central mission, which might explain the onset of loopy eccentricity in middle-aged aunts and uncles.
Had that evolution failed, we would have gone the way of the Neanderthal or the Australopithecus, visible only in the fossil record. But the brain shambled its way through to right now. And so we live. Scientists studying the early brain have determined that on the savannahs of Africa, we developed thousands of shortcuts in the brain to gain a quick and usually accurate depiction of reality. Our ability to make snap judgments was very handy—evolutionarily. We oversimplify the world we see—and take shortcuts in the viewing of it—in order to make quick sense of it. These shortcuts are called “heuristics,” and nowadays they can make navigating our way through a modern world, dominated by rational choice, quite dicey.
Take a quick one that we all hear our parents say when we’re kids: “Know what you are looking for.” It’s easy to project how helpful that would have been a hundred thousand years ago, when the difficulty of getting food was such that being sidetracked even momentarily could rapidly become total failure or death. But today, when that primitive tic rears its ancient bias, it more likely means we miss all kinds of new opportunities. In fact, many of these ancient heuristics have survived the eons to form a kind of distortion field through which we perceive the universe. And it’s only by looking at Kennewick through such a mirror that one can see anything like a wandering Caucasoid.
Take the most basic notion known as the fundamental attribution error: We are ourselves looking out at the world. That sounds fine, but it has serious and often unavoidable repercussions. When it comes to our own actions, we easily comprehend the state of affairs around us—the situation—because we are trapped in our bodies moving through time. But when we watch other people, we don’t see their situation; rather, we see their bodies as discrete actors in the world. So we are much more likely to ascribe menacing personal motives to another’s actions (“That guy who didn’t do his homework is lazy”), while we are very understanding when judging ourselves (“I didn’t do mine because of a family crisis”), sometimes extremely understanding (“My dog ate my homework”).
I once hung out with an abortion doctor in the Dakotas as he went about his rounds. He told me that pro-life women are no less likely to have abortions than pro-choice women. He said he sometimes found the protesters in one town showing up as patients in the next town (and after the abortion they would go right back to hurling insults at him on the street). But when queried, the pro-life woman would explain away her own choice to abort, saying that her circumstances were unique and one needed to understand the pressures she was under. The other women having abortions? They were baby killers.
Experiments confirm this tendency in every human endeavor. According to Tom Gilovich, the author of How We Know What Isn’t So, 25 percent of “college students believe they are in the top 1% in terms of their ability to get along with others.” It’s everybody else who’s the asshole. According to the Journal of the American Medical Association, 85 percent of medical students believe politicians are behaving unethically when they accept gifts from lobbyists. But only about half as many medical students—46 percent—think it’s unacceptable for doctors to receive similar goodies from drug companies. We can trust our kind, sort of, but definitely not the other kind. There are hundreds of studies yielding the same type of statistic (another medical study found that young doctors believe that 84 percent of their colleagues might be corrupted by pharmaceutical companies’ freebies, but only 16 percent thought they personally were vulnerable).
We can excuse ourselves, literally, because we see so many legitimate excuses in front of us. Other people? Liars, baby killers, thieves. So are the Native Americans politically correct tools of the federal government? Are the scientists opportunistic liars relying on hokum to make an end run around the law? If you’re on the other side, absolutely.
We naturally and easily create a world of order out of events that if examined more closely have other causes or, often, no discernible causes at all. Our ability to craft meaning out of non-meaning is impressive and no doubt has been fairly useful these last two hundred thousand years. But our view of reality, like everything, is not necessarily the best possible view, or even the “real” view—just the one that got us through to right now. The fact is that we see the world from inside this distortion field, and the more researchers study it, the more we learn just how twisted and tenacious it is.
These perceptual flaws now have many names, are being studied continuously, and have generated mountains of papers. The taxonomy of our flawed selves is an explosive and growing field and is beginning to penetrate the world outside the lab. Many people have heard of the confirmation bias—the tendency to sort through evidence to confirm what we already know. That one has practically entered the common culture. Most days, it would appear that the Internet is little more than an exhausting orgy of confirmation bias.
There is a kingdom of graduate students and their notable mentors devising experiments to further understand dozens of fabulously named quirks: the Von Restorff Effect, the Status Quo Bias, Loss Aversion, the Semmelweis Reflex, the Déformation professionnelle, the Clustering Illusion, the Hawthorne Effect, the Ludic Fallacy, the Ostrich Effect, the Reminiscence Bump, Subjective Validation, the Texas Sharpshooter Fallacy, the Barnum Effect, Illusory Superiority, the Just-World Phenomenon, the Lake Wobegon Effect, the Out-group Homogeneity Bias, the Availability Heuristic or the Informational Cascade.
One of the most important biases is called anchoring, the cognitive bias that tends to make most of us always lean toward the first notion we were exposed to. Scientists have discovered that we “anchor and adjust” our beliefs. In other words, we can never really cut off our relationship to that first big impression.
The most famous experiment is simple yet mind-boggling. Say I get people to spin a wheel imprinted with two numbers—15 and 65—and it lands on 15. Then I ask a completely unrelated question—How many African nations are members of the United Nations? Most will cluster their answers around the number in the spin. Crazy, but true. That line you heard from your mom about “always make a good first impression” is not only true but a kind of classic heuristic—i.e., a short nuggetlike axiom that long ago worked well for us but nowadays can lead us into a forest of nonsense. The anchoring tendency is so strong that business schools teach it as a fundamental exercise in negotiation theory. Always be the first to state a number in a salary negotiation. Why? Because the final number will, more often than not, cluster around the first number uttered.
With Kennewick, the anchor was that first racial utterance, a work of periphrastic art: “Caucasoid-like.” We can discuss Kennewick all day long, but every conversation veers back to some aspect of this issue—whether he is or is not Caucasoid-like.
Humans are wired to see things even when they aren’t there. This accounts for so many routine sightings of the Virgin Mary and Jesus and even Michael Jackson on toast, in the bark of trees, or in a photo of spaghetti. These sightings might sound ridiculous, but they are great examples of how the brain fills in the story we want to tell (or picture we want to see). Brain scientists will tell you that the medium for such appearances must always be grainy—like toast, tree bark, or a photo of smeared spaghetti sauce. A blurred medium will activate the portion of the brain that fills out a pattern into whatever the brain wants to see confirmed. In fact, often if you can’t see the image, then squinting helps. Depriving the eye of the true specifics of the image allows the brain to fill in the image with its preconceptions, and there it is (like the blurry images of Bigfoot and the ivory-billed woodpecker). Has anyone ever wondered why, in a world where my local ice-cream parlor can print a high-resolution image of my daughter’s face into the icing of a chocolate cake, the Creator hasn’t updated his tree bark appearances past the daguerreotype phase?
In the Kennewick controversy, this tendency to see Jesus in toast, technically known as pareidolia, is what explains the Solutrean Hypothesis. Only the theory’s most devoted zealots see similarities between the Solutrean laurel-leaf point and the Clovis point. Only those who most desire it can see in these few bits of stone an entire land-based culture that could have turned into sailors without any evidence: maritime Homo sapiens who left the countryside of Europe and managed to adapt overnight to Inuit-style living, camping on ice and fishing along the kelp highway. Even though the Solutreans disappeared some four thousand years before the appearance of Clovis, and even though they left no redundant evidence behind, if you look at these dissimilar stone tools, you can see their entire voyage right there in the flutes of the Clovis points.
But only if you squint.
The language used to describe Kennewick is thoroughly infected with many of these biases. One of the most powerful is called the self-esteem bias. That is, we more eagerly see things that flatter us than those that don’t. Putting together a skull and nudging it a few millimeters here and there to make it more possible to see a “European” shape is a perfect example of the self-esteem bias on the part of white researchers.
Since there were a number of different ways to assemble the skull and one of them trended closer toward confirming what these researchers deeply wanted to see in the skull, the skilled scientist would certainly set up an experiment to work around this obvious tendency. If Chatters had sent precise molds of the skull to five different anthropologists and asked them to “assemble” it without telling them the age or the location of the find and then asked them to explain what one might surmise from the skull—then you would have had an experiment and possibly a clear-eyed view of the skull. Instead, putting it together yourself and then declaring that it just so happens to confirm what it is you so deeply long to see would make any cognitive scientist throw up her hands in despair.
The Kennewick court case itself is a classic example of another bias known as the Endowment Effect. Our ability to unconsciously create value for an object we are holding (or wish to hold) is impressive. A famous experiment demonstrating this effect involved handing out free coffee mugs to half the people in a room. When the new owners were later asked to sell their mugs to the others, they demanded roughly twice what the buyers were willing to pay. Simply holding the mug had made it more valuable.
Because everyone was struggling to retain control of the skull and bones, they not only had to be valuable; the struggle itself convinced people that they had to be valuable in other ways, too. Of course the skeleton had to be unique proof of a European presence prior to paleo-Indians. Why else were the Indians fighting so hard to take possession of it?
Priming is the other cognitive bias that overwhelmed the popular media in this story from the beginning. For instance, if I asked you to think about your grandfather’s death and then asked you to categorize words as “negative” or “positive” as I read off “happy,” “singing,” and “crying,” you would more quickly categorize the word “crying” as negative because I had already primed your mind to be on the alert for negative things. This happens in all kinds of ways. But few of them are as textbook perfect as handing a reconstruction artist a skull with the explicit observation that you think the skull bears an uncanny resemblance to Patrick Stewart of Star Trek.
Two other errors make the case for Kennewick look absolutely solid. The Texas Sharpshooter Fallacy takes its name from someone shooting up the side of a barn before drawing a circle around the most clustered shots and then bragging about his bull’s-eye aim. For Kennewick, it’s roughly like finding a bone needle leagues away from the skeleton and concluding that the “explorer/trapper” must have worn “tailored” pants.
Most people believe that we are born into a world of illusion but grow out of it as adults. When we are kids, sure, we might believe the explanations of the Just So Stories, or that little men live in TV sets, or that tiny fairies dwell in a realm beneath the toadstools. But then a time comes when we matriculate to a view of the world that’s more sophisticated. Culturally, we mark this coming of age in certain ways—the revelation of Santa Claus or the outing of the tooth fairy.
And then we are welcomed into the Cartesian world of adulthood, where we foolishly think we have entered a realm of logic and rational choice, a place where individuals make reasoned judgments about the world around them. What scientists are showing us is that while the common adult view of reality might be more empirically precise than a five-year-old’s—it’s not as precise as we want to believe. Academics have a name for the sloppy habits most adults have in their way of knowing the world, their epistemology. They call it a “makes sense epistemology.” That is, most of us, once we determine a cause for something that “makes sense,” rarely take the next step of a scientist—expose that idea to a test of some kind to see if we’re off.
Academics have numerous ways of trying to look around our flawed biases. Regression analysis is a form of statistics that uses large collections of data regarding many individuals’ actions to reveal the true movement of our hive, rather than relying on the august sentiments of elders. Now the Internet has organically developed several ways in which group dynamics perform a similar function.
The wiki—a technological platform that allows for a collective narrative to be written—has revealed all kinds of new or faster truths. It, too, has been derided as an assault on the very book of elder wisdom (Encyclopedia Britannica). Another more recent invention is the betting market. It turns out that creating a place where people with inside knowledge about events can win money by betting on that knowledge (think of the Iowa Electronic Markets, Intrade, NewsFutures) is another brilliant way to see past our prejudices and reveal the kinds of knowledge typically kept out of view. The attempt by the Bush administration to create a terrorism market—where terrorists could make money by revealing the most likely next targets—was canceled when people were offended by the possibility of rewarding terrorists in any way, even though the end result might be advance warning of another hit.
The oldest method to shake us out of our conceived universe is laughter. Needless to say, this has been studied! Solemnity and gravitas, while looking great on the face of an ancient professor, turn out to be a form of intellectual prison. Let’s go to the experiment: Give someone a corkboard on a wall, a box of thumbtacks, and a candle—then tell them to fasten the candle to the board. Overwhelmingly, most people will try to tack the candle to the board or light the candle and use hot wax to affix it. But neither works. Now show a similar group a Laurel and Hardy movie before the assignment, and creativity increases. Many of them will empty the tack box, pin the box to the board, and put the candle in it.
Other studies have confirmed just how deeply solemnity (and its partner, overconfidence in one’s knowledge) is related to being correct in one’s views. But it’s an inverse relationship. The more confident one is in one’s views, the more likely one is to be flat wrong. An in-depth survey of pundits on television charted two elements of their presentation—their accuracy in prediction and the display of confidence in their opinions. Perhaps it will come as no surprise that survival in the pundit mosh pit on television is linked directly to the pundit’s level of blowhardiness. The more absolutely certain a pundit was in voicing a view, however, the more likely that opinion was found to be wrong. All pundits, in this way, bear a strong resemblance to Michael Gary Scott of The Office. Yet, all that said, the bubble of television information thrives on the “confidence bias”—our own flawed preference for blustery self-assurance in the present tense rather than spot-on accuracy down the road.
All these cognitive biases, from the fundamental attribution error to the confidence bias, come together at the end of this story in what’s known as an informational cascade. Typically the term describes how the same choice repeated by others just bandwagons without anyone pausing to make an independent judgment. In the Kennewick cascade, though, there were tiny tweaks all along the way—from the assembly of the skull to the detonation of the word-esque substance “Caucasoid-like” to the numerous stories about Kennewick’s “family” fleeing the savage “hordes.” The accumulation of errors gathered and increased, forming a cascade of faux evidence that for many, many people constituted a perfect proof.
Despite all the distortion involved in trying to see the world for what it is and in creating new ideas that are real enough to be repeated by others, there does emerge a set of rules from the best amateur pursuits. First, start at the beginning. All the assumptions of even the best experts are infected with their own prejudices and biases. If you are Steve Jobs in a garage in Los Altos in 1976, then you don’t need to know or listen to the wisdom of, say, IBM chief Watson, who once cockily said: “I think there is a world market for about five computers.”
Second, enter your literal or metaphorical garage in a sense of play. It almost doesn’t bear saying: The garage is a place of play, both when we are kids and as middle-aged grown-ups desperate to escape the bills and solemnity and tedium of “the house.” The garage is an outpost of joy, love, and freedom, which is why it long ago achieved mythic status as the fountainhead of amateur American creativity. But it’s that playful, supple state of mind that’s key. Why else do corporations spend so much time putting their executives on six-person bicycles or sending them off on retreats to smash the tedium of familiar thinking? Getting people into a state of playfulness is almost impossible. Amateurs enjoy the luxury of starting there.
Finally, there has to be an outside world of peers that you connect to who can keep you from getting sidetracked by your own or your culture’s biases. Scientists operating at the professional level do this through peer review. Amateurs can accomplish the same by joining weekend hobbyist groups, like the old robot clubs, where folks show off their latest creations and get critiques from friendly peers who want to make it better. Or perhaps you join a newsletter or subscribe to Make magazine or sign on with a DIY group or contribute along with others to a wiki devoted to your pursuit.
However you get there—all the way back to the beginning of an idea, with crushing solemnity banished and a small-scale community to keep you honest—you have to get there. Otherwise, you may find yourself looking at the Rashomon shape of a skull and seeing an itinerant European wandering the estuaries of the Pacific Northwest.
The question of just when we became human gets answered in our popular press all the time. Was it when we assembled the first rudimentary tool kit or grunted out the few phonemes of complex language? Was it when we made those paintings in Altamira and Lascaux, or when we left off being knuckle-dragging ape-like critters and stood up? Was the aquatic ape somehow involved? It’s one of those lines that doesn’t exist as a moment in time, but as an idea it does exist, and various scientists routinely make claims. Not long ago, a British scholar named Jonathan Kingdon laid out a new theory—about why we stood up—in his book Lowly Origin.
“Standing up” has been a particularly fertile field for this kind of musing, with theories ranging from cooling off to intimidating other species to freeing the hands. I’d always heard that we abandoned squatting because we wanted to see over the top of the grass on the African savannahs. One early 1980s theory was that standing evolved for “phallic display directed at females.” (Were this the case, every creature in nature, down to the amoeba, would stand, and the great outdoors would be a very animated place.)
Kingdon plods through a different argument. It’s dense and slow. Standing up, he says, probably had a lot to do with getting food and happened in undramatic stages, first by straightening the back while squatting and later extending the legs—all of this happening over vast swaths of time in tiny incremental stages. As theories go, that’s not nearly as fun as “seeing over the grass,” but it has the ring of truth to it, a ring that, let’s face it, never will endear such an idea to writers of newsweekly cover lines or green-lighters of movies of the week. Which is also why you’ve never heard of Jonathan Kingdon.
Scientists like to invoke Occam’s Razor, the principle that the simplest explanation is often the most truthful. The principle takes its name from a medieval friar, but it earned its keep during the Age of Reason, when logical thought was trying to cut through the intellectual encrustation accrued after millennia of seeing nature through both Holy Scripture and the blowhardiness of intellectuals trying to impress one another with their sesquipedalianismo.
These days we have a different, almost opposite problem. Pop thinkers tend to oversimplify in a way meant to attract attention. The first time I ever got a whiff of this was when I was a teenager reading Desmond Morris’s book The Naked Ape. Morris theorized that the reason human females had big breasts (as opposed to the tiny sagging dugs of other primates) was because we had discovered love. In doing so, we switched from copulating doggie style to the more romantic missionary position. But all those millennia of looking at the round globes of the female’s buttocks from behind had also developed into the image stimulus required for the maintenance of erections during intercourse. Morris argued that the male still needed large rounded visual cues so, according to the rules of Darwin, we were rewarded with great big hooters.
Even as a kid, I remember thinking, Excellent, but really? Morris’s simplicity makes monstrous assumptions that just so happen to yield a theory pre-edited for the short, punchy demands of modern mass media. A hook, if you will. (Not that it didn’t work: Thirty years after reading that book, the only detail I can remember is the boob theory.) Morris’s theory has little to do with truth and everything to do with selling books. Perhaps it’s time to set aside Occam’s Razor and pick up Morris’s Razor, which shuns any theory that might excite a cable television producer while simultaneously elevating the plodding theory that makes a kind of dull, honest sense.
Apply Morris’s Razor to Kennewick Man and here’s what you might get: Chances are Adovasio and his colleagues are right about the basic assertion of an ancient arrival of Homo sapiens to this continent. For instance, the archaeological record in Australia is redundant with proof that aboriginals arrived there at least fifty thousand years ago. That journey would have required boating some eighty miles, many believe. So there’s nothing extraordinary about there possibly being multiple entries to the American continent, with at least one crew, probably Asians like the Ainu, lugging their haplogroup X into North America some twenty to thirty thousand years ago, giving them plenty of time to leave some pre-Clovis fossils.
Sure.
That’s one story, a very Kingdon-like theory, all very probable but not a very good cable special or science magazine cover story. The Morris Razor, though, discards the other bits where the First American is of an ancient tribe (that just happens to physically resemble the very scientists making the claim) whose sad end came after a genocidal campaign between superior but outnumbered Caucasoids and hordes of Mongoloid “Stone Age peoples.” This epic extrapolation is drawn from one single Cascade point, a leap about as likely as a Martian anthropologist staring at a scrap of gray wool, an Enfield bullet, and a dinged canteen and then successfully imagining the states’ rights debate of the Civil War.
The same Martian anthropologist might also quarrel with the view that the Kennewick battle is a latter-day clash between science and religion—the Indians with their childishly mythic stories of origin and the scientists with their lithics and their scientific dates, 8700 ± 50 years. In an editorial a while back, the Seattle Times captured half the fight perfectly. Kennewick had “held onto his secrets for more than 9,000 years and now, finally, scientists will get a chance to be his voice.”
Why assume the scientists’ narrative in this case is closer to the empirical truth? In fact, if you know the history of archaeology, you know that there are times when one can find more objective, hard factual truth in the local oral narratives than in the scientists’ analysis. This may well be one of those times. The Indians make the argument that their creationist stories are the truth that they believe in. Every culture had its founding stories. Those myths can sometimes be decoded to reveal the nuggets of ancient journalistic truth that set them in play, just as Helge Ingstad’s devotion to finding the truth buried in the Viking sagas eventually led to the confirming archaeological digs at L’Anse aux Meadows in Newfoundland.
There are several Indian creation stories about coming out of ice. The Paiute tell one that ends this way:
Ice had formed ahead of them, and it reached all the way to the sky. The people could not cross it.… A Raven flew up and struck the ice and cracked it. Coyote said, “These small people can’t get across the ice.” Another Raven flew up again and cracked the ice again. Coyote said, “Try again, try again.” Raven flew up again and broke the ice. The people ran across.
Many Native American origin accounts involve coming out of ice, which certainly fits into all the theories of America’s human origins. So why aren’t these stories studied the way Ingstad examined his own sagas? Why is the benefit of the doubt given to the scientists’ story? It’s quite possible that every objective fact that went into the telling of this new “scientific” pre-Clovis story is not true at all—only a factoidlike projection of racial anxiety—and that the result is more “mythic” than the creation story the Indians are telling.
Part of the problem of reading either of these stories is that we no longer have a capacity to appreciate the real power of myth. Most of us are reared to think of myth as an anthology of dead stories of some long-ago culture: Edith Hamilton making bedtime stories out of Greek myths; Richard Wagner making art out of Norse myth; fundamentalist Christians making trouble out of Scripture.
When we read ancient stories or Holy Writ or founding epics, we forget that the original audience who heard these accounts did not differentiate between mythic and factual storytelling. Nor did these stories have authors, as we conceive them. Stories arose from the collective culture, accrued a kind of truth over time. For that reason alone, they were sacred and had real power to move people. Belief is what keeps any tribe together.
Today we’ve split storytelling into two modes—fiction and nonfiction. And we’ve split our reading that way as well.
The idea of the lone author writing truth has completely vanquished the other side of storytelling—the collectively conjured account. I think we still have these stories, but we just don’t recognize them for what they are. Tiny anxieties show up as urban legends. In the late 1980s when the queasily mortal idea of organ donation was infiltrating the social mainstream, suddenly one heard an author-less story of a man waking up in a Times Square flat after a night of partying to find a stitched wound on his lower back and his kidney missing.
Enduring myth can be based on fact, as in Ingstad’s case. But often the collective account needs no factual basis, just a mild apprehension that the world is not quite what it seems. No one has ever found a razor blade in an apple at Halloween, nor has any doctor treated anyone for gerbilling. Bill Gates is not giving away computers, and the sewers of New York are gator-free. The story of the Ancient European One is this kind of story, toggling back and forth between the world of fiction and nonfiction, authored by a few curious facts and the collective anxiety of the majority.
Because we no longer read mythological stories, we no longer appreciate their immense power. We find ourselves stunned at how something so many deeply long to be true will simply assemble itself into fact right before our eyes. The scientists who eventually won control of Kennewick’s bones have been studying them now for ten years. What have they learned? Well, they don’t like to talk about it much. The only new idea that has been made public is an analysis of levels of mineral deposits in the bones, suggesting Kennewick was buried intentionally. Great.
More recently, Chatters has reverted to his incoherent ways, happily agreeing that the anachronistic word “European” aptly describes the skeleton. I’ve heard him celebrate “Solutrean Pride” and cheerfully joke with racist radio hosts who sneeringly refer to Native Americans as “Beringians” and guffaw at declaring February to be “Solutrean History Month.” The scientists have discovered almost nothing of Kennewick, but the growing band of amateurs they set loose have conjured a new and powerful creation myth. And if they profoundly long to believe that men of Caucasoid extraction toured here sixteen thousand years ago in Savile Row suits, ate gourmet cuisine, and explored the Pacific Northwest with their intact pre-Christianized families until the marauding horde of war-whooping Mongoloid injuns came descending pell-mell from their tribal haunts to drive Cascade points into European hips until they fell, one after another, in the earliest and most pitiful campaign of ethnic cleansing, then that is what science will painstakingly prove, that is what the high courts will evenhandedly affirm, and that is what in time ever more amateurs will happily come to believe.