Somewhere in middle age I returned to the quest, or, in its stripped-down version, the question of what exactly happened there when I was seventeen. In the midst of so much that was grown-up and responsible—deadlines, campaigns, movements, scholarly undertakings, motherhood—some crucial late-night part of the ongoing mental churning regressed back to the events of my adolescence, which were just too strange, and I wish there were a better word, to be permanently buried under the label of “mental illness” or some kind of temporary perceptual slippage. “If you see something, say something,” as we are urged in train stations today, and certainly I had seen something. Yes, it was something inexplicable and anomalous, something that seemed, in a way I could not define, to be almost alive. But this had also been the case with the oscillations at the silicon electrode, and it was still my responsibility to report them and bear the shame, if necessary, of bringing unwelcome, perhaps even incomprehensible news.

The circumstances of my return were not auspicious. The movement that had sustained me for more than a decade was crumbling under my feet. As my former comrades drifted away, to careers or, in a few cases, cults or prison, I could no longer imagine myself as a warrior. I was at best a soldier, sticking doggedly to the project of “social change” even when that meant serving in the most tediously compromised fragments of the left, where the idea was no longer to ignite the “masses” but to flatter, and thereby hopefully influence, people who were more influential than we were. More and more of my time was devoted to the feminist movement, but it too was often mired in useless discussion, such as attempts to determine our “principles of unity.” I got through the long meetings—often weekend-long meetings in windowless conference rooms—by trying to work out the prime numbers up to 200.

Meanwhile, my father succumbed to Alzheimer’s disease, which replaced that brilliant and complicated man with a partially melted wax effigy whose speech was increasingly limited to random word-sounds and chirps. Or maybe it was the nursing homes, as well as the Alzheimer’s, that worked this change in him, because if you take away all printed matter, occupation, and conversation, you will eventually get someone who kneels by the toilet to stir his feces with a plastic spoon, or so it has been my misfortune to observe.

Every few weeks I would fly to Denver to visit him, each trip a journey into the unbearable ugliness that humans manage to secrete around themselves out of plastic and metal and short-haired, easy-to-clean carpeting material: the airport, the interior of the airplane, the corporate chain hotel, predictable down to the free cookie offered to each arriving guest. And then there was the nursing home itself. Was I depressed because my father was dying or already dead, depending on how you evaluate these things, or because I had to spend so many hours in a place that made death seem like the best remaining option, if not something to be urgently desired? Moss green and salmon, no doubt thought to be an inoffensive, gender-neutral combination, made up the color scheme, except for the posters serving to remind us of the season—lambs and flowers in the spring, pumpkins in the fall, snowflakes in their proper time. And we needed those reminders, because fresh air was not allowed inside the nursing home, only air freshener.

Earlier, when things were going well, when the movement was thriving and before my father became a shell or my children turned into teenagers, bent on individuating themselves from me as if I represented a potentially contagious condition, I could handle a world without transcendence or even the memory of transcendence. But in addition to everything else, my second husband, who I can say in retrospect was the big love of my life, got eaten alive by his sixteen-hour-a-day job as a union organizer and began to act like one possessed. I had been his eager helpmate—marching in picket lines, going to organizing meetings, welcoming insurgent truck drivers, factory workers, and janitors into our appropriately modest home—but now he was too preoccupied to reliably hear me when I spoke or notice when I entered a room. With my human environment falling apart, the repressed began its inevitable return.

The story of the years leading up to my return to the old questions can best be summarized as a series of measurements and chemical assays: the increase in amyloid protein in my father’s brain and the corresponding decline of serotonin in my own; the uncertain tides of estrogen and oxytocin, the diurnal rise and fall of blood alcohol, caffeine, and sugar levels. Simmer all this together for many months and you get a potent toxin, which seemed to come at me in waves. I can remember the luncheon hosted by a countywide women’s organization, probably on a Saturday afternoon in late February. All very jolly and heartwarming, until I happened to look out the window of the Long Island catering hall where we were gathered and saw the imminent menace. There was a gas station and an intersection under a pearly gray sky peppered with factory emissions; there was a parking lot littered with the black remains of snow; there was no hope. The award ceremony itself was a mockery, because anyone could see that the people presiding over it were dying right in front of us, if not actually dead, and that rigor mortis had already hardened their smiles into grimaces. And I was not a passive or reluctant participant in this event. I was the one who got the award.

The name for my condition, I discovered, was “depression,” which I learned from a 1989 op-ed column William Styron wrote about Primo Levi’s suicide. What surprised me was the term itself, which seemed far too languid for the condition—an insipid “wimp of a word,” as Styron himself put it, for “a dreadful and raging disease.” I went along with the diagnosis, therapy, and medications, but not without internal reservations. You can talk about depression as a “chemical imbalance” all you want, but it presents itself as an external antagonist—a “demon,” a “beast,” or a “black dog,” as Samuel Johnson called it. It could pounce at any time, even in the most innocuous setting, like that award luncheon or the parking lot where I waited one evening to pick up my daughter from a school trip. What if her school bus failed to return? What if it had crashed somewhere? Even when I had her safely in the car, it was all I could do to get us home and rush into the bathroom for a fit of gagging and trembling over how close the beast was getting.

It was despair that pulled me back, as a mature adult, to the ancient, childish quest. I could not go on the way I had been, dragging the huge weight of my unfinished project. The constant vigilance imposed by motherhood, along with the pressure to get assignments and meet deadlines, had trapped me in the world of consensual reality—the accepted symbols and meanings, the highways and malls, meetings and conferences, supermarkets and school functions. I seemed to have lost the ability to dissociate, to look beneath the surface and ask the old question, which is, in the simplest terms, What is actually going on here?

Or maybe depression in its demon form awoke me to the long-buried possibility that there exist other beings, agents, forces than those that are visible and agreed upon. I wish I could draw some clear lines of causality here, but there are no primary sources to refer to, no journal or even any random notes to my future self.

But when I did try to return to the old questions, very furtively of course, despair and a kind of shame followed me and blocked the way. The impasse was this: If I let myself speculate even tentatively about that something, if I acknowledged the possibility of a nonhuman agent or agents, some mysterious Other, intervening in my life, could I still call myself an atheist? In my public life as a writer and a speaker, I had always been a reliably “out” atheist. This was my parents’ legacy, and a deeper part of my identity than incidentals like nationality or even class. At some point in the eighties I published an essay-length history of American atheism that unearthed the stream of working-class atheism from which I was descended. I won awards and recognition from organizations of “freethinkers” and humanists. When the subject came up, which it was bound to in our largely Catholic blue-collar neighborhood, I told my children that there is no God, no good and loving God anyway, which is why we humans have to do our best to help and care for each other. Morality, as far as I could see, originates in atheism and the realization that no higher power is coming along to feed the hungry or lift the fallen. Mercy is left entirely to us.

I was no longer the kind of scornful, dogmatic atheist my parents had been. When I read the book of Matthew closely in my forties I was startled by the mad generosity Jesus recommends: Abandon all material possessions; give all you have to the poor; if a man sues you for your coat, give him your cloak as well; and so forth. If you’re going to help the suffering underdog, why not go all the way? But then, as the Bible drones on and Jesus fades away to be replaced by “the risen Christ” holding out the promise of immortal life, the message takes on a nasty, selfish edge. How can you smugly accept your seat in heaven when others, including probably some of the ones you love, are confined to eternal torment? The only “Christian” thing to do is to give up your promised spot in heaven to some poor sinner and take his place in hell. Far easier, it seemed to me, to profess atheism and accept the moral obligations a Godless world entails.

As for the mortality that atheism leaves you no escape from—do not for a moment imagine that this was the source of my depression. I was old enough, in my forties, to sense the beginnings of decline, first announced by backaches and the need for reading glasses. What I feared was something more suitable to a depressive: the unthinkable possibility of not being able to die. Suppose that my brain had been excised by evil scientists and was being kept alive in a tissue culture medium, then subjected to electrical shocks that varied ingeniously so that my mind could never become habituated to the pain, and that this could be done for centuries, millennia, forever. Or that my body had survived some catastrophic disease, leaving me in a “locked in” condition, unable to move or communicate. If you can imagine these states, then you know that a kindly god would not promise “eternal life.” He or she would offer us instead the certainty of death, the assurance that somehow, eventually, the pain will come to an end. Why believers should forgo this comforting certainty, which is so readily available to atheists, is a mystery to me.

My activism required me to be tolerant, to incline my head a little when others bowed theirs, but all too often I was more challenging on the issue than courtesy allowed, once even picking a fight with a local liberal minister. He was trying to reassure me that his vague denomination had no active involvement with God himself and remained fairly open on the question. That wasn’t good enough for me, though. I insisted that the appropriate stance toward an omnipotent God, even the possibility of an omnipotent God, should be hatred and opposition for all the misery he allowed or instigated. Another time, I disrupted the happy revivalist vibe at a conference held in a black church because I was tired of hearing the clergymen who were my copanelists exult in the unifying power of Jesus. I pointed out the number of women in the audience wearing head scarves, guessed at the number of Jews in attendance, and announced that I, an invited panelist, was an atheist by family tradition. Somehow I even managed to profess my atheism to an audience of striking janitors in Miami, all Hispanic and presumably Catholic or Pentecostal, to the irritation of the union officials who had invited me.

It wasn’t just family loyalty that held me back from potentially heretical speculations. The whole project of science, as I had first understood it way back in high school, is to crush any notion of powerful nonhuman Others, to establish that there are no conscious, subjective beings other than ourselves—no spirits, demons, or gods. An individual scientist may practice her ancestral religion with an apparently clear conscience, but once at work, her job is to track down and strangle any notion of nonhuman or superhuman “agents”—that being the general term for beings that can move or initiate action on their own. Thus, for example, the oscillations at the silicon electrode could not be the work of some malign creature lurking unseen in my lab: That was exactly the possibility that had to be eliminated, if only I could have found a way to do so.

The same impulse drives me today. If you hypothesize that certain strange noises in the house are produced by ghosts or poltergeists, I will tear the walls down, if necessary, to prove you wrong. Human freedom, knowledge, and, let’s be honest, mastery all depend on shooing out the ghosts and spirits. The central habitat of spirits in our culture is religion, with the excess population flowing over into New Age spirituality, and nothing has ever happened in my adult life to incline me favorably toward either.

At some point, close to what seemed to be the nadir of depression, I began to dig myself out, using tools that, I now realize, had always been at hand. I was by this time not only a journalist churning out weekly eight-hundred- to thousand-word columns and essays on topical matters, but an amateur historian. The short pieces entertained (and financially supported) me, while the longer historical excursions fed my mind, or rather the insatiable little creature within it that was always demanding fresh questions and fresh clues. Lab work had starved my intellect, but the form of science I turned to now, “social science,” which requires no glassware or equipment, opened up a feeding frenzy. At first I wrote books on relatively manageable issues related to class and gender in American society, and then, realizing I had nothing to lose, turned to much larger issues—too large, in fact, for any legitimate social scientist—like religion and war.

I had no reason to think that my new research interests had anything to do with the old metaphysical quest. Anything I dignified in my mind as “work” was about “politics,” in the broadest sense, and social responsibility—good, rational, mature concerns that could be justified by my activist involvements and concern for my children’s future. But my intellectual agenda was hardly just a matter of rational, liberal decision-making. I had not come out of solipsism into a world of gemütlichkeit and good cheer. To acknowledge the existence of other people is also to acknowledge that they are not reliable sources of safety or comfort.

Metaphorically, you could describe the situation this way: I am adrift at sea for years clinging to a piece of flotsam or wreckage, alone and prepared to die. Then I get rescued by a passing lifeboat, packed with people who pull me in and give me food and water. But just as I am rejoicing in the human company, I begin to notice that there is something not quite right about my new community. I detect uneasiness and evasion in their daily interactions. There are screams and groans at night. Sometimes in the morning I notice that our numbers have shrunk, though no one comments on the missing. I have to know what is going on, if only for my own survival. Hence the frantic turn to history: If these are my people and this is my community, I need to know what evil is tearing away at it, where the cruelty is coming from.

I started my study of war and human violence with what I took to be a manageable hypothesis, based on many months of reading, but since I was untrained in any formal or official way, my research method was sheer mania: no stone unturned, no clue left hanging, no disciplinary barrier unbreached. I went from history to literature and classics, I immersed myself in ancient epics, and when anthropology seemed more relevant, I went there, and on to paleontology, archeology, psychology, whatever beckoned. Ah, the joy of libraries after so many years of laboratories! I cannot say that this new phase of research cured my depression, but I learned I could keep it at bay by clinging to the mystery I was trying to solve as if it were an amulet: Get up and make notes on the books that you have, reflect on these notes and order more books, get up again, revise the hypothesis, and figure out a new plan of action. Repeat, making sure to leave no cracks open through which the gray fog of depression can penetrate.

I tested the limits of interlibrary loans from the local public library, and sometimes the patience of the librarians. I got access to a few major university libraries, where I could wander in the stacks, following whatever bat-crazy line of thought turned up. I was in the NYU library, on some kind of paleontological trail that afternoon, when I came across the book that launched a decade of obsession. It was not the book I was looking for, just shelved near it, but the title, The Hunters or the Hunted?, lured me in, never mind the esoteric subtitle, An Introduction to African Cave Taphonomy, by the South African paleontologist C. K. Brain. The import of the book, which I absorbed in a single sitting, was that you could not understand anything about human violence—war, for example—without understanding that before they were warriors, or even hunters, our ancestors were the prey of more skillful and far better armed nonhuman predators.

Taphonomy is the study of fossilization, and the remains in question were the skulls of early humans, or hominids, found in an African cave. Puncture marks in the skulls had suggested to evolutionary biologists that the hominids had died violently at the hands of their fellow protohumans—an early case of murder, if not actual war. But then Brain came along and determined that the distance between the puncture marks precisely fit the gap between the lower canines, or stabbing teeth, of ancient leopards. I could see no daylight from my desk there in the stacks, no human faces, only the nightmare past recalled, through some inexplicable Jungian mechanism, in my childhood fear of lions. Humans had not written their own history and prehistory by themselves, with of course the collaboration of climate and terrain. Our evolution, and even to an extent our history, were also shaped by encounters with dangerous nonhuman animals, especially the larger carnivores, to whom our ancestors were little more than meat. The conventional narrative of unbroken human dominion over the earth and its creatures had managed to leave out some of the central players.

It took a while for me to grasp the metaphysical import of the animals that began to populate my imagination, my notebooks, and eventually my book Blood Rites: Origins and History of the Passions of War. Here were the Others, or some of them anyway, whose existence science had tried so hard to deny: conscious, autonomous beings, or “agents” in the largest sense, very different from ourselves and, no doubt, from one another. They were all around us and they always had been. The scientific notion that humans are the only conscious beings on the planet had been an error all along, an error rooted in arrogance and provincialism. Maybe other creatures are prone to similar fallacies: ants, for example, who get so caught up in the politics of ant warfare that they ignore the occasional reports of giant, colony-crushing bipeds.

Since childhood, I had never spent much time thinking of animals in any context, whether as pets or as objects of pity. Modern urban and suburban people live for the most part in an environment devoid of wild fauna larger than squirrels, where you might even forget about their existence except for their curious prominence in children’s books and as “stuffed animal” toys. By the 1980s, science was beginning to move toward an acknowledgment of animal subjectivity and emotions, but for the most part educated humans were stuck with the Cartesian view of animals as automatons, driven entirely by instinct and reflex, which is a way of saying that they are, for all practical purposes, already dead. If I had thought anything else, how could I have cold-bloodedly vivisected so many mice in order to “harvest” their cells for my experiments?

But as I got into my late forties and fifties, improving finances made it possible to go on vacations in rural and, incidentally, fauna-rich locations. We started going to the Florida Keys, in the summer when rentals were cheap, and I was struck there by the density of large and even dangerous creatures—snakes and stingrays and especially barracuda and sharks. None of these deterred me from going in the water; in fact, I was drawn by the frisson of being a soft, edible creature among so many experienced carnivores. When I got to know a diver—not a vacationer but a man who dove professionally for a treasure-salvage operation—I pestered him for predator-related lore, learning, among other things, that it’s unwise to bleed in the water, wear sparkly earrings, or “act like a sick fish.” I taught myself how to kayak, just barely, and spent hours out in the Gulf of Mexico, finding hot, still spots on the leeward side of mangrove islands, watching out for dorsal fins, and then following—or, as I liked to put it, “hunting”—sharks. No danger in this except for one occasion, when a larger-than-usual shark whirled around at me in annoyance and made as if to ram my kayak.

My next set of vacation destinations, in the Rocky Mountains, brought me within range of more traditional terrestrial predators, chiefly bears. Bears had been a hazard to Paleolithic Europeans, as well as a source of bearskins and, some archeologists assert, iconic images featured in what may have been religious cults. I made myself into a minor expert on bear attacks, reading all the books available in tourist stores and once following up with an interview of the author by phone. The most important thing I learned, from a theoretical point of view, is that bears are not entirely predictable. One bit of lore, for example, is that grizzlies are unlikely to attack a person who is playing dead. But no one should rely on this trick, because sometimes they are attracted to what appears to be carrion. Similarly, you cannot exactly predict where a shark will show up, which could be in a mere foot or two of water where you might have thought you were safe. If animals are reflex-driven automatons, they should obey certain statistical rules, as, for example, mosquitos appear to do when they travel around in a cloud, like molecules of a poisonous gas, but what I learned through observation as well as reading is that large animals are individuals, making minute-by-minute decisions of their own.

Science has been moving in the same direction, and not only because of pressure from the animal rights movement. When observed through a lens cleaned of human vanity, more and more types of animals, many birds included, are found to reason, exhibit emotions, cooperate, use tools, and plan ahead. I had my comeuppance in the Florida Keys, where I became fascinated by the group behavior of ibises. As the sun sets, they flock to a nearby mangrove island to roost for the night; at sunrise or thereabouts they take off again for their feeding grounds, and I would try to kayak out to watch both events. But the morning liftoff can occur before or at sunrise, and it can be either messy and anarchic or a single, coordinated action involving up to a hundred birds at a time. What, I wanted to know, determined the timing and nature of the liftoff? For surely, I thought, the ibises must be responding to some factor like sunlight or temperature that signaled them when to wake up and fly. Or maybe they were awakened by the sound of fish waking up and jumping. There had to be something—right?—that was controlling their behavior.

But when I put this question to an old college friend and animal behaviorist, Jack Bradbury of Cornell University, he told me essentially that there were probably some leaders and trendsetters among the ibises, but there was also a lot of early morning jostling and nudging. In other words, within certain parameters like hunger and the need to stick together, they do pretty much what they damn well please.

Dolphins are the free-will stars of the seas. You never know when or where you’ll run into them, in what season or depth of water, and whether it will be a single one or a pod. I was out on my kayak one day when I noticed some furious splashing off to the north. Paddling to the action as fast as I could, I saw it was two dolphins playing some rough, elegant game involving alternating leaps out of the water, and when they saw me, they decided to include me in it. They’d swim alongside the kayak, then vanish under it and pop up dramatically on opposite sides with those wide dolphin grins on their faces. It would have been easy enough for them to flip the kayak over and, if they were so minded, to push me underwater until I drowned, but that was not the game they were playing that day. They fooled with me like this for about half an hour, and then zipped off to find a better player.

I described these encounters to a friend as “religious experiences,” and the deeper my studies ranged, the more apt this description seemed. If you go back far enough in history and prehistory, you find humans investing animals, especially large and sometimes dangerous animals, with a charismatic quality, a connection to the divine or at least the occult. Ancient, premonotheistic cultures worshipped animals, animal-human hybrids like Sekhmet, the lion-headed goddess of ancient Egypt, or human-shaped deities with animal familiars, like the Hindu goddess Durga, who rides a tiger. Almost every large and potent animal species—bears, bulls, lions, sharks, snakes—has been an object of human cultic veneration. Before the Christian missionaries arrived, my Celtic ancestors worshipped the goddess Epona, who often took the form of a horse. The Makah people of Washington State worship “Whale,” who provides them with both physical and spiritual sustenance. If modern people can still get a thrill, as I do, from an encounter with a large and preferably wild animal, it is because such animals once were gods—beneficiaries of sacrifice and the centerpieces of ecstatic ritual.

This posed a fresh challenge to my atheism. What does it mean to be an “atheist” if the gods could be, and once were, so numerous and diverse? I had nothing against Epona or even the death-dealing Hindu goddess Kali, and certainly no way of refuting anyone’s claim to have made contact with the Vodoun loa or Yoruba orisha in a state of trance. I realized that the theism I rejected was actually only monotheism, or the particular version of it represented by Christianity, Judaism, and Islam, in which the “one God” or “one true God” is not only singular but perfect—both omnipotent and perfectly good and loving. In the Freudian framework, the God of monotheism is a projection based on the child’s perception of reliably nurturing and powerful parents. I had no such template to build on, which may account for the scorn, expressed early in my journal, for a “parental God.”

But amoral gods, polytheistic gods, animal gods—these were all fine with me, if only because they seemed to make no promises and demand no belief. You want to know Kali or Epona? No “faith” is required, because there are, or were at one time anyway, rituals to put you directly in touch with her. Most of these rituals have been abandoned, repressed, and forgotten, but images of the old gods linger on to amaze us. I was suffering through an episode of deep depression when I got to see the giant chalk horse representing Epona carved into an English hillside. She did not cure me, of course, but I was briefly cheered to think that my ancestors had created an image so expressive of freedom and motion—assuming that all the lifting and climbing hadn’t been accomplished by slaves. A few years later I had a chance to visit the great temple of Kali in Kolkata and went down the flights of stairs to the terrifying image of a three-eyed Kali with a long, protruding tongue. She is painted in broad, bold strokes, nothing like the complex curviness I expected from Indian sacred art. She clutches a severed head. Is she good or evil? Does the question even make any sense? I respectfully left her an offering of flowers, as recommended by my Hindu companion.

To propagandists for the one true God, the rise of monotheism represents an unquestionable advance in human civilization. But it can also be seen as a process of deicide, a relentless culling of the gods and spirits until almost no one is left. First there was (and in some places, still is) animism, which anthropologists found almost universally among indigenous tribal peoples, although “religion” is a Western notion ill-suited to a worldview in which divine life pervades every single object, animal, breeze, and ray of light. Next, in more complex and hierarchical societies, the divine life that once animated everything gets aggregated into particular “spirits,” and eventually into a number of recognized, “legitimate” polytheistic deities. Monotheism is the final abstraction, leaving humankind alone in the universe with the remote and perfect “one true God.” Nonhuman animals came to be seen as “dumb” or even evil beasts, best worked to death or consumed as meat. Thus did monotheism pave the way for Descartes and the dead world of Newton’s physics.

Where did I fit into this spectrum, or parade, of theological options? Officially as an atheist, of course, the progeny of a working-class lineage that had come to see the “one true God” as a prop for human power relations and his priests as cynical parasites. They were right, these ancestors, as far as I could see, particularly the great-grandmother whose dying act was to throw off the cross that had been placed on her chest. She understood that the great, unforgivable crime of the monotheistic religions has been to encourage the conflation of authority and benevolence, of hierarchy and justice. When the pious bow down before the powerful or, in our own time, the megachurches celebrate wealth and its owners, the “good” and perfect God is just doing his job of legitimizing human elites.

But nonbelievers have mystical experiences too, and mine seemed to locate me squarely in the realm of animism. That was more or less the state of things as I encountered them in May 1959—a world that glowed and pulsed with life through all its countless manifestations, where God or gods or at least a living Presence flamed out from every object. For most of my adult life I had denied or repressed what I had seen in the mountains and desert as unverifiable and possibly psychotic. But thanks to my years of research into history, prehistory, and theology, I was intellectually prepared, maybe as recently as a decade ago, to acknowledge the possible existence of conscious beings—“gods,” spirits, extraterrestrials—that normally elude our senses, making themselves known to us only on their own whims and schedules, in the service of their own agendas. In fact, I began to think, edging to this conclusion bit by bit and with great trepidation, that I had seen one.

But what was I going to say? Not that many people were asking, but I was no longer a social isolate or solipsist. I had embraced my species and accepted the responsibilities that go with membership in it, which meant, at the very least, that I could not tell a lie. When the subject of my atheism came up in a television interview a few years ago, I said only that I did not “believe in God,” which was true as far as it went. Obviously I could not go on to say, “I don’t have to ‘believe’ in God because I know God, or some sort of god anyway.” I must have lacked conviction, because I got a call from my smart, heroically atheist aunt Marcia saying that she’d watched the show and detected the tiniest quaver of evasion in my answer.