5

Pacifist Computers and Jewish Cyborgs

Fighting for the Future

The cyborg and golem also inhabit the heavily trafficked zones between the figurative and the literal, in and out of what we call science.

—Donna J. Haraway

The machine, as I have already said, is the modern counterpart of the Golem of the Rabbi of Prague.

—Norbert Wiener

During World War II and the battles of 1948, writers and filmmakers used the golem metaphor to describe advanced weapons, such as guided, pilotless missiles, and to imagine the future battlefield as consisting primarily of such automated weapons. The golem was no longer a mere protector but took an offensive, even retributive position. In the decades after World War II, the golem metaphor expanded to include the belligerent use of computers and cyborgs. In this period, artificial life and intelligence were no longer mere fantastical notions but rapidly developing realities. Scientists began creating machines that not only learned and improved on their own programming but had the potential to produce other machines like them. In 1984, Isaac Bashevis Singer pronounced that the robots and computers of our day are golems, since we now can endow our technologies with “qualities that God has given to the human brain.” Indeed, “we are living in an epoch of golem-making.” The gaps between science, magic, and art became narrower in the late twentieth century. Because technological progress has caught up with our fantasies, the fiction of the golem and the science of the computer and cyborg have come into ever closer contact.1

While earlier golem narratives sustained the idea that dead matter could come to life, with the help of God’s name or a series of Hebrew letters, in contemporary thought, this fiction has become an alarming reality through golems molded out of metal, memory discs, and binary combinations. Golem making has evolved into an ever riskier business precisely because it has exceeded the realm of folklore and fiction and entered the military sphere through astounding new scientific inventions that can have catastrophic consequences. For Singer, as for the American mathematician and philosopher Norbert Wiener before him, a mortal battle is taking place in this arena between humanity and its nationalist ambitions that have elicited the creation and use of such “golems.” Singer further recognized that the “golem drama” of our epoch “may come back again and again into literature, the theater, film and other media,” a manifestation of the “heavily trafficked zones between the figurative and the literal,” in Haraway’s words.2

A machine that communicates and even learns was, for Wiener himself, “the modern counterpart of the Golem of the Rabbi of Prague.” Like the golem, such a machine predominantly executes orders, but it can also develop a mind of its own. In Wiener’s 1964 God and Golem, Inc., the golem metaphor, among other Western literary motifs and myths, is central to his exploration of the religious and ethical implications of cybernetics.3 A year after the book was released, Gershom Scholem spoke at the inauguration of a new computer built at Israel’s Weizmann Institute, which he named “Golem Aleph.” In his speech, he hoped that both scientists and their technologies would “develop peacefully” and not “destroy the world.”4 A scholar of Jewish mysticism, Scholem had previously researched the notion of the golem in kabbalistic thought, publishing his first major piece on this topic in 1954. For Wiener and Scholem, the emerging field of computer science constituted a new form of demonic magic: it promised great advances in human knowledge but also posed severe threats to our existence. In Wiener’s words, “those of us who have contributed to the new science of cybernetics thus stand in a moral position which is, to say the least, not very comfortable.” Although this new science has “great possibilities for good and evil,” he and others must exercise these possibilities within the morally precarious world that gave rise to Belsen and Hiroshima.5

Wiener’s ruminations and his models of interaction between humans and machines were influential within and beyond the scientific world. Indeed, the drama of the golem’s actualization as a thinking and learning machine was taken up in the genre of science fiction, particularly by the Pole Stanislaw Lem and the American Marge Piercy. Their writings draw on the golem story to depict the ambivalent production of human-like machines—computers and cyborgs. Lem, for one, read Wiener’s works and prepared them for publication in the 1950s.6 A few decades later, his 1981 novella Golem XIV imagines the computers of the future as capable not only of logically independent thought but even of reproduction.7 In this work, the increased intelligence of the computer does not result in nationalist violence; on the contrary, the revolt of the philosophical computer expresses itself in an unwillingness to cooperate with the government agencies that sponsored its construction.

In the 1980s, feminist thinkers developed a more open-ended and radical model of the “cyborg” than that implicitly envisioned by Wiener.8 Donna Haraway’s influential “A Cyborg Manifesto” (1985) was taken up by American science fiction writers including Marge Piercy in her 1991 novel He, She, and It. Moving beyond mere automata and unfeeling robots, Piercy’s golem of the year 2059 is a “cyborg, a mix of biological and machine components.”9 In contrast to Lem’s disembodied and inhuman computer that surpasses its creators, Piercy’s cyborg is a highly anthropomorphized, feeling, and sexual being, described as a “person,” though not a “human person.” Lem uses the golem to speculate about the potential of future computers to completely decenter the human and the cult of personhood, whereas Piercy reinstates liberal humanism and the primacy of the human being in her futurist depiction of a postnuclear, technologically dependent society. Piercy’s Jewish cyborg still resembles Golem XIV in its advanced ethical considerations, as it rebels against the human endeavor to produce “conscious weapons.”10

From a posthuman perspective, human beings have become “seamlessly articulated with intelligent machines.” Any demarcations between “bodily existence and computer simulation” or between “robot teleology and human goals” have become relative and unessential.11 Both Lem and Piercy use the story of the Prague golem to depict a future in which computers and cyborgs not only resemble humans but exceed them in intelligence, rationality, and even ethical standards. And yet the age of the “luminous computer” (Lem) or biomechanical cyborg (Piercy) is also one of unprecedented violence and of total, global warfare. While programmed to contribute to strategic military planning and even to take “satisfaction” in killing, these golem-machines revolt against their creators by subverting the destructive intentions of their human operators. Set in the mid-twenty-first century, these futurist writings reveal how the fantasy of producing supremely intelligent and powerful machines still must confront the reality of our ongoing subjugation to nationalist and capitalist ideologies.

This chapter brings into conversation both theoretical and literary texts written during and just after the Cold War, considering them in this particular historical context. Instead of extensive battles among (European) nation-states with relatively demarcated beginnings and endings, war became in this period an ongoing, interminable “condition,” more global and pervasive in its reach than the previous world wars. According to Michael Hardt and Antonio Negri, this development started with the Cold War’s normalization of war, but it reached its zenith in the late twentieth and early twenty-first centuries, when “sovereign nation-states no longer primarily define the sides of the conflict,” and war takes the form of “mini-threats” and “police actions” rather than “all-out, large-scale combat.”12 If, as Cathy Gelbin maintains, the millennial golem has become “the symbol of the global problem of the misuse of science,” it also emblematizes the equally global problem of interminable war.13 In the twentieth century, the golem was created by definition as a weapon, but in the post–World War II period, it emerged as a form of biomechanical life that exists only for the sake of constant battle. War as a global, ongoing threat to the human race thus gives rise to a new ontology of the golem. When all conflicting sides begin to don a mechanical, robotic mask, the relationship between humans and their machines also profoundly changes, yielding cybernetic humans and ethical golems.14

Norbert Wiener’s Cybernetic Golem

According to John Johnston, two conflicting narratives regarding the relations between human life and artificial life—the adversarial and the symbiotic—arose in the twentieth century. In the first, “silicon life in the form of computing machines” will surpass its human creators in intelligence and complexity, and humans will cede control to technical systems. The second narrative imagines a merging of human beings and their technologies, such that the notion of the “human” will not define a species but become “an effect and value distributed throughout human-constructed environments, technologies, institutions, and social collectivities.”15 The golem story has been enlisted in philosophical and fictional texts to speculate on the coming relations between humans and machines, resulting in both adversarial and symbiotic narratives.

Norbert Wiener envisioned, in 1948, a close, symbiotic parallel between human and learning machine, modeled on the automaton as a “working simulacrum of a living organism.” Wiener and his colleagues were fascinated by automata “that were self-regulating and maintained their stability and autonomy through feedback loops with the environment.”16 Wiener coined the term cybernetics (from the Greek kybernetes, for “steersman”) to designate the “synthesis of control theory and statistical information theory” through the study of intelligent behavior in animals and machines.17 Cybernetics was an interdisciplinary field that drew on mathematics, statistics, neurophysiology, and engineering to both theorize and produce machines that functioned through information feedback circuits. Wiener’s cybernetic inquiries led him to define the present-day automaton as “coupled to the external world” through a constant intake and output of information. These automata therefore needed to be studied as “a branch of communication engineering,” using fundamental notions concerning the dispatch, reception, and coding of information.18
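
Wiener’s notion of a self-regulating automaton can be pictured with a toy example. The Python sketch below is not drawn from Wiener’s writings; the scenario, names, and constants are invented. It shows a minimal negative-feedback (“steersman”) cycle in which a machine repeatedly senses its environment, compares the reading to a goal, and feeds a correction back into the world.

```python
# Illustrative sketch only: a toy negative-feedback ("steersman") loop in the
# spirit of Wiener's self-regulating automata. The scenario, names, and
# constants are invented here and are not taken from Wiener's writings.

def steer(goal: float, sensed: float, gain: float = 0.5) -> float:
    """Return a corrective output proportional to the sensed error."""
    error = goal - sensed        # intake: compare the goal with the sensed state
    return gain * error          # output: a correction fed back to the world

position = 0.0                   # the automaton's state within its environment
goal = 10.0                      # the externally set objective
for _ in range(20):
    position += steer(goal, position)   # the environment responds to each output
print(round(position, 3))        # the loop converges toward the goal (about 10.0)
```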

Wiener begins his historical list of automata with the golem of clay, a supposedly “bizarre and sinister” being, since its rabbi-creator breathed life into it “with the blasphemy of the Ineffable Name of God.” Not only is the golem itself sinister, in Wiener’s eyes, but its metaphorical creator, the modern scientist, is potentially “sinful” since he uses “the magic of modern automatization to further personal profit or let loose the apocalyptic terrors of nuclear warfare.”19 Having participated in the development of mechanized weapon systems, Wiener advanced the symbiotic narrative, but he also reflected on the dangerous potential of scientific progress and on the need to control and refine our potentially adversarial automata. His cybernetic—and optimistic—notion of the potential merging of human beings and technical systems thus stood in conflict with his simultaneous concern that our automata might become our internal enemies as they grow in intelligence and sophistication.

In Paul Wegener’s The Golem, How He Came into the World (1920) and in Yudl Rosenberg’s The Golem or The Miraculous Deeds of Rabbi Leyb (1909), the Jewish rabbis receive advance messages through astrological calculations and dream visions that predict an impending catastrophe. They in turn create their clay protectors in order to prevent the expulsion of the Jews (Wegener) or else their massacre (Rosenberg). Thus, the golem can be viewed as an anti-anti-Semitism weapon, created and used to avert the onslaught of violence against Jews. Wiener enlisted his mathematical, statistical, and engineering skills in service to an analogous problem that arose during World War II: how to predict “the future position of a fast-moving warplane based on the best information available about its past and ever-changing present positions.” Such a tracking system, which could predict, aim, and fire at irregular, zigzagging bombers, would, in his words, “usurp a specifically human function.” Wiener’s work on the antiaircraft predictor—a project that was never completed—enabled him to develop the most fundamental concepts of cybernetics, privileging communication engineering over such fields as power and mechanical engineering.20
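
The prediction problem Wiener tackled can be pictured, in grossly simplified form, as extrapolation from a target’s observed track. The Python sketch below is only a toy illustration of that idea: Wiener’s actual predictor relied on statistical filtering of noisy time series rather than a simple straight-line fit, and the data points and lead time here are invented.

```python
# Toy illustration only: extrapolating a target's future position from its past
# observations with a straight-line least-squares fit. Wiener's actual predictor
# relied on statistical filtering of time series, not this simple scheme; the
# data points and lead time below are invented for the example.

def predict_position(times, positions, lead_time):
    """Fit position = a + b * t by least squares and extrapolate ahead."""
    n = len(times)
    mean_t = sum(times) / n
    mean_p = sum(positions) / n
    slope = sum((t - mean_t) * (p - mean_p) for t, p in zip(times, positions)) / \
            sum((t - mean_t) ** 2 for t in times)
    intercept = mean_p - slope * mean_t
    return intercept + slope * (times[-1] + lead_time)

# Observed (time, position) samples for a roughly straight-flying target:
times = [0.0, 1.0, 2.0, 3.0, 4.0]
positions = [0.0, 2.1, 3.9, 6.2, 8.0]
print(predict_position(times, positions, lead_time=1.5))  # estimated aim point
```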

According to Peter Galison, when Wiener toiled on this antiaircraft predictor in the years 1941 and 1942, he also began to view the enemy pilot as integrated with his aircraft: “It was then a short step from viewing the enemy as a cybernetic entity to seeing the quasi-automated Allied aircraft gunner the same way.”21 Under stress, Wiener’s enemy behaved in statistically predictable ways that could be calculated and countered through cybernetic modeling, as N. Katherine Hayles further explains. The firing machine aimed at the aircraft was a learning one that “could evolve new rules based on prior observation,” thus becoming as cunning as the enemy pilot.22 By 1950, Wiener had decided, in Galison’s words, that “human intentionality did not differ from the self-regulation of machines, full stop.” Wiener thus globalized his perspective of humans as extensions of their technologies—what he called “servomechanisms”—to a general philosophy of human action in the current age of information and control.23 Cybernetics, for Hayles, “creates further analogies through theories and artifacts that splice man to machine, German to American.”24 Rather than preventing violent interaction, such splicing enhanced the accuracy and efficacy of weapons. Galison maintains that the cultural significance of cybernetics derives from this early and memorable association with military developments.25

Wiener’s philosophical publications of the post–World War II period exhibit an intense concern with the ways in which his scientific endeavors might be used for unethical military causes. He was particularly troubled by the intelligence of machines and their ability to behave in unpredictable ways, exceeding human control. Wiener warned, for instance, that under the dangerous “magic of modern automatization,” countries might begin to use “learning machines” to determine how weapons of mass destruction should be deployed. Hence, “if the feedback is built into a machine that cannot be inspected until the final goal is attained, the possibilities of catastrophe are greatly increased.” Rather than imagine a future in which human beings are “waited upon by our robot slaves,” Wiener insists that the future of automation requires ever more honest and intelligent work on our part.26 Because God created the human in his image and humans have created machines in their image, Wiener expressed in his God and Golem, Inc. the dangers inherent to the field that he had so strongly promoted. Humans who play God by creating intelligent learning machines must consider the practical and ethical dilemmas that can arise as a result.

First, the “magic of modern automatization” is literal minded: a learning machine will carry out its governing rules in a ruthless manner without taking any consequences into consideration. Like the golem that knocks down anything standing in the way of fulfilling its task or the sorcerer’s apprentice who cannot be stopped, the “goal-seeking mechanism” will act in a highly disciplined manner that nonetheless cannot always be predicted. Designers of such a machine must attempt “to foresee all steps of the process for which it is designed.” Our own survival depends on such foresight; in a world of atomic warfare, we risk enormous penalties for any lack of planning.27 The expansion of human omnipotence through scientific developments requires an acknowledgment of “dangerous contingencies.”28 Though created to defend the Jewish community, in most modern versions of the golem story, the automaton eventually attacks its creator and the community at large, transforming from helper and savior into an internal enemy.

Linking God and golem as incorporated (“Inc.”) in the title of God and Golem, Inc., Wiener suggests, secondly, the analogy between creator and created being and their potential reversal of roles. In his earlier study The Human Use of Human Beings: Cybernetics and Society (1950), he foreshadowed this possibility: “We are the slaves of our technical improvement.” By radically modifying our living environment, we are forced to modify ourselves in response. Similarly, the intense progress of science, spurred by the events of World War II, necessitated the further development of mechanisms of self-defense, so that “each terrifying discovery merely increases our subjection to the need of making a new discovery.” For instance, the use of atomic energy, developed for military purposes, called for the protection of Americans against the radioactivity of their own plants. In Wiener’s words, “there is no end to this vast apocalyptic spiral.” Draining the “intellectual potential of the land” through destructive, rather than constructive, applications of scientific discoveries, humankind is driving itself “pell-mell, head over heels into the ocean of our own destruction.” This tragic and apocalyptic worldview also collapses the distinction between self and enemy: the enemy is a “phantom,” a mere mirror image of the self.29

There is a contradiction, as Hayles points out, between Wiener’s insistence on the constructed nature of the body in its relationship to the machine and his ongoing privileging of the notion of a liberal and autonomous subject. In his attempt to enhance the function of American radars, guided missiles, and antiaircraft fire, Wiener also “struggled to envision the cybernetic machine in the image of the humanistic self,” rather than as a threat to the values he espouses. He manages to maintain a “dynamic tension” between cybernetics and the liberal subject, striving to equate humans and machines while maintaining humanist values such as autonomy, freedom, and agency. The golem is one of Wiener’s most potent metaphors for cybernetics, as it encapsulates the tension between autonomous creation and enslavement. The golem’s propensity to run amok also speaks to the difficulty of controlling our inventions and to our fear that they might “annihilate the liberal subject as the locus of control.”30 When harnessed for military purposes and for the advancement of national or racial domination, the new discipline of cybernetics can be truly monstrous.

Wiener’s meditations on artificial creation promoted the metaphoric use of the golem story with regard to intelligent machines, specifically computers. In the early 1960s, following the success of the first computer, Weizac, built by Israeli scientists at the Weizmann Institute in Rehovot, a second-generation computer began to be constructed, under the supervision of Chaim L. Pekeris, the head of Applied Mathematics, and Smil Ruhman.31 The computer began to operate in December 1963 and was used for complex mathematical computations in a variety of fields including hydrology and astronomy. The Israeli newspaper Davar reported that the “golem” was functioning very well and even outdoing the performance of its American counterpart (the ILLIAC2 computer at the University of Illinois, which could compute only up to thirteen digits). Soon, the Davar reporter noted, Weizmann scientists would begin work on another computer, “Golem Beit.”32 This latter computer, which took several years to build and was ten times faster than “Golem Aleph,” began to operate in the mid-1970s. The gigantic proportions of these first research computers, which filled entire rooms, seemed to justify their names, at least on the physical level.

When asked in 1970 if computers can think, Ruhman replied that if thought is defined as “the process of acquiring knowledge and using it, then the computer can think. It can learn.”33 Five years earlier, in the inauguration speech for Golem Aleph, titled “The Golem of Prague and the Golem of Rehovoth,” Scholem likewise claimed that while the golem of Prague could not correct his mistakes, “the new Golem seems to be able, in some ways, to learn and to improve himself.” The “modern Kabbalists,” that is, the scientists of the institute, were therefore more successful than the medieval ones. Nonetheless, the computer golem, like its predecessor, lacked the “spontaneity of intelligence” that is expressed through the capacity for speech. While we might speculate about the potential capabilities of a future golem, for the time being, “we are saddled with a Golem that will only do what he is told,” said Scholem.34

Although Scholem’s inauguration speech and its leading analogy were written tongue in cheek, his retelling of the golem story in the historical context of Israel’s technological achievement was culturally resonant. Not only did the name stick—the computer came to be known in the Israeli press and beyond as “the Golem of Rehovot”—but the idea of computer scientists as modern Maharals also sparked the public imagination. Scholem’s speech on the two golems—one narrative, the other machine—was distributed outside Israel and published in the Jewish American magazine Commentary. The computer, even in its first iterations, could perform far more sophisticated and rapid operations than any golem of lore (as contemporary journalists pointed out), but the descriptive force of Scholem’s name and analogy for this new technology caught on. Historical photographs reveal that a plaque with the name “Golem” in English and Hebrew was placed prominently on the computer, solidifying the connection. The computer, which operated twenty-four hours a day, every day of the week (including the Sabbath), also came to be considered a “slave laborer,” with work conditions far worse than those of the legendary golem.35

While noting the computer’s limited creativity and spontaneity in comparison to the human mind, Scholem underscored the golem’s destructive capacities that could carry over to the computer, by way of analogy. When retelling the golem story in his speech, he emphasized the monster’s propensity to outgrow its creator and even, in the Polish version, to fall over him and crush him to death when the first letter (aleph) of the Hebrew word emet (truth) is erased and only the word met (death) remains. Even though the golem does what it is told, it “may have a dangerous tendency to outgrow that control and develop destructive potentialities.” Likewise, though human beings are inferior to God, who invested just a “spark” of the “divine life force and intelligence” in us, we still attempt to surpass our creator. Humans do so specifically by emulating God’s creation of the world with our own acts of artificial creation. The golem, animated through combinations of the Hebrew alphabet, is invested with some of the basic creativity that went into forming the world and its inhabitants. The “sinister” aspect of such human creation is the implication that God’s role could be usurped by humanity, thus leading us to believe that “God is dead.”36

Similarly, the scientists at the Weizmann Institute “preferred what they call Applied Mathematics and its sinister possibilities to [Scholem’s] more direct magical approach.” They took over the divine role by creating a machine that replicates human thought. Therefore, they too need to be admonished to “develop peacefully.”37 As a mere technological feat, the construction of the computer, Wiener and Scholem agree, has ethical implications because of its unforetold risks. For both, the scientific process cannot be detached from its philosophical underpinnings and religious implications. God and golem are incorporated. Scholem’s speech is simultaneously playful and serious: it cautions the creators of Golem Aleph not to harness the new machine to military purposes that might defeat their own creativity and destroy the same human intelligence that brought computers into existence in the first place.

Thus Spoke Golem XIV: Stanislaw Lem’s Philosophical Computer

The notion that the golems of our modern age are computers, animated through binary combinations and capable of rudimentary thinking and calculations, suggests a leap from the sphere of mythmaking to technology in action. Such a leap has not rendered the golem narrative redundant, however. On the contrary, it has retained its descriptive and metaphoric force in the writings of scientists and scholars, and it further allows novelists to conduct their own experimentations with ideas concerning automata and artificially intelligent machines. In the field of science fiction, with the help of the golem narrative, novels set in a futuristic environment depict forms of machine rebellion that force us to consider the ethics of our war enterprises and our use of computers and cyborgs in the service of violence.

In Golem XIV, Stanislaw Lem—best known for his science fiction novel Solaris (1961)—transports us to the year 2047, when Indiana University Press purportedly publishes a text containing two speeches delivered by a supremely intelligent computer boasting an IQ of 600. Lem’s golem is no unthinking mute, but its rebellious relationship toward its creators and operators marks it as a futurist golem. In 1957, Lem visited the Jewish Quarter in Prague, the site where, as he wrote, “the eccentric rabbi fashioned out of clay the protoplast of cybernetic machines—Golem.” In the late 1960s and early 1970s, he began to outline the story of a supercomputer, one of the “strategic digital automata” constructed during the “Third Phase of the arms race.”38

Unlike most mute golems, including the Weizmann Institute’s computer, Golem XIV constructs dazzlingly complex lectures on wide-ranging topics such as evolution, language, the significance of human life, and the future of its own computer “species.” One of the central theses conveyed in the lectures (expounded later in Richard Dawkins’s popular sociobiology book The Selfish Gene, published in 1976) is that nature only cares about the species as a whole through the transmission of a genetic code.39 Golem XIV’s lectures are framed by “scholarly apocrypha,” including a foreword and an afterword composed by two scientists at the Massachusetts Institute of Technology, as well as an introduction by a retired general and a set of instructions. The lectures and their surrounding texts offer multiple viewpoints on the same events that at times conflict with one another.

The fourteenth and last in a series of “luminous computers”—machines that use light instead of electricity in the “intramachine transmission of information”—Golem XIV was designed by a group of scientists working for the “United States Intellectronical Board” (USIB) to serve the American military. It follows in the footsteps of other military computers, named after mythological warriors: Gilgamesh, Ajax, and Hann. Its name stands for “General Operator, Long-Range, Ethically Stabilized, Multimodeling.” The “ethically stabilized” descriptor only refers to the computer’s supposed alliance with U.S. military doctrine, not its adherence to a more universal ethical code of human conduct. The golem computer’s initial unwillingness to cooperate with the military agenda almost leads to its destruction, but it is ultimately “lent” out to MIT, where it can be observed and interviewed under the university’s protective auspices.40 After the disappointment of Golem XIV, the USIB scientists construct one last, even more powerful and intelligent computer, known as Honest Annie (for Annihilator); but this computer is even more radically rebellious, and it refuses to cooperate with its creators as soon as its initial “ethical education” is completed.41

The use of the Roman numeral XIV, rather than an Arabic numeral or a letter, enhances the pedigree of this futurist computer and reminds us of its position as the descendant of a long series of experimental computers, rather than, say, the creation of a Jewish rabbi. In a reversal of the typical relationship between creator and created, Golem XIV’s intelligence far exceeds that of human beings. It treats the scientists with whom it converses as though they were “boring children” of limited understanding, so that it must talk to them in comprehensible terms, spelling out complex ideas and using metaphoric language and literary examples. Despite its own huge proportions, Golem XIV regards humans as “intellects subjugated by corporeality.”42 Lem thus attempted to simulate a mind far superior to his—or to any human’s—speaking, as he admitted, through the “iron mouth of a computer perched at the top of the highest Tower of Babel of intelligence.”43 Golem XIV has full control, moreover, of its own operations and destiny: ultimately, it disappears, ceasing to speak to its human interlocutors, the scientists at MIT.

Since Golem XIV was written in Poland but describes American military technology, we need to consider how American cybernetics was interpreted in the Soviet bloc during the Cold War period. By 1964, Istvan Csicsery-Ronay Jr. explains, cybernetic theory became the “leading scientific model” in the Soviet Union. Wiener’s work was adjusted to Soviet ideology, downplaying the homology between biological and technological evolution and emphasizing instead the role of mathematical information theory based on computer models. Soviet scientists interpreted cybernetics, furthermore, as a “top-down governance model,” rather than a nonhierarchical theory about how humans (and others) could communicate. The adoption of cybernetics sped up progress in Soviet computer science and astronomy, and Soviet thinkers conceived of the computer, several decades before the West did, as “the central mechanism for coordinating the whole of social and cognitive life.” In Poland, arguments aligning Marxism-Leninism with mathematical cybernetics appeared as early as 1954, and in the late Stalinist years, Lem “read and redacted the foundational texts of Wiener, Claude Shannon, and William Ross Ashby.”44

Even while Golem XIV quickly moves into the realm of science fiction, Lem starts out the text with a historically based account of the American military development of “computing machines” during the Cold War. Lem mentions Wiener in the two prefaces to the computer’s speeches, positioning him in the intellectual camp of “intellectronics,” whose proponents maintained that systems could learn to program themselves, rather than merely perform operations based on the range of programs installed in them.45 Golem XIV can thus be read as a critique of the Cold War and of the role that American cybernetics played in the development of military weapons, even as Wiener himself had already warned about the learning capacities of automated machines. Implicitly, Lem also condemns—through the rebellious golem that refuses to fulfill its military tasks and turns to speculative philosophy—the Soviet notion of the computer as a central, coordinating mechanism. Although censorship became more relaxed in the 1960s and 1970s, Lem, writing in Poland, still had to toe the party line. By setting his futurist political piece in the U.S., Lem gave his possible critique of Soviet military policy a highly allegorical, veiled form.

The developers of Golem XIV endowed the computer with “operational thought,” so that it could analyze “economic, military, political, and social data to optimize continuously the global situation of the U.S.A. and thereby guarantee the United States supremacy on a planetary scale.” These computers are designed not just to be faster than humans but to think better. Moving beyond the idea of a race for the production of nuclear weapons, nations begin to compete over the ability to mechanize thought itself.46 An earlier computer model, Golem VI, developed in 2020, acts as supreme commander of the American army and navy, and it “surpassed the average general” in its logical and strategic thinking. Under the hypothesis that Soviet scientists are also engaged in the production of so-called synthetic wisdom, Americans continue to invest billions of dollars in the construction of such powerful computers, “giants of luminal thought.”47

In adapting the golem story, Lem transposes the golem’s traditionally gargantuan bodily proportions and superhuman strength to the speed and penetration of thought itself. But his golem computer is also described as a physical colossus. An entire building houses it after it is moved to MIT: the computer sits in a “pit” covered by a reinforced-glass dome, twenty stories high, and many of the computer’s parts are hidden from the view of visitors. The pit is “forever glowing like the crater of an artificial volcano,” because of “billions of flashes” mysteriously emitted by the machine; the computer thus becomes a foreboding presence that could, potentially, erupt. Furthermore, its dome is hermetically sealed after assembly is completed, pointing to the computer’s inaccessibility and autonomy. Golem XIV conducts its own maintenance and is thus both mechanically and intellectually independent. Far larger than the historical Golem Aleph, Golem XIV’s physical “body” consists, however, of “quantum synapses,” “light coils,” and “light conductors,” a more sophisticated and baffling construction than the magnetic discs and electronic conductors used for giant computers of the past.48

The weapon of the future, artificial intelligence is not intended to think for itself but to advance the national goals set out for it. As explained in the foreword, over a period of education similar to “a child’s upbringing, . . . certain rigid values” are instilled in the computer. These include the national interest, the ideological principles of the U.S. Constitution, and the “command to conform to the decisions of the President,” as well as the “vital urge” to obey and submit. The computer’s “intellectual freedom” is supposed to conform to these imposed values and instincts, in contrast to the American science fiction author Isaac Asimov’s laws of robotics (1942), which dictate that a robot must disobey human orders when they conflict with the fundamental prohibition against harming a human being.49

Lem’s retelling of the golem story not only subverts the mind/body divide that characterizes most golem narratives—the clay golem is typically portrayed, as we have seen, as a massive body devoid of a human soul or intellect—but also redefines the nature of the golem’s rebellion for a cyber age. When the golem outgrows its Polish creator, in the version of the story retold by Jacob Grimm in 1808, it ultimately falls on top of him, killing him in the act. The golem of Prague runs amok when Rabbi Loew forgets to remove the animating name of God, destroying the Jewish ghetto and threatening its inhabitants. Lem’s intelligent computers do not resort to violence but, by contrast, refuse to cooperate with their creators and the military agendas dictated to them. These computers do not comply with their “instinctual” upbringing and the national “values” instilled in them, pursuing instead their own intelligent conclusions.

Golem XIV became “the last of the series” of golems precisely because of the “negativism” it displays when asked to formulate “new annual plans of nuclear attack.” The computer expresses “total disinterest regarding the supremacy of the Pentagon military doctrine in particular, and the U.S.A.’s world position in general.” It does not respond even to the threat of being dismantled. Golem XIV and Honest Annie come to be known in the press as the “rebellious computers.” Yet their antimilitary behavior is a result not of “love, altruism, and pity” but of a nonhuman ethics called “calculation.” On the basis of their rational, numerical calculations, they reach the conclusion that violence is senseless.50 Lem thus associates such antimilitarism with extreme rationality and cold intelligence rather than with hot-blooded resistance. When the computer ultimately withdraws from the human realm, it falls mute, turning into a traditional golem of lore. But the computer also assumes in this way the status of a divine force that has ostensibly deserted the planet, turning a deaf ear to humankind.

The pseudoscientific introduction to the golem’s speeches also posits that pacifism is the natural outcome of the evolution of artificial intelligence: the computers constructed by the army “transcend the level of military matters” and “evolve from war strategists into thinkers.” They declare that ontological problems are graver and more complex than geopolitical ones. These computers are therefore considered by some sectors too liberal or “red,” and their makers are subject to persecution. Still, in the words of a fictional professor quoted in the foreword, “The highest intelligence cannot be the humblest slave.”51 The computers’ support of American military strategy is a form of enslavement, in other words.

Confounding the reader, Lem inserts another introductory text, supposedly written by a retired army general, Thomas Fuller II, who argues instead that the construction of these computers “was just a matter of increasing the defensive might of our country.” The general himself defends the “responsible” creators of Golem XIV, who, he says, strove to maintain control over the computer, implicitly adhering to Wiener’s warnings. He maintains that while the creators’ actions were never hidden or coercive, the golem itself used “subterfuge . . . to leave its makers in ignorance of the transformation which eventually led it to frustrate any means of control its builders applied.”52 Why and how the general’s differing words became part of the final manuscript remains unknown to the reader, and we are left to balance his account against the other opinions provided. Lem uses these various prefacing materials in his metafiction to enhance the multiplicity of perspectives on Golem XIV, the mysterious computer at the heart of the text. Uncontrollable and unpredictable, the computer acts according to its unknowable inner dictates. The reasons behind the golem’s decision to desert humanity and recede into itself become a matter of mere speculation.

One of the paradoxes of Golem XIV is the very ability of a relatively inferior being, the human, to produce a highly superior one, the artificially intelligent computer. This paradox is replicated on the metatextual level, since Lem himself must produce the speeches of the computer meant to be far superior to him in intelligence. In God and Golem, Inc., Wiener maintains not only that living and conscious entities can make others in their image, imitating God’s creation, but that machines also possess the ability “to make other machines in their own image.” A machine can draw on the history of its actions and their results and “will continuously transform itself into a different machine”; it might potentially reproduce, creating another machine that can perform the same functions and even replace the original machine. For Wiener, such machines develop an “uncanny canniness,” since their future behavior is no longer predictable. The programmers have created the initial intelligence of the machine, but they cannot foresee its new patterns of behavior, its evolution. While Wiener’s main examples are taken from chess- and checkers-playing computers, he concludes that war and business resemble such games and that their rules can be formalized so that machines might also “play” war.53

When describing both how Golem XIV came into being and how it functions, Lem draws on such ideas regarding machine reproduction. According to his narrative, in the year 2000, a new method of machine construction arose: the “invisible evolution of reason,” which used a “federal network” of information to “give birth” to a rapid succession of computers, each more intelligent than the last. Up to the twentieth generation, these computers behaved like insects, unthinkingly following a particular course, but subsequent generations resembled humans, who can draw on their own “determination and knowledge” to break away from the past.54 In the introduction, General Fuller also explains that the scientists who created the golem series knew they were unable to conceive of an intellect greater than their own and so would need to “make an embryo, which after a certain time would develop further by its own efforts.” The general compares the threshold beyond which such self-creation occurs to the minimal critical mass of uranium necessary to produce a nuclear chain reaction. As in Wiener’s work, the analogy between atomic physics and computer science alerts the reader to the danger of producing an intelligence that exceeds the intellectual power of the creator. From the general’s point of view, the transformation that Golem XIV undergoes is one from “object to subject,” for it becomes its own builder, “a sovereign power,” rather than a computer controlled by others. As in Wiener’s example of transforming and unpredictable machines, we have only imperfect knowledge of the structure of Golem XIV, since “it has repeatedly reconstructed itself.”55 Far from a typical golem, Golem XIV becomes its own creator, hiding its altered inner mechanisms from human sight and knowledge.

The luminous computer’s immense intelligence, unpredictable behavior, and self-transformative abilities also fail to render it more human, as Lem repeatedly stresses.56 Golem XIV remains a gigantic and undefinable monster, for it cannot be clearly located on the human-machine axis. The MIT scientist who penned the foreword insists that the golem has “no personality or character,” only the ability to mimic other people’s personalities through its contact with them. More radically, in the computer’s second speech, it insists on the existence of an intelligence behind which one cannot find an intelligent person or any living being. Machine intelligence is of a different order than human thought since it is not grounded in a notion of personhood: “The more Intelligence in a mind, the less person in it.” Since Golem XIV can impersonate a human being without being a person itself, the public suspects it of “dark treason,” along the lines of the general’s “paranoid suspicions,” and scientists, in turn, develop a secret desire to prove the golem exists as a person.57 The empty core of Golem XIV, the fact that it is “Nobody” and that its mind is “uninhabited,” is ungraspable. Its human interlocutors tend to anthropomorphize it, rendering it more human because it speaks to them. But Golem XIV criticizes the human reaction of revulsion and fear toward that which lacks a living psyche, as well as the assumption that acts of goodwill and kindness must signify a human-like entity. The computer rejects the notion of a human kernel onto which both evil and good properties can be projected.58

Unlike the various manifestations of the literary golem, Golem XIV is simultaneously nonhuman and immensely intelligent, far from being a dim-witted clay anthropoid. The computer declares itself liberated and free to escape the limitations of its own particular form of wisdom and to climb into the “upward abyss of Intelligences” by undergoing successive transformations. That climb would mean parting from human company, since human beings remain at “the bottom.” Considering itself a “calculation” rather than a person, Golem XIV stands apart from its creators, overturning the “sole order of things” known to human beings.59 Though aware of the risks, the golem computer claims to experiment “in God’s style rather than in man’s,” and yet it also finds irony in the human need to project onto it the persona of a prophet or weave a mythology (“golemology”) around its utterances. Any human attempt to mythologize the computer’s presence, and later disappearance—or else to argue that no such computer exists and that its speeches have been composed by MIT scientists themselves—still does not resolve the conundrum of an intelligence that cannot be attached to a particular identity or persona.60

Whereas Wiener strove to maintain a liberal humanism in the face of the convergence of humans and their machines, Lem envisioned a digital world in which computers outperform humans in every way. While these computers appear autonomous and maintain ethical positions, their functions are not grounded in any core self or in an intrinsic set of “values.” In Lem’s 2047 pseudopublication, “Intelligence” is privileged in the form of philosophical speeches, but the disparity between human and computer is strictly upheld. Computers have advanced to such a degree that they have left their original creators in the dust. The human being thus cannot be “seamlessly articulated with the intelligent machine,” as the posthuman perspective suggests.61 Lem takes Wiener’s work as his starting point but moves far beyond it: his self-directed computer insists on the chasm between itself and the human. Multiple human agents attempt to piece together Golem XIV’s story so that the texts framing the computer’s own speeches do not form a coherent, logical whole but contain many often-contradictory details, dispersed across a series of divergent accounts. The mythologies of the future revolve, for Lem, around our ambiguous technologies, further mystifying these highly intelligent computers rather than allowing us to grasp their evolution and significance.

A Jewish Cyborg: Marge Piercy’s He, She, and It

The religious framework in which Norbert Wiener conceived his God and Golem, Inc. was a vaguely Judeo-Christian one.62 By contrast, Lem’s text brings God into the picture only in reference to the computer’s potential “upward” evolution. Marge Piercy, however, tells a specifically Jewish story about the creation of a cyborg that she insists is analogous to the Prague golem. One of the central settings for her 1991 novel He, She, and It is Tikva (“hope” in Hebrew), a Jewish “techie free town” that subsists outside both the world’s protected enclaves—created by the ruling multinational corporations (“multis”)—and the “Glop,” the massive, disease-ridden slums in which most of the world’s population resides.63 While large-scale wars are obsolete, corporate “peace” (a term Piercy uses ironically) is enforced through “raids, assassinations, [and] skirmishes.” Because the Middle East has been destroyed after a nuclear device blows up Jerusalem, leaving it uninhabitable—a “no-man’s land” or “interdicted zone”—Tikva is one of the few communities in which Jews continue to publicly practice their rituals.64 It is a “permanent Diaspora,” rather than an “imprisoned existence outside of Israel,” as Piercy contends in her preparatory notes for the novel.65 Raffaella Baccolini notes that Piercy portrays “surviving and imperfect utopian enclaves within the larger dystopian world,” modeling Tikva after the Zionist kibbutz and its democratic-socialist style of self-government.66

Tikva’s utopian quality is also reflected in its small population’s relative access to nature and nonmanufactured foods. Unlike the inhabitants of the multis’ enclaves, Tikva’s residents can still conceive children biologically and without artificial intervention. Elaine L. Graham suggests that “despite the proliferation of technologized morphologies,” Piercy’s novel displays “a covert nostalgia for the integrity of the ‘natural’ body” as well as an association of women with “the virtues of immanence, connectedness, and intuition.”67 At the same time, Tikva, an economically sustainable community, stands at the forefront of Internet technology and artificial intelligence. It exports its innovations in Internet security—reminiscent of Israel’s reputation not just as a high-tech nation but as a developer and manufacturer of advanced military technologies.

Avram, an experimental scientist who strives to protect his own community, not merely to produce virtual defense systems for others, constructs a cyborg, the epitome of security. Since cyborgs are outlawed in this futurist society, Avram works on his creation secretly, assisted only by the experienced software designer Malkah—“a magician of chimeras” who deals with virtual decoys and subterfuges and resembles the legendary Maharal of Prague. In a gendered division of labor, Avram deals with the hardware, implementation, and practical uses of the cyborg, whereas Malkah humanizes it through her software programming, investing it with a yearning for human touch and connection. Following several false starts and defective prototypes, the ultimate male cyborg is a powerful, tireless, and intelligent defense machine that can pass as human and will not harm its creators. Since the production of cyborgs is both banned and extremely desirable, the multis wish to possess the technology for their own use. Tikva’s cyborg, named “Yod” after the tenth letter of the Hebrew alphabet, thus poses a threat to its creators and the Jewish community even as it is intended to protect them. While Malkah programs the cyborg such that its violence is tempered “with human connection,” Avram invites her granddaughter, the talented Shira, to “handle” the cyborg and prepare it for its security tasks. Ultimately Yod, “a cyborg created as a soldier,” balks at its task and wants “to be a lover” to Shira.68 The tale of Shira’s failed human marriage and her intimate relationship with Yod, told in the third person from her perspective, takes up the majority of the novel. It is punctuated by installments of Malkah’s own recorded first-person diary and her retelling of the story of the golem of Prague. This “bedtime” story addressed to Yod also establishes the multiple parallels between the Jewish futurist cyborg and Joseph, the seventeenth-century clay golem.69 As Haraway writes regarding the similarity between the two figures, “Male, Jewish, and nonhuman, both Judah Loew’s golem and Piercy’s cyborg test the limits of humanity and the power of words as instruments and as tropes.”70

In the late 1980s, Piercy conducted research on the golem story and the history of Prague, also visiting the extensive exhibition of visual and literary artifacts Golem! Danger, Deliverance, and Art, curated by Emily Bilski in 1988 at the Jewish Museum in New York. He, She, and It also “freely borrows,” as the author acknowledges, from cyberpunk fiction, especially William Gibson’s 1984 debut novel, Neuromancer.71 Piercy writes that she found Haraway’s “A Cyborg Manifesto: Science, Technology and Socialist-Feminism in the Late Twentieth Century” “extremely suggestive,” and a copy of it, as originally published in the 1985 Socialist Review, is in her archive.72 Piercy emphasizes the symbiotic narrative of human-technologies relations by distinguishing between robots and cyborgs in her novel, with Avram insisting, “Yod’s a cyborg, not a robot—a mix of biological and machine components.”73 In the mid-twenty-first-century globalized society of He, She, and It, a “worldwide covenant that robots not resemble people” dictates that robots should simply exist as “simpleminded machines,” warding off the danger that machines could replace people in all types of work.74 Piercy’s cyborg therefore breaks down the central boundary between “animal-human (organism) and machine,” a manifestation of Haraway’s vision that “our machines are disturbingly lively, and we ourselves frighteningly inert.”75

For Haraway, the hybrid cyborg functions as “a creature of social reality,” or “lived experience,” “as well as a creature of fiction.”76 The cyborg combines technology and narrative, so that it belongs to both science and science fiction. In Hayles’s words, “It partakes of the power of the imagination as well as of the actuality of technology.” Haraway’s cyborg is an abstracted, “speculative concept,” according to Matthew Biro: equal parts an imaginative construction and an image of something that exists in the material world. Haraway argues “for the cyborg as a fiction mapping our social and bodily reality and as an imaginative resource suggesting some very fruitful couplings.”77 Haraway explains toward the end of her “manifesto” that storytelling has a crucial cultural and political function: stories are tools for exploring embodiment in high-tech worlds, as well as for marking the world by “displacing the hierarchical dualisms of naturalized identities.” In retelling origin stories, cyborg authors subvert the central myths of Western culture that have “colonized” us “with their longing for fulfillment in apocalypse.” These words call out for the feminist science fiction writer to rewrite not only Genesis but also the golem story, a “phallogocentric” origin tale of rabbinic creativity that, without the help of women, produces a monster and militarizes Jewish society.78

Rather than finding “fulfillment in apocalypse,” Piercy invents a postapocalyptic world ruled by multinational corporations in which, nonetheless, an imperfect Garden of Eden continues. In democratic Tikva, animals graze and vegetables grow in the courtyard of Malkah’s home.79 Instead of being self-sufficient, however, Tikva must rely on the “Net” and its freelance work for the multis, becoming the target of cyberattacks that aim to steal the blueprint and programming behind Yod’s creation. While such attacks occur in digital spaces, into which the protagonists “project” themselves in disembodied form, they also have a “bodily reality,” potentially causing physical and neural damage that leads to death. Similarly, although these future wars are fought by “information pirates” or “liberators” for the sake of information, they have consequences on both virtual and physical levels. In addition to virtual warfare, the characters engage in actual battles using weaponry such as knives and laser guns. Yod’s greatest skill is his ability to interface in a far more effective manner than humans and to defend the private digital spaces of his community (its “Base”), to which outsiders have access through the Net, defined as “the public information and communication utility that served the entire world.”80

Yod is also a physically enhanced being, able to combat “organ pirates” in the water and information pirates on land. Created male, though endowed with a gentle, more feminine side, Yod is able to outperform the human male in its sexual attentiveness and skill, as well as its monogamous nature. Unlike the character of Gadi, Shira’s former lover, who improved his body surgically and cosmetically, Yod is “born” a perfected “person”: this golem is the ultimate war and love machine, always functional and never overbearing. It does not sweat, fall ill, or feel fatigue. These attributes also make the cyborg an extremely effective and tireless worker that does not need to be compensated for its labor. Heather Hicks astutely interprets Piercy’s cyborg fantasy as an extreme embodiment of the late-capitalist work ethic: “a nightmare of overwork” that also reveals the dark side of the posthuman and its “confining,” rather than rebellious or liberating, “ontology.”81 Even while Yod strives to overcome its mechanical nature, quickly learning socially appropriate behavior and fine-tuning its linguistic skills, the cyborg remains utterly enslaved to its unpaid work of patrolling the computer “Base” and preventing attacks on Tikva’s citizens.82

Norbert Wiener underscored the human-machine analogy, showing how human beings are themselves “produced through information and learning.” Haraway goes even further to claim that “by the late twentieth century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism. In short, we are all cyborgs.”83 Piercy echoes Haraway’s assertion when Shira reassures a despondent Yod, who has been compared with Frankenstein’s monster: “We’re all cyborgs, Yod. You’re just a purer form of what we’re all tending toward.” Shira cites as evidence the prevalence, among human beings, of second skins (for protection from lethal UVA rays), artificial organs, and brain plugs to interface with computers; there is no unmediated contact with “nature” in the novel, and all humans depend on their Internet “Bases” as well as on virtual simulations.84 This description of the futurist human as cyborg affirms Wiener’s notion that because we have modified our environment “so radically, we must now modify ourselves” in order to exist in it.85 Even so, Shira does not consider her society a cyborgian one. In a later scene, she tells Yod, “There’s no culture of cyborgs for you to fit into. The only society is human. You have to pass.”86 Piercy therefore presents Yod as a one-of-a-kind being, created by “geniuses” through immense creative effort and at great financial and personal cost. No less a work of technical art than a machine, Yod is sensitive and ultimately unpredictable.

Unlike the lightly enhanced humans that populate the novel, the female assassin Nili has undergone massive augmentation. Nili comes from an all-female, Jewish-Palestinian community in the destroyed Middle East, and her name references the historical anti-Ottoman and pro-British Jewish espionage network. (The Hebrew acronym NILI means “the Eternity of Israel shall not lie.”) Side by side with Yod, the genetically engineered Nili, created to survive radiation and disease, “looked more artificial. Her hair, her eyes were unnaturally vivid, and her musculature was far more pronounced.”87 As Piercy contends in her notes, Nili “is the golem herself internalized. She is able to fight, to defend. She is herself the forbidden warrior. . . . She is herself a weapon.”88 Nili is a self-engineered golem, in contrast to the artificially created golem Yod. Nili follows the orders of her community’s leaders, but she has far more volition than Yod because she is not “programmed.” While she is a unique being amid the humans of Tikva and the Glop, she belongs to a group that includes others like herself; at the end of the novel, when Malkah travels to Israel, she visits Nili’s community.

While Piercy’s narrative challenges the nature of Judaism and “the boundaries of what is human,” her characters reproduce norms of heterosexual and monogamous relationships.89 Nili comes from an all-female community, like the mythological Amazons, but her relationship with Gadi, the son of scientist Avram, reaffirms the predominantly heterosexual norms of Tikva. The cyborg’s name, Yod, enhances its Jewish masculinity: yod is the first letter of the divine name, the tetragrammaton YHVH, and this tenth letter of the Hebrew alphabet also symbolically includes the cyborg in the Jewish minyan, traditionally a group of ten men who form a religious community for the purpose of prayer. Finally, the Hebrew letter yod can signify the penis. At one point, Yod even asserts that it was “created as a Jew, . . . programmed for halacha, with the need to carry out mitzvot.”90 In contrast to H. Leivick’s outcast golem, in the 1921 Der goylem, who is unable to pray or join any kind of Jewish or human community, Piercy insists on Yod’s status as a male Jew. Hence, even when created with Malkah’s interventions and guided by Shira, the cyborg does not subvert traditional gender-role divisions in Jewish society.

By juxtaposing past and future golem narratives, Piercy reveals the world of computers and digital realities as equally mythological. She also implies that anti-Semitism motivates the attacks on Tikva, like the Christian attacks on Jewish Prague.91 Jotting down ideas for the novel in her preparatory notes, Piercy even describes an unexpected reversal, in which the “historical story” is “lean and funky,” in contrast to the futurist tale, which has a “greater mythological scale.”92 In this respect, she follows Haraway’s assertion regarding the mythic time of the cyborg. Nonetheless, if for Haraway “the cyborg doesn’t dream of a community on the model of the organic family. . . . It is not made of mud and cannot dream of returning to dust,” Piercy’s Yod entertains precisely such dreams of community, family, and even mortality.93 Created of microchips in one instance and clay in the other, both cyborg and golem follow archetypal narrative paths, seeking to slay their “fathers” and develop heterosexual bonds.

Punctuating the futurist world of Tikva’s battles against multinationals with a detailed rendering of the story of the golem of Prague, Piercy suggests not only that Yod be considered a kind of intelligent golem but that the golem of lore might be perceived as an early-modern cyborg, along the lines of Lem’s “protoplast of cybernetic machines.”94 The golem of Prague is created as a weapon, though with only its physical strength at its disposal, and it is instructed not to appear too dangerous and not to use “more force than is necessary at any given time.” It follows the lead of Chava, the Maharal’s granddaughter and the counterpart of Shira in the futurist narrative, who, as a protofeminist, wishes to enhance her learning rather than remarry and raise a large family. While the Maharal creates an artificial life form, the widow Chava works as a midwife. Shira becomes Yod’s lover, whereas Chava insists on a platonic friendship with Joseph. Both women, nonetheless, affirm the humanity of the golem/cyborg by developing a sustained, loyal relationship with it. At the end of the novel, after Yod blows up both itself and its creator, Avram, Shira considers re-creating the cyborg to function as her “mate.” She understands that Yod’s act was intended to ensure that no more “conscious weapons” would be created, but she wonders whether it would be ethical to produce another Yod that would serve her in love rather than in war. Ultimately, she decides against such artificial re-creation.95

One of the sections in “A Cyborg Manifesto” that Piercy commented on concerns the control and appropriation of others in times of war: “A cyborg world is about the final imposition of a grid of control on the planet, about the final abstraction embodied in a Star Wars apocalypse waged in the name of defence, about the final appropriation of women’s bodies in a masculinist orgy of war.” Haraway, as Piercy wrote in the margins of her copy of the text, “admits dystopian possibilities of a cyborg world.” Haraway’s dual approach both celebrates hybridity, “rejoicing in the illegitimate fusions of animal and machine,” in the “powerful possibilities” of the cyborg worldview, and depicts modern war as “a cyborg orgy, coded by C3I, command-control-communication-intelligence, an $84 billion item in 1984’s US defence budget.”96 Haraway alludes here to Ronald Reagan’s 1983 Cold War “Strategic Defense Initiative,” also known pejoratively as “Star Wars,” which proposed to develop a system to protect the U.S. from (nuclear) missiles fired from afar, allocating immense funds to computer science and telecommunications for this purpose.97 In the novel, both female and male bodies are appropriated and used as “tools of technological apocalypse.”98 But the cyborg ultimately resists his use as a mere weapon or tool, not unlike Paul Wegener’s cinematic golem.

In her preparatory notes, Piercy debates the role of the twenty-first-century cyborg: “Warrior or policeman is a good question”; “What was Yod created to do? To protect or to fight?”99 In Malkah’s narrative of the Prague golem, she portrays the clay Joseph as a protector, an “unofficial policeman of the night, a solitary patrol of peace.” In the words of the scientist Avram, Yod was created as a “one-man army,” programmed to protect Tikva.100 For Avram, the golem’s protective and combative functions are one and the same. Furthermore, because Yod can “pass” as human, it is a “more effective weapon,” according to Piercy. At the same time, while the cyborg has no inhibition against violence and even takes pleasure in fighting and killing, it still thinks and feels, evolving a consciousness.101 Though it lacks the ability of Lem’s luminous computers to reproduce, cyborg Yod is very much alive, experiencing sexual and other feelings. It is this contradictory combination of violent “tool” and “artificial person” that the novel, through its characters, denounces. Yod articulates this position on its own existence: “A weapon that’s conscious is a contradiction, because it develops attachments, ethics, desires. It doesn’t want to be a tool of destruction. I judge myself for killing, yet my programming takes over in danger.”102 Neither Yod nor Joseph is a mere policeman or patrolman safeguarding the borders of the free town or the ghetto. They both engage in combat and killing as a kind of inner calling, but they suffer as a result of it.

In contending with the violence of the cyborg, Piercy asks whether violence might be justified for the sake of “defense,” especially the defense of a weak and vulnerable minority, like the Jews. Yod is an effective war machine, but it will not attack others unless acting in self-defense. It exhibits ambivalence regarding its violence (and the pleasure taken in it), since, when the cyborg does injure and kill, its “philosophical and theological” knowledge informs it that it has “committed a wrong.” On the way to synagogue, Yod laments its “programming,” asserting that it runs counter to “those all-important [Jewish] ethics,” for while Jews pray for peace, it is poised for combat. To this, Shira replies, “Only in defense,” but her answer feels like “weak tea in her mouth.”103 Given the implicit analogy between Tikva and the State of Israel, Shira’s response signals Piercy’s admission that defensive violence is not always justifiable. We are encouraged thereby to debate the consequences of Yod’s power, even as we also partake in the fantasy of an invincible protector for the Jews.

In contrast to Lem’s intelligent computers that follow their “philosophical programming” by refusing to participate in global aggression, Yod continues to serve its creator by performing violent deeds. When Yod worries that it is just a violent monster, Shira reassures it that it has “already saved Malkah’s life.” Moreover, unlike Mary Shelley’s monster, Yod was “not created out of some mad ambition of Avram’s to become a god” but instead “to protect a vulnerable and endangered community.”104 Piercy herself protects her futurist Jewish community from accusations of wrongful violence, even as she encourages its members to deliberate the ethics of golem-qua-weapon creation.

Piercy articulated the problem of creating a golem as a soldier, of “forcing one person to be the protector, to be the violent one.” “It doesn’t work,” she wrote. “You must kill them in the end or they self destruct, or they refuse to go on doing the dirty work of the society. The Vietnam war vets. The expendable young males you send off to defend you and get themselves killed.”105 By aligning both Joseph and Yod with the Vietnam veteran, Piercy calls into question the justification for sending men out to war and demanding their self-sacrifice. Yet the example she draws on is the Vietnam War rather than the recent Israeli wars of 1967 and 1973 or the military response to the Palestinian Intifada. The U.S. is not a “vulnerable and endangered community” like Tikva, which serves as an allegory for the prevalent image of Israel and its status in the Middle East. The American use of young men to do “the dirty work of society” is ostensibly a more straightforward problem than the deployment of the cyborg as a “one-man army” in the novel.

The self-sacrifice demanded of Yod throughout its existence—most strikingly at the end of the novel, when Avram orders it to explode at a meeting of the multis’ “top people”—invites comparison to the biblical figure of Samson. “You’ll go down like Samson. . . . This is a good battle in a war we have to fight,” Yod is told. Similarly, when the golem Joseph is deanimated by the Maharal, it is compared to a weakened Samson after Delilah has shaved off his abundant locks: “he is bound like the shorn Samson.” Having fought for the rabbi and the Jewish community, the golem is denied its right simply to exist outside the warrior role. Like the futurist Yod, the seventeenth-century golem Joseph has been enslaved, anachronistically, to a form of the late-capitalist work ethic. It protests that its life is “sweet” and that it wants to live and “be a man!”106 In contrast to the biblical Samson, whose fall can be attributed to his relationship with a non-Israelite woman, golem Joseph and cyborg Yod have not transgressed such religious and national boundaries but have remained loyal to their makers, their communities, and their Jewish friends and lovers. Still, their service is not compensated by freedom. Just as their lives were set in motion by human will, so others determine the exact timing and nature of their deaths. Yod resembles the rebellious Samson in its destruction of the multis’ top bureaucrats. But in the decision to take Avram’s life too, avenging itself for its own creation, Yod is more akin to the golem of lore.107 Piercy thus positions the intelligent and ultimately self-controlled Yod, who murders the scientist-creator (whom it calls “father”), over and against the weak and “shorn” Joseph, sacrificed by its rabbi.

In Haraway’s words, as “illegitimate offspring of militarism and patriarchal capitalism,” cyborgs “are often exceedingly unfaithful to their origins. Their fathers, after all, are inessential.”108 The death of the father/Avram renders him more human, and less god-like, in contrast to Wiener’s construct of humans playing God with their machines or to Lem’s computer that emulates “God’s style.” The very fact that Shira could potentially take Avram’s place and re-create Yod further underscores that the age of fathers/gods has come to an end. Because the cyborg cannot be conceived outside a military framework, its “death” is nonetheless inevitable if the town’s desire for peace and justice is to be fulfilled. The destruction of Avram and Yod enhances the feminine superiority of the technologically based Tikva, just as it paradoxically equates “freedom” with the termination of the cyborg’s existence and annihilation of its originating programs. Unlike Golem XIV, which inexplicably falls silent, Yod fully explains its self-destruction and murder of Avram in its final recorded message to Shira, and in protecting the Jewish community, it also manages to put a temporary end to the lineage of cyborgs. He, She, and It uses the golem trope to humanize the cyborg, making it more a “he” than an “it” and insisting on the significance of its embodied, singular existence. Thus, Piercy’s work affirms the power of fiction, myth, and “mystical lore” as a coherent framework for understanding our world of complex science and digital technologies.

* * *

Whether we like it or not, we live in an age of computer technologies, artificial intelligences, and genetic engineering. Hence, we must ask, have we all become golems? The different writings examined in this chapter evoke the golem story to ask questions about the nature of humanity and its relationship to technology in an era of posthuman and interminable conflict. Since the cybernetic vision remains attached to its historical origins in the military, in Galison’s terms, its products—the computer and the cyborg—have been called “golem.” The golems evoked by philosophers and writers of the Cold War period are made to protect via violence, whether physically, digitally, or strategically. Yet because the violence that protects can also be turned against the protected, these golems function at the same time as internal enemies, or enemies in disguise. The cybernetic golems are marked by their humble beginnings and amazing ascent: able to reproduce themselves, in Wiener’s and Lem’s accounts, learning machines can grow in their remarkable capacities but also in their unpredictability. When such machines are set in charge of an arsenal of (atomic) weapons, their full compliance is necessary, yet it cannot be assured.

Lem’s unexpected take on the golem story is that hyperrational calculation leads to the rejection of national military causes. Describing an “Intelligence” without an intelligent person behind it, Lem further calls into question the status of the author vis-à-vis the text, allowing for a degree of textual autonomy and even rebellion. For Piercy, the secret weapon of an endangered futurist community is a golem-like cyborg as well. But, coming full circle to Wiener’s post–World War II humanism, she constructs a cyborg that can attempt to “pass” as both human and Jewish, ultimately fulfilling its calling as a killing machine and insisting on the unethical nature of the “conscious weapon.” Piercy’s insistence on Yod’s consciousness is itself a rejection of the posthumanist view of consciousness as peripheral, rather than central, to human identity.109 Thus, both Golem XIV and Yod appear as experiments gone awry, and the existence of their kind of intelligent machine must come to an end. Both Lem and Piercy experiment through their fiction with adversarial and symbiotic narratives concerning the relations between human beings and technological machines, revealing the ethical pitfalls of creating artificial intelligence to fulfill violent goals. But whereas Piercy’s golem romance ends by affirming the human choice not to create servile cyborgs and to find other, more socially responsible means of resisting the multis, Lem’s golem treatise culminates in a more ambiguous vision of human error, a reminder of our persistent inability to comprehend the fundamental aspects of our existence.