[EIGHT]
WHAT INSPIRES THEM: SCIENCE FICTION’S IMPACT ON SCIENCE REALITY
You can never tell when you make up something what will happen with it. You never know whether or not it will come true.
—DONNA SHIRLEY
The Science Fiction Museum and Hall of Fame appropriately stands in the shadow of Seattle’s futuristic landmark, the Space Needle. Set in a multicolored, globular Frank Gehry-designed building that looks like a cut-up guitar (a “ridiculous . . . monstrosity of postmodern architecture” is another writer’s take), it shares the space with the Experience Music Project, a museum for rock and roll music. The explanation for the odd juxtaposition of the two museums is actually quite simple: science fiction and Jimi Hendrix’s music were the two boyhood loves of Microsoft cofounder Paul Allen, who is the primary funder of both.
Founded in 2004, the Science Fiction Museum and Hall of Fame is dedicated to exploring the history of science fiction and how it shapes our culture, politics, and philosophy. While the Experience Music Project next door has the guitars used by Bob Dylan, Bo Diddley, and Kurt Cobain, the Science Fiction Museum rocks just as hard. Displayed in the museum are such artifacts as Captain Kirk’s command chair from Star Trek, the alien queen from Aliens, Darth Vader’s helmet from The Empire Strikes Back, Neal Stephenson’s handwritten manuscript for the Baroque Cycle trilogy, and the pistol used by Harrison Ford in Blade Runner. The museum also runs a kids’ program, including a “summer camp on Mars,” as well as a happy hour for the adults, with three-dollar beers on tap.
It is easy to think of the Museum and Hall of Fame as only some sort of “Pantheon of Nerds” (what my editor jokingly called it), as science fiction may well be the ultimate form of geekdom. Perhaps no one puts it better than Chuck Klosterman, who once wrote that admitting you like science fiction was “like admitting that you masturbate two times a day, or that your favorite band was They Might Be Giants.”
And yet science fiction is undeniably popular. The earliest science fiction was by storied writers such as Mary Shelley, whose Frankenstein was first published in 1818, and Nathaniel Hawthorne, whose story “The Birthmark” wrestled with plastic surgery before plastic was even invented. Today, roughly 10 percent of all books are in the science fiction and fantasy genres. This does not even count major authors like Michael Crichton or Tom Clancy, who write “techno-thrillers” that are science fiction in all but name.
Science fiction has thrived even more in modern media forms. Six of the top-ten-grossing movies of all time are science fiction, led by the original Star Wars (inexplicably still behind Titanic in total sales). On TV, many of the most popular and influential shows of all time, from The Twilight Zone to Lost, have been science fiction. An entire cable network, the Sci Fi Channel, is exclusively devoted to the genre. For such a geeky topic, it is doing quite well, ranking in the top ten of all basic cable networks.
Science fiction is more than just popular; it is also incredibly influential, to an extent that is often surprising. Time and again, science fiction makes its presence felt in real-world technology, war, and politics. At iRobot, for example, the robotics research group described how their team motto was a toss-up between “making science fiction reality” and “practical science fiction” (they couldn’t yet decide which they liked better). Science fiction references and ideas also make frequent appearances on the military side, coming up in almost any meeting on new military technologies or how to use them. Even Admiral Michael Mullen, the chairman of the Joint Chiefs of Staff (that is, the man in charge of the entire U.S. military), proudly described how the navy’s “Professional Reading” program, which he helped develop to guide his sailors, includes the science fiction novels Starship Troopers and Ender’s Game.
WHAT IS SCIENCE FICTION?
Museum director Donna Shirley is perhaps the best person in the world to explain just what science fiction is. Shirley’s entry into the field came at the age of ten, when she went to her uncle’s college graduation. In the pamphlet handed out, there was a listing for graduates with “aeronautical engineering” degrees. Shirley remembers asking her mother what that meant. “My mother replied, ‘Those are the people who make airplanes.’ And so that’s what I wanted to be.”
At the age of sixteen, Donna Shirley got her pilot’s license. She then enrolled as the only woman in her classes at the University of Oklahoma, and earned that degree her mother had told her about. “Although the guys in my classes were fine with me being an engineer, my college advisor for aeronautical engineering told me girls couldn’t be engineers,” she recalls.
She soon proved him wrong. In 1966, Shirley joined NASA’s prestigious Jet Propulsion Lab as one of the space program’s very first female engineers. Over the next thirty-two years she worked on projects ranging from automating the controls of military satellites to the Mariner 10 space probe’s voyages to Venus and Mercury. She capped her career by serving as manager of the Mars Exploration Program, which included the 1997 Mars Pathfinder and Sojourner robotic rover missions. As one article describes, “Not only were these events two of the U.S. space program’s greatest successes, but they may well provide the world with some of the most important scientific data of the 20th and 21st centuries.”
Shirley credits the science fiction she read growing up as a key factor in her career. She recalls reading the stories of Robert Heinlein and Isaac Asimov at the age of eleven. “The political issues in the books went over my head,” she says, “but their heroes were always engineers and scientists.... Heinlein and Asimov also frequently had women characters as heroes, which resonated with me.” During the publicity surrounding NASA’s Mars missions, the organizers of the museum heard Shirley talking about science fiction’s influence on her work and invited her to join the team.
She sees her role, and that of the museum, as being to “educate people about science fiction and to make people realize how important it is in our culture, and by implication get them interested in science and the social aspects of science. At the same time, we can pass on some moral lessons.... In a sense, it’s to capture their imagination away from, say, Playboy and into something a bit more important.”
Shirley notes that it is not the fictional worlds science fiction authors create that constitute science fiction. Nor does science directly drive the plotlines. Rather, science fiction forces the audience to wrestle with the effect that science has on society. She explains, “The technology is not the interesting part; it is what people do with the technology.” Most science fiction deals with some sort of fallout, usually political, that comes from a new event or technology. For example, Philip K. Dick’s “The Minority Report” posits a technology that allows the police to predict a crime. The story is not about the technology, but “the political and legal ramifications of actually using such a system.” In short, science fiction is more about asking “thought-provoking” questions than merely providing “jaw-dropping” special effects.
This focus on the dilemma, rather than the technology, is what allows science fiction stories to remain relevant even when the world and its technology advance past the time of a story’s creation. Shirley points out an exhibit at the museum that shows how H. G. Wells’s 1898 novel The War of the Worlds has been continually “remade and rereleased every time there was a perceived existential threat on this world.” Prior to World War II, Orson Welles did his famous radio broadcast. The story was then made into a 1953 movie that echoed nuclear fears at the start of the cold war, and remade a third time in 2005 by Steven Spielberg, who used imagery evocative of the 9/11 attacks.
Shirley sees several trends in how science fiction is wrestling with the modern world. The first is toward more women writers, in particular the pioneering work of the recently deceased Octavia Butler, one of the first African American women science fiction writers and the only science fiction author ever to receive a MacArthur Foundation “genius” grant. “Women writers tend to write more about the social stuff and what happens to people.” There is also an evident trend toward more focus on the impact of computers and robotics. She notes the work of writers like Neal Stephenson and Bruce Sterling, who helped found the “cyberpunk” movement. This trend emphasizes not merely the coming technology, but what happens when it gets placed “in the hands of our depraved society.”
SCIENCE FICTION AND WAR
“I thought Ender’s Game might be popular when I finished writing it—high-tension story, semi-tragic outcome. I did not expect it to last as long as it has (so far) or to become as widely read by adults, teenagers, and children. Or, to put it another way, I think all my books will do wonderfully well when I’m through writing them; with Ender’s Game, I happened to be right.”
Orson Scott Card has written fifty-nine books that have sold twenty million copies in North America alone. But it is still Ender’s Game, his 1985 book, for which he is best known. The story of Ender Wiggin, a child who expertly fights war as if it were sport, won every major science fiction award, has been translated into eighteen languages, and is under development at Warner Brothers to be a major movie.
More important, the book’s depiction of a future command school and of war experienced from afar via virtual reality struck a particular chord with the military. Some two decades after its publication, it remains in various military course catalogs (such as at the Marine Corps University, where it is used as a text on the psychology of leadership), as well as on the required-reading lists that generals and admirals give officers who want to be good warriors under their command.
He may be a writer of fiction, but like many in the field, Card also consults for the military, speaking on such topics as “Next Generation WMD: Anticipating the Threat.” Card is not surprised by the response his book has gotten from military readers. “Soldiers feel like Ender’s Game is telling their story—young people doing their duty in spite of the idiocies of the officers who lead them. But I get similar responses from gifted schoolchildren and from kids who do very badly in school, each of them seizing on the heroism-in-isolation of Ender and extrapolating it to their own lives.”
Card’s work is representative of a broader trend in science fiction, its overwhelming focus on war. While science fiction is known for peering into the future and bringing to light fanciful new technologies, the vast bulk of it places these stories and technologies in one particular context of our human experience: war. Each year, approximately five major science fiction movies that link to war are released. Fifteen of the ongoing science fiction TV series have a conflict or military element. The thirty-five science fiction magazines in the field each carry multiple stories set in war. And if you attend any of the fifty-two major science fiction conventions, your costume is most likely to pack a phaser, lightsaber, or blaster rifle. What some call “military science fiction” is by far the most popular part of the genre.
The reason why such a huge percentage of science fiction deals with issues of war, says Card, is “because war is a human constant. War also drives technological advance. And insofar as sci-fi was and remains a male genre, war will continue to fascinate readers.” He is more curious about why other genres don’t pay as much attention to war. “The real question,” he asks, “is why war is not more important in mainstream fiction? . . . Literary fiction generally skips over two of the primary occupations of humankind: war and religion. At least science fiction and fantasy can still address those topics, along with everything else that literature can talk about.”
Other writers point out that war is so popular in the genre because it is an unparalleled platform for wrestling with deep issues. Robin Wayne Bailey, the president of the Science Fiction and Fantasy Writers of America (a trade association for sci-fi writers; like the Teamsters but with pointy ears), explains, “The conflict is obvious [in war], the opportunities for technologic exploration and idea exploration are vast, as is the case in real war as well, and war provides both a microcosm and macrocosm for exploring human nature, and most pertinently human nature under stress.” Harry Turtledove and Martin Greenberg, noted authors themselves and the editors of a master volume of the field entitled The Best Military Science Fiction of the 20th Century, agree. “Fiction is about character under stress. What we do when the heat is on reveals far more about us than how we behave in ordinary times.”
Science fiction authors have set their stories in the realm of war since the very start of the field. H. G. Wells is perhaps the best known, but others include literary titans that we don’t often associate with science fiction, such as Arthur Conan Doyle, Jack London, and even A. A. Milne. Most know Milne as the creator of the lovable bear Winnie-the-Pooh, but in 1909 he wrote a science fiction short story entitled “The Story of the Army Aeroplane.” Just six years after the Wright brothers, it predicted that man might one day use those crazy flying machines for war.
The most influential author in developing the link between science fiction and war has to be Robert Heinlein. Heinlein came from a military background, graduating from the U.S. Naval Academy in 1929 and serving until 1934, when he was discharged for health reasons (during his convalescence, he invented the water bed). When World War II started, Heinlein went back to work for the navy doing aeronautical engineering. Interestingly, he recruited two young engineers to join him at the Philadelphia Naval Yard: Isaac Asimov and L. Sprague de Camp, who would themselves become two of the biggest names in the history of science fiction.
After the war ended, Heinlein became a key figure in breaking science fiction into the mainstream, including being the first writer in the field to pen stories for the Saturday Evening Post, a leading magazine of the time. Over the course of his career, Heinlein would write thirty-two novels and fifty-nine short stories. But his two most influential works were 1961’s Stranger in a Strange Land, which foreshadowed the “free love” of the Sexual Revolution and was embraced by the hippie movement, and 1959’s Starship Troopers, which, by contrast, is on the reading lists at the major military service academies and inspired several technologies, such as robotic fighting suits.
In recognition of Heinlein’s popularity and influence, the U.S. Naval Academy even has an endowed professorship named for him, the Robert A. Heinlein Chair in Aerospace Engineering. There is also a movement to have one of the navy’s newest warships named the U.S.S. Robert Heinlein, in honor of his hundredth birthday. As the petition letter to the secretary of the navy reads, “It only seems fitting that a man who spent his life writing about the 21st Century should have a 21st Century destroyer named after him.”
THROUGH THE LOOKING GLASS: SCI-FI PREDICTIONS
Part of the popularity and influence of science fiction comes from its remarkable skill at foreshadowing the future. For a fictional genre that often takes place in settings that don’t even exist, science fiction has forecast real-world technologies, as well as resulting dilemmas, with stunning accuracy.
Perhaps the best example of how predictive science fiction can be is the work of H. G. Wells, who is known as “the Father of Science Fiction.” Wells was born in 1866, but his various stories forecast the twentieth century with incredible accuracy, predicting such things as computers, videocassette players, televisions, and even superhighways, each of which seemed unfathomable at the time. His books often had a theme of conflict running through them, and so he also predicted various military developments well before their time. For example, in 1903 he wrote about tanks, or what he called “Land Ironclads,” which inspired Winston Churchill to champion their development a decade later. Similarly, his 1933 book The Shape of Things to Come predicted a world war that would feature the aerial bombing of cities. Wells was not a fan of such technologies, seeing them as “unsporting.”
Perhaps Wells’s most important and influential prediction was in his story The World Set Free, published in 1914. He forecast a new type of weapon made of radioactive materials, which he called the “atomic bomb.” At the time, physicists thought radioactive elements like uranium only released energy via a slow decay over thousands of years. Wells described a way in which the energy might be bundled up to make an explosion powerful enough to destroy a city. Of course, at the time, most scoffed; the famed scientist Ernest Rutherford even called Wells’s idea “moonshine.” One reader who differed was Leó Szilárd, a Hungarian scientist. Szilárd later became a key part of the Manhattan Project and credited the book with giving him the idea for the nuclear “chain reaction.” Indeed, he even mailed a copy of Wells’s book to Hugo Hirst, one of the founders of General Electric, with a cover note that read, “The forecast of the writers may prove to be more accurate than the forecast of the scientists.”
Wells’s story ends with scientists trying to organize an effort against war and the use of the new bombs. The idea later inspired Szilárd, Einstein, and others to form the Pugwash nuclear disarmament movement, meaning Wells’s book, in turn, is the inspiration for the modern arms control movement (as well as the robotic “refuseniks” described in the following chapter).
Perhaps the only equal to Wells was Jules Verne, who has been called “the Man Who Invented Tomorrow.” Born in 1828, Verne wrote such books as Twenty Thousand Leagues Under the Sea well before such things as large-scale submarines existed. His greatest prediction may have been in his so-called lost novel. In 1863, Verne wrote a book entitled Paris in the 20th Century. In it, he predicted a future that would have glass skyscrapers, automobiles powered by gasoline, calculators, worldwide communications, and even electronic music. To give a sense of how impressive this was, at the time Verne was writing, the electric lightbulb hadn’t even been invented and the United States was locked in a civil war over whether human beings could be owned as slaves. What is more, Verne predicted that none of these fantastic improvements would make people happy. Instead, he foresaw that the technological advances would feed a crass commercialism that overwhelmed worthwhile arts and culture. His publisher didn’t like this dark yet ultimately accurate vision of the future and rejected the book, releasing Journey to the Center of the Earth instead; the manuscript stayed locked away in a safe until 1994.
Science fiction continued to tap into the future throughout the twentieth century, as the field extended into film and TV. The only difference was that, with the speeded-up time frames, the imagined technology came to fruition much quicker. For example, Stanley Kubrick’s 1971 film A Clockwork Orange predicted futuristic small music devices (what we would now call MP3 players or iPods), while in the 1976 movie The Man Who Fell to Earth, David Bowie plays a futuristic alien who develops an equally futuristic technology, what we now know as digital cameras. Indeed, even The Jetsons proved prophetic. George Jetson spent most of his day at work at Spacely Sprockets pushing computer buttons as a “Digital Index Operator.” Spending your day in front of a computer seemed wildly futuristic in the 1960s, but now George is just a run-of-the-mill database administrator.
By comparison, the government often has a relatively poor track record when it comes to predicting the future. For example, in 1913, the U.S. government actually prosecuted radio pioneer Lee de Forest for telling investors that his company would soon be able to transmit the human voice across the Atlantic Ocean. The idea seemed so absurd to the government that de Forest was assumed to be a swindler. Indeed, Philip Tetlock, in his award-winning study Expert Political Judgment, found that the professional “experts” who advise government are actually more often wrong in their predictions than right. Industry has an equally mixed track record. For example, IBM president Thomas Watson famously said in 1943, “I think there is a world market for maybe five computers.”
When it comes to war, the same pattern holds. As a 2006 article in Armed Forces Journal, one of the leading magazines for U.S. military officers, notes, “We don’t do well, historically, in predicting the location and nature of the next war.” For example, Sir Arthur Conan Doyle, the creator of Sherlock Holmes, wrote a short story in 1914, just before World War I started. Entitled “Danger,” it warned that the new invention of submarines might be used to sink merchant ships. The Royal Navy’s Admiralty actually went public to refute and mock Conan Doyle, saying that “no nation would permit it and the officer who did it would be shot.” Just seven months later, the passenger ship Lusitania was torpedoed by a German U-boat, inaugurating the era of submarine warfare. Part of the reason for this pattern is that while science fiction looks forward, the military typically plans what the next war will look like by looking back at how it fought the last one. In discussing how the American army that invaded Iraq in 2003 planned for it to be a repeat of the 1991 Gulf War, Armed Forces Journal concludes, “Our advances in technical intelligence have not improved our ability to predict any specific war.”
There are a couple of explanations for why science fiction tends to do well in prediction, even though it works in the world of fiction. First, many science fiction writers are scientists themselves, so they are typically well equipped to stay within the rules of science while extrapolating forward. Arthur C. Clarke, for instance, not only imagined a world of intelligent computers, but is also the man who first proposed the real-world communications satellite. As Donna Shirley explains, science fiction authors tend to get their predictions right because they are most often writing about what they know best. “Modern science fiction is increasingly being written by computer geeks, who are already experts on the technology side.”
Second, these writers must create a narrative at some point (that is, the plot, which usually involves a battle of good versus evil, hence the frequent setting of war), but along the way they must solve the same technical problems that real scientists do. They do not, however, face the constraints of a budget, lab time, or bureaucratic politics. The freedom of the fictional world allows them to work out solutions sometimes more easily than in the real world. As one computer scientist noted, “Science fiction is not making predictions, but playing with possibilities.”
Finally, dealing with the “what if?” is what sets science fiction apart from regular fiction, as well as real-world science. Shirley explains, “The best science fiction deals more with the social consequences of technology change than the technology itself.” It is the combination of scientific awareness with human imagination that allows science fiction to better deal with technology put in a complex social setting. Science fiction author Robin Wayne Bailey sums it up this way: “Science fiction at its best is about ideas. Maybe it’s criticized for often having wooden characters or unrealistic settings, but the ideas always come first.... Science fiction throws out ideas like some people scatter seeds. Most do not take root, but some do. And when they do, it is fabulous.”
It is important to note that in this seed-scattering of ideas, science fiction is not always perfect. As Donna Shirley notes, “Science fiction did not predict computers very well, at least until HAL.... The same for Martians. Mariner 4 [the planetary probe] killed all the Martians in 1965.”
Where science fiction tends to go most wrong in its predictions is not in the technology but in the timelines. Ray Kurzweil, who makes a living out of timing technology predictions, explains, “Science fiction is unreliable because [there is] no requirement that the time frame be realistic. Arthur C. Clarke chose the year 2001 as a literary device, not because that’s when he was certain AI would come to fruition.” Shirley agrees in a way. “The technology is changing so rapidly, they [science fiction writers] are increasingly having trouble keeping up.”
Orson Scott Card thinks that holding science fiction to any standard for its prediction is beside the point. “Predicting is a trivial aspect of writing science fiction. We are extrapolating what would happen if a particular configuration of future possibilities became real. The result is that we plunge readers into an environment in which they must rebuild their conception of reality. So we aren’t predicting the future, we’re helping readers rehearse for the future, whatever it might bring.” He continues, “The job of the sci-fi writer is to envision all possibilities and bring them to life in the readers’ imagination. What impact that will have is always debatable—less and less, these days, I believe. When things go horribly wrong, it’s small satisfaction to say ‘I told you so.’ ”
TURNING DREAMS INTO REALITY
“There is a back and forth between dreams and reality. Science fiction offers the dreams, the engineers make it the reality, and the readers are the ones who pilot the technology in planes, cars, rockets, whatever.”
Greg Bear is the author of more than thirty books and has won two Hugos and five Nebulas (the science fiction versions of Pulitzer Prizes). His most recent novel is Quantico, a thriller set in the “second decade of the War on Terror” about young FBI agents taking on a brilliant homegrown terrorist. The book flap captures it best: “It’s the near future—sooner than you might hope.”
Bear is especially well equipped to reflect on how science fiction doesn’t just predict but also inspires real-world changes, as his name is frequently mentioned in the military research community. For example, an air force lieutenant colonel commented that he had even footnoted Bear’s work in a project proposal. By way of explanation he asks, “I mean, how many science fiction books have appendices and glossaries?”
Growing up “a Navy brat” who moved with his father among bases in California, Japan, and the Philippines, Bear recalls that “in my living memory I don’t know a time when science fiction wasn’t in my life.” He started writing at eleven years old and sold his first story at age fifteen. The next year he met his hero, science fiction writer Ray Bradbury, and his career as a writer was decided.
Since that time, Bear has been called the “best working writer of hard science fiction” by The Ultimate Encyclopedia of Science Fiction. His impact, though, is decidedly beyond the world of fiction. Bear has served on various political and scientific action committees and advises the U.S. Army, the CIA, Sandia National Laboratories, and Microsoft Corporation. Indeed, when we spoke, Bear was just back from headlining a government conference on biotechnology threats, inspired in part by his book Darwin’s Radio.
Bear is also one of the core members of SIGMA, a “think tank of patriotic science fiction writers.” SIGMA was started by Arlan Andrews, a writer who also worked at the White House Science Office. “If you don’t read science fiction, you’re not qualified to talk about the future,” he said. Since the 9/11 attacks, SIGMA has worked closely with the Department of Homeland Security, and in particular influenced it to set up the Homeland Security Advanced Research Projects Agency, or HSARPA. A parallel to the Defense Department’s DARPA, HSARPA spends about $7 million a year (1 percent of the agency’s budget) on futuristic “high impact” projects. At a 2007 government conference where authors like Bear spoke, a government official defended the science fiction link to policy. “Congress asks me how can I afford to roll the dice with 1 percent of the taxpayers’ money,” says Jay Cohen, head of Homeland Security’s Science and Technology Directorate. “I say there are bad people in the caves of Tora Bora who are rolling the dice with 100 percent of their money.”
Bear sees the influence of his work and his access to policymakers as coming in part from the focus on conflict. Referring to the many military readers of his work, he says, “If you lead the life, you tend to choose to read fiction about it.” He also sees science fiction’s influence spreading via its crossover into popular technologic thriller authors like Tom Clancy and Dan Brown, who are especially popular among military readers.
This fandom extends to the top. “There is a pretty striking amount of government officials that read science fiction,” Bear says. “Harry Truman loved science fiction. He was an ‘other planets’ type of guy....Reagan liked the older writers like Jules Verne and Edgar Rice Burroughs. Reagan even gave [promotional] quotes to writers and was not averse to receiving papers from them, when he was president.” He goes on to note that, as someone who leans left in his politics, he’s somewhat disappointed that recent Democrats tend to be less likely than the Republicans to be science fiction fans. “They seem to be more like FDR and get into the legal thrillers and mysteries.”
DIRECT INSPIRATION, OR “HOW WILLIAM SHATNER CHANGED THE WORLD”
Science fiction may be incredibly popular, but raw fandom doesn’t necessarily translate into influence. If that were the case, as I write this, Hannah Montana would be the most powerful person on the planet. Rather, science fiction’s influence on real-world science and even war comes through a variety of pathways. The simplest is the direct route: giving scientists ideas of what to invent. And nothing proves this better than Star Trek. Or, as William Shatner (the actor who portrayed Captain Kirk on the original series) claims, “All this wiz-bangering didn’t happen by accident. I made it happen. Or rather, Star Trek did.”
While the original series only lasted three years (1966-69) before it was canceled by NBC due to low ratings, Star Trek has since boldly gone where no work of fiction has gone before. It spun off five other TV shows, ten movies (an eleventh is in the works), an entire library of books (Amazon.com lists 4,276 Star Trek books), and a city’s worth of exhibits, rides, and museums. The mecca of all this is “Star Trek: The Experience,” an interactive museum at the Las Vegas Hilton hotel and casino. Where Elvis used to do his famous “Viva Las Vegas” show, today you can drink a “Commander Riker-Rita” at Quark’s Bar or renew your Vulcan wedding vows. All told, the Washington Post estimates that the worldwide fan base is 250 million Trekkies strong.
The original show came out of the vision of Gene Roddenberry, a World War II bomber pilot turned Hollywood producer. While he wanted technology that “looked futuristic,” the reality often had a different point of origin. For example, many recall the famous “transporter” that each episode beamed Kirk, Spock, and an anonymous, certain-to-die, red-shirted crewman down to the planet’s surface. Screenwriters now reveal that the transporter actually came about because the prop company didn’t deliver a mockup of a shuttlecraft in time.
These ideas, however, certainly made an impression on a generation of kids turned scientists, who became determined to give the world technology just like they’d seen their heroes use in their favorite show. Martin Cooper, the inventor of the cell phone, recalls that his “eureka” moment of inspiration came when watching a Star Trek episode in his lab. “There’s Captain Kirk, talking on a communicator, without dialing! I think ‘This thing is genius.’ . . . The Star Trek communicator to us wasn’t a fantasy. It was an objective.” Similarly, John Adler of Stanford Medical School observes that Dr. McCoy’s sick bay “revolutionized the way we think about patient care.” Inspired by Bones’s medical tricorder (actually just a tricked-out salt shaker), Adler transformed the medical field by inventing the CyberKnife, which performs surgery by directing beams of radiation into cancer tumors. Rob Haitani, who was equally inspired by the tricorder when designing the PalmPilot PDA, explains that this degree of influence is to be expected, given the popularity of the show among scientists. “In Silicon Valley, everyone’s a Star Trek fan. It’s like football in Green Bay.”
The franchise and its influence were reborn in the 1980s with Star Trek: The Next Generation. The successor series differed in often focusing on the darker side of technology (such as its introduction of the Borg, a new adversary species whose robotic technology had eradicated all empathy), but it too had a major influence on scientists. For example, Steve Perlman recalls how his moment of inspiration came when watching an episode in which the android Data relaxes by listening to several symphonies stored on his computer. Perlman went on to invent QuickTime, a software program that stores and plays electronic audio and video files. This, in turn, helped make possible iPods and other portable digital music players. Today, Perlman is working to make a virtual reality playroom, modeled after the Enterprise’s holodeck.
The inspirational role of science fiction extends beyond the world of Trekkies, and is especially pronounced in military technology. An illustration comes from an anthology of short stories entitled The Best Military Science Fiction of the 20th Century. The volume is a collection of the most popular science fiction short stories, written from 1900 to 2000, set in war. What is noteworthy is that thirty-four technologies dreamed up in the last century are now under development by the U.S. military in this century. These range from exoskeleton suits that soldiers might wear to an automated defense system for tanks, now called by the Pentagon “Active Protection Systems.”
Those working in the military weapons development field are often surprisingly open about where they get their ideas. Colonel James Lasswell is a retired infantry officer now at the Marine Corps Warfighting Lab. He says, “The fact that it exists in our own movies proves that it is potentially possible.... If you can imagine it, we think it can happen.” For instance, when pondering how to aid marines in the battle against IEDs in Iraq, his team sent a request to DARPA to start working on what he called “Jedi Broomsticks,” that is, the hovering speeder bikes that appeared in Star Wars: Return of the Jedi. “We wanted ground mobility, but not on the ground.” The jet bikes are not yet deployed, but another science fiction idea come true is a miniature communications device that a marine can wear on his wrist to watch video footage shot by a UAV overhead. “We got the idea from Dick Tracy,” Lasswell says with a chuckle.
As Andrew Bennett, who leads the design team at iRobot, says, “We were all influenced by science fiction. You are always looking for ideas and science fiction is one of many sources.” His colleague Bryan Yamaguchi laments, “But now we are finding that our stuff is getting more advanced than science fiction.”
FUNDING SCI-FI
The researchers are not the only ones who grow up on this diet of science fiction. So too do the funders who decide which weapons programs to pay for. As former Speaker of the House Newt Gingrich (who actually visited Isaac Asimov’s apartment when he was in Congress) explains, “People like Isaac Asimov, Arthur C. Clarke, and Carl Sagan did an amazing amount to convince humans that science and technology were important.”
Perhaps the best illustration of this is at the Air Force Research Lab’s Directed Energy Directorate at Kirtland Air Force Base in New Mexico. In 2005, it rolled out a new prototype weapon with the mundane title of “Personnel Halting and Stimulation Response,” or PHaSR. People in the military tend to speak out an acronym as if it were one word, rather than reading the letters. So the whole convoluted name was just a way to call the Pentagon’s new weapon a “phaser,” the little ray gun from Star Trek that Kirk always “set to stun” before he beamed off to explore new worlds and romance buxom alien women. The PHaSR system is essentially a laser rifle whose beam can stun a target more than two hundred yards away, a nonlethal weapon perfect for mounting on a robot. When asked why they chose that name, program manager Captain Thomas Wegner proudly answers, “We picked the PHaSR name to help sell the program. It’s an obvious homage to Star Trek.”
It is often difficult to figure out just what the future will look like, but science fiction creates both an expectation and early acceptance of technologies that are not yet fully developed. As Bill Gates explains, Star Trek paved the way for his job at selling small, easy-to-use computers to the public. “It told the world that one day computers would be everywhere.” He sees the same happening with robots from movies like Star Wars and I, Robot. “The popularity of robots in fiction indicates that people are receptive to the idea that these machines will one day walk among us as helpers and even as companions.”
Military robot developers see the same trend when selling to the Pentagon. One explains, “It’s a way to make possibilities seem real, but also inevitable.” Sometimes, though, the popularity of science fiction among military funders can actually make it harder on researchers. “Naval customers just assume it will happen,” explains Thomas McKenna at ONR. Likewise, the military funders tend to want the cooler technologies, while the mundane are less likely to get funded. One U.S. Army researcher working on nonlethal weapons systems complains, “You have to beg for money for things like beanbags or acoustics. But say it’s for a laser or a lightsaber and the money is no problem.”
THE LENS OF THE LOOKING GLASS
“Any sufficiently advanced technology is indistinguishable from magic,” famously argued English physicist and science fiction author Arthur C. Clarke. Indeed, when the warriors of the Hehe tribe in Tanzania surrounded a single German colonist in 1891, they seemingly had little to fear. But he had a magic box that killed almost a thousand spear-armed warriors by spitting out death faster than they ever imagined possible: the machine gun.
New technologies often can seem not merely incomprehensible, but unimaginable. Science fiction, though, allows us to jump that divide. It helps to take the shock out of what analysts call “Future Shock.” By allowing us to imagine the unimaginable, it helps prepare us for the future, including even in war.
This preparation extends beyond future expectations; science fiction creates a frame of reference that shapes our hopes and fears about the future, as well as how we reflect on the ethics of some new technology. One set of human rights experts I queried on the laws of unmanned warfare referenced Blade Runner, The Terminator, and Robocop with the same weight as they did the Geneva Conventions. At another human rights organization, two leaders even got into a debate over whether the combat scenes in Star Trek were realistic; their idea was that this could help determine whether the fictional codes of the Federation could be used as real-world guides for today’s tough ethical choices in war.
By far the most influential writer when it comes to the right and wrong of robots is Isaac Asimov. Every single roboticist knows Asimov’s “Three Laws of Robotics” by heart, and they have become the reference point for ethical discussions about robots. Yet they are fiction, never intended for the real world. Instead, each of the stories in I, Robot uses the laws as a jumping-off point to look at the problems that occur when robots try to follow the laws in the complexity of the real world.
Many of our expectations and ethical assumptions around real-world robots come from science fiction. The irony is that the same stories that inspire and fund the research can also create assumptions that are often incredibly frustrating to real-world researchers. As one scientist discussed, “There seems a strong tendency over the decades to view robots as something evil, like technology run amok.” These fears date back to the slaves of Karel Čapek’s 1921 play R.U.R. and the mechanical minx Maria, an evil robot in Fritz Lang’s 1927 film Metropolis (her ultimate evil was illustrated by the fact that she liked to both oppress the urban poor and dance exotically). They continue today in such movie franchises as The Terminator and The Matrix. Bart Everett of the navy lab describes it as a “paranoia” that “stems from the fact that doomsday scenarios make for better movies. As a result, there is often confusion with regard to what the technology actually can and cannot do today, as well as where it’s headed in the future.”
Regardless, the reality is that science fiction always lies at the forefront of debates over such key questions as whether robots should be armed or how much autonomy they should be given. And yet this doesn’t drive the field toward any one conclusion. The galaxy of stories that science fiction writers have created is simply too diverse.
Indeed, just as there is not one single world of regular fiction, there is no one culture of science fiction. The field itself can be different across time and space and thus have changing influences on how we frame the world of science. For example, if Star Trek was dominant in the 1960s, Harry Potter is the power series of today. To put it another way, kids today are infinitely more likely to know what a Chizpurfle is than a Tribble (for the uninitiated, a tiny, mitelike creature that feeds on both magic and electricity versus pink furballs that reproduce at exponential rates). Even though J. K. Rowling created a world more of fantasy than science fiction, its influences are already being felt within the real world of war and weapons development. Researchers in both Britain and the United States (with DARPA funds) are now hard at work on an invisibility cloak that works just like the one young wizarding student Harry inherited from his father. The real-world one is set to be made of novel “metamaterials,” which can be tuned to bend radio waves and light, so that the cloak would neither reflect light nor cast a shadow (a true science fiction comparison would be the chameleon camouflage used by the alien in Predator). John Pendry, a physicist at Imperial College London, notes that the Harry Potter link may not be an exact one: “To be realistic, it’s going to be fairly thick. Cloak is a misnomer. ‘Shield’ might be more appropriate.”
Older scientists also note that, as Rodney Brooks of iRobot puts it, “There is becoming a generational difference in where the science fiction influence comes from.” As he explains, his major influences were science fiction books. For his colleague Helen Greiner, it was movies. Today, for his students at MIT, it is video games. “And I have no idea what will be the different impact of these.” One may be that the “new media” allow better special effects, but demand far less introspection. As soldiers grow up more familiar with first-person-shooter video games like Doom or Halo and less with the moral questioning of books like I, Robot or the “Prime Directive” dilemmas of Trek, we may find that the medium matters greatly.
Culture also appears to play a role. Just as the French love their Jerry Lewis and the English their Benny Hill, the popularity and influence of certain science fictions are linked to national tastes. For example, Dr. Who is perhaps the most popular sci-fi series in the United Kingdom, running for over a quarter century on the BBC (1963-1989) and spinning out two movies. In the United States, however, Who remains mainly a cult thing. Part of the explanation may lie in the fact that the main hero is basically an oddball, happy-go-lucky sort of guy, who stumbles into trouble while flying about the universe in a spaceship that looks like a police call box. We Americans like our science fiction heroes to be a bit stronger, cooler, and more dangerous; Dr. Who is no Han Solo.
If the British and the Americans differ along these lines, science fiction truly leaps in culture between East and West, especially when it comes to perceptions of robots. While the robot is consistently something suspicious in Western science fiction, it is the exact opposite in Asian science fiction. Indeed, the very first popular robot in Japanese science fiction was the post-World War II “Mighty Atom,” also known as Astro-Boy. A robot that keeps the peace among humankind, he was also a response to the man-made horrors of Hiroshima and Nagasaki.
To this day in most Asian science fiction, especially in the anime genre, the robot is usually the hero who battles evil. This has heavily influenced both Japanese scientists and that nation’s culture. “The machine is a friend of humans in Japan. A robot is a friend, basically,” says Shuji Hashimoto, a robotics professor at Waseda University in Tokyo. “So, it is easy to use machines in this country.”
Japan’s traditional religion of Shintoism holds that both animate and inanimate objects, from rocks to trees to robots, have a spirit or soul just like a person. Thus, to endow a robot with a soul is not an illogical leap in either fiction or reality. Indeed, in many Japanese factories, robots are given Shinto rites and treated like members of the staff. Masahiro Mori, a professor at the Tokyo Institute of Technology, explains that Buddhism also makes for a more soulful approach to what a westerner would see as just a tool or maybe a mechanical servant. Mori, who wrote a book called The Buddha in the Robot, argues that robots can have a Buddha-like nature and that humans should relate to them as they would a person. “If you make something, your heart will go into the thing you are making. So, a robot is an external self. If a robot is an external self, a robot is your child.”
In Asia, “companion” robots for the elderly are becoming quite common. One woman who found out she was dying of heart disease even included her Wakamaru robot in her will. By contrast, Rodney Brooks of iRobot says that the mass-marketing of robots as friends for elderly shut-ins has yet to be tried in the United States because most Americans find such a concept “too artificial and icky.” Sebastian Thrun, the robot car racer from Stanford, describes how the differing science fictions create a “willingness [in Asia] to go into new technologies and gadgets that is higher there than anywhere in the world.” As a result, his lab has more collaboration with Asian companies than American ones.
The same differing attitudes and influences affect what different cultures think is acceptable in war. The question of arming unmanned systems and giving them the ability to shoot at humans is perhaps the most hot-button issue within the U.S. robotics community. It is far less controversial in Asia. Indeed, South Korea sent two robot snipers with rifles to Iraq in 2004 with essentially no debate; they were reported in the media to have “nearly 100%” accuracy.
Even more notable is the Autonomous Sentry Gun, made by Samsung. The company, better known for making high-definition TVs, has integrated a machine gun with two cameras (infrared and zooming) and pattern-recognition software. The gun system can not only identify, classify, and destroy moving targets from over a mile away, but, as Louis Ramirez of Gizmodo relates, it “also has a speaker that beckons the fool that walks near it to surrender before being pulverized.” South Korea plans to use the robo-machine guns to stand guard along the 155-mile demilitarized zone (DMZ) that borders North Korea.
The attitudinal differences become even more evident when you watch the promotional video put out by the Korean company for its new toy. The footage shows the machine gun automatically tracking a human test subject, who unsuccessfully tries to dodge the robotic gun by running back and forth and hiding behind bushes. Where a westerner weaned on a diet of Terminator movies can’t help but find this disturbing, the vibe of the Korean commercial is a bit more celebratory. The footage of a real-world automated machine gun tracking humans is paired with the rousing theme song of the Disney movie Pirates of the Caribbean.
THE FEEDBACK LOOP
“There’s definitely a feedback between the sciences and science fiction,” says James Cameron, creator of The Terminator, as well as a board member of the Science Fiction Museum and Hall of Fame. “It flows both directions.... Not only does science fiction inspire people to become scientists and want to ask questions about the real nature of existence and matter and reality, but what they’re finding then feeds back into the science fiction community, and gets embraced by that, and spins out a whole new generation of science fiction.”
Real scientists, soldiers, and policymakers may be influenced by science fiction, but change is coming so quickly that the creators of these imaginary worlds are increasingly borrowing from the real one. As Greg Bear notes, “I actually worry that science fiction isn’t keeping up.” Author and science fiction writers’ union head Robin Wayne Bailey concurs: “The military is doing a fine job with robotics. The toys they have could be placed within any science fiction story.... But to see what they have on the drawing board is mind-boggling to even science fiction writers.”
However, as science fiction experts look at some of what the military is doing today, many of them get frustrated. Donna Shirley may be the director of a science fiction museum in Seattle, but when the topic turns to the future of war, she is as smart as any political analyst inside the D.C. Beltway. “The Pentagon just doesn’t get it. This high-technology stuff just doesn’t work versus a distributed enemy like al-Qaeda.... No matter how many bunker busters you can drop from afar, if you don’t know where someone is hiding it will not matter.” And don’t even get her started on the plans for National Missile Defense. “The idea of trying to hit a bullet with a bullet is silly. It is much easier and efficient to place your interceptor system offshore and take the missile out early in the launch stage when it is slow and easy to target.... But instead we are spending billions on the harder part just because it sounds really cool.”
And yet the field still has a stigma that keeps its experts hidden away, even when on Pentagon contract. For all its influence on the future of technology and even war, “it is ironic then that we are rarely invited to the table to discuss these issues openly,” laments Robin Wayne Bailey.
Perhaps we as a society ought to be paying more attention to the world of science fiction. It not only predicts and influences the future, but nothing may prepare us better to assess the consequences of a new technology than a field whose very essence is to ask questions about the moral, ethical, and societal dilemmas that new technologies might provoke. As Donna Shirley explains, “Science fiction says ‘what if?’ So, it doesn’t say how exactly you can build the bomb. Instead, it says, if you build this bomb, you are going to get Dr. Strangelove.”