[FIFTEEN]
THE PSYCHOLOGY OF WARBOTS
Warfare is about changing the enemy’s mind.
—RALPH PETERS
“Human versus robot? How will that play? . . . The psychology of all this will be important, especially on the side of the people without the high tech.”
Eliot Cohen is the director of the strategic studies program at Johns Hopkins University. If there is a Washington “defense establishment,” Cohen is one of the key opinion leaders within it, especially among its right wing. Described by one media report as “the most influential neocon in academe,” Cohen gained much media attention right before the 2003 Iraq invasion, when President Bush showed up at a public event holding Cohen’s book Supreme Command. No one is sure if Bush actually read the weighty tome, but the choice was symbolic, as Cohen argued in the book for civilian leaders to exert their influence over military matters. Soon after our late-2006 interview in his professor’s office near Dupont Circle, Cohen was named counselor to Secretary of State Condoleezza Rice, to serve as her one-man think tank and intellectual sounding board.
Looking the part of a defense intellectual straight out of Hollywood casting, even down to sporting a mean red bow tie, Cohen believes that human psychology will be a key determinant of robots’ impact on war. Having also written the book Military Misfortunes (a study of miscalculation and defeat in war, which maybe Bush should have also read), Cohen strongly believes that human motivation has usually been the key to victory or defeat. Whether it was Napoleon’s army at Waterloo, the kaiser’s army at the end of World War I, or Saddam’s forces in 1991 and again in 2003, the side that loses a war usually does so because its military hits a psychological breaking point, a time “at which a majority or a disabling minority [of soldiers] refused to go on.”
Cohen says that we don’t yet have a full understanding of how people will be psychologically affected by robotic fighting systems, but he thinks there may be lessons from the past. “Its closest parallel may be the effect of strategic bombing. The foe just gets defiant, but also depressed over time.” But, says Cohen, there will be a new twist. Unlike the intermittent raids of bombers over Tokyo or Berlin during World War II, the fact that the systems are unmanned, as well as able to operate for days or weeks on end, will give them a psychological punch as never before. “They [the side facing robots] will feel like they are always being watched, that they face a nonhuman foe that is relentless.”
Overall, concludes Cohen, the trend should be of great benefit to America, especially against the terrorists and insurgents it faces in what he describes as the current “Fourth World War” (he counts the cold war as the third great global conflict). “It plays to our strength. The thing that scares people is our technology.”
Cohen is by no means alone among the political establishment in his belief in the psychological power of unmanned systems. The Washington Times, for example, reported that a great benefit of robotic systems is that “unmanned weapons tend to demoralize an enemy.” It explained that “while soldiers will fight against their enemy if they have a chance to kill the attacker even against all odds, being killed by a remote-controlled machine is dispiriting.”
This same conviction extends beyond the D.C. Beltway. For example, Ed Godere, part of the Foster-Miller team behind the SWORDS, believes that “the psychological effects will be significant.” He predicts that it will cause “an almost helpless feeling” among anyone unlucky enough to see a robotic machine gun coming at them. Many troops in the field agree. Army staff sergeant Scott Smith says that “without even having to fire the weapons . . . it’s total shock and awe.”
FIRST CONTACT
In 1532, Atahuallpa was emperor of the Tawantinsuyu, better known to us as the Incan empire. Located in what is now Peru, Atahuallpa’s domain was the largest and richest of the empires in lands not yet reached by European explorers. Life was just getting good for Atahuallpa. He had beaten his brother in a civil war for the throne and was on his way back to his capital. Only a quick detour was needed to check out a tiny band of strange visitors who had just entered his lands. A proud and cruel king (he had just forced his defeated brother to watch his children be hacked to death), at the head of a battle-hardened army of eighty thousand warriors, Atahuallpa believed he had little to fear.
Atahuallpa and his army soon reached the encampment of the visitors, who invited the emperor to a peace ceremony. Carried in on a litter borne by the highest nobles of his court, and accompanied by a personal guard of four thousand men, Atahuallpa entered the small courtyard where the visitors were camped. A delegation greeted him. One of the visitors, a man wearing brown robes, offered him a gift and told him, through a translator, that this “book” supposedly carried the word of God.
Never having seen such a thing before, and believing himself to be a representative of the gods, the emperor shook the gift and, when no noise came out, indifferently tossed it to the ground. He asked, “Why doesn’t it speak to me?” The man in brown cried out in anger when the packet of papers hit the dirt and gave a signal of some sort. Soon after, a massive explosion filled the air and dozens of men ran in from the buildings that surrounded the courtyard. They wore seemingly invincible suits of metal that turned back the points of arrows and spears. They wielded strangely sharp, unbreakable metallic weapons that cut through flesh with ease, and, even more frightening, pointed sticks that spat lethal flames. Most terrifying, though, were the strange creatures that also charged out, which had four legs like a beast, but the upper body of a human warrior.
There were only 168 of these new visitors, but as they charged at the emperor and his four thousand men, the effect was paralyzing. Atahuallpa’s guard was quickly chased away or slaughtered. The highest nobility of his kingdom were killed at his feet. When none were left to hold up his litter, the emperor was captured.
Seventy-six thousand of Atahuallpa’s warriors were waiting in the fields just outside the town, milling about and wondering what to do, when they heard the strange noises and then saw their noblemen running for their lives. Twenty-seven of the man-beasts then emerged from the square and put the entire army to flight. It wasn’t so much a battle as a massacre, ending only when the visitors’ arms grew too weary to keep killing the fleeing Incan warriors. The captured emperor then offered the visitors a ransom to set him free, enough gold to fill a room twenty-two feet long, seventeen feet wide, and eight feet high. The visitors agreed. But after these strange, fearsome men had their gold, they reneged. They executed Atahuallpa and took over his empire.
As the science fiction writer Arthur C. Clarke of 2001: A Space Odyssey fame once observed, “Any sufficiently advanced technology is indistinguishable from magic.” Nowhere is this more true than in war. Time and again, warring sides have used new technologies not only to kill more efficiently than their foe, but also to dazzle them into submission. The case of Atahuallpa, unlucky enough to become emperor just before the arrival of Francisco Pizarro and his tiny band of Spanish conquistadors, is a powerful example of just how shocking and powerful new weapons of war can be.
Cannon, armor, swords, muskets, and horses were particularly devastating to the Inca, who lived in a time when communication was difficult and information was hard to come by. Not only was this their first contact with such weapons; they had never even conceived of the possibility of such fearsome technologies. Yet even in our information-saturated world, the use of new weapons technologies can still have a powerful psychological effect. For example, a colonel in Iraq’s elite Republican Guard explains that the reason he felt his forces gave up so quickly during the 2003 invasion was that “U.S. military technology is beyond belief.” He described how American air power, able to strike with constant, pinpoint precision, whether by day or night, took his unit by surprise, made any sort of organized resistance feel impossible, and ultimately collapsed its spirit.
As Atahuallpa could have foretold, the new generation of unmanned systems has already had such a psychological effect on the minds of adversaries, specifically in sowing alarm and confusion. Marines in 2004, for instance, described how insurgents feared what they believed to be an all-seeing eye in the sky. They should fear it, explained a UAV operator, as he watched a suspected insurgent pickup truck race under a carport at a safe house. “With all the dust they kick up, how could we miss them?”
Troops are also finding that the psychological punch of encountering a strange new, unmanned weapon can be exploited tactically. War-game testing has found that foes tend to fixate on “such an unusual technology” as the SWORDS; it becomes such a center of attention that it can be used to distract. One team was facing a group of hostage takers holed up in a building, so it sent a SWORDS driving up to the front. While the hostage takers gathered on one side of the building to watch the odd little lawn mower with a machine gun track forward, a special forces team went around the back of the building and ambushed them from behind.
Another strange psychological lesson came from a real-world hostage crisis in Milford, Connecticut. A gunman wouldn’t let police come anywhere near him, because he thought they might try to surprise and overpower him. But he was willing to let the police send a robot to carry in a phone. As the hostage crisis dragged on, the police called him and offered to send him and the hostages some drinks. The gunman accepted, but again wouldn’t let any human come near, as it might be a trick. Thinking robots more trustworthy than the fuzz, he let the robot bring in the drinks. Of course, robots can have tricks up their sleeves as well. The coffee it delivered had been laced with knockout drops, and as the gunman fell asleep, the crisis ended with no one getting hurt.
The obvious problem is that what is “unusual” wears off, and such tricks only work so many times. While it was too late for Atahuallpa, the Incas did grow accustomed to the Spanish weapons. Just three years later, the dead emperor’s generals launched a surprise uprising that evolved into an insurgency lasting years. Similarly, the Iraqis soon adjusted to the American ability to launch pinpoint airstrikes, learning the easy answer: don’t mass troops in open terrain. Word travels fast, people adjust, and the psychological power of something new and different wears off quickly.
THE “CREEP” FACTOR AND THE UNCANNY VALLEY
David Hanson, a former employee at Disney’s Imagineering Lab, makes robots that “creep people out.” Hanson’s robots look like machines from the neck down, but have incredibly realistic heads. Their lifelike “skin” is made using a material Hanson invented called Frubber. His “Hubo Einstein” robot, for example, has a mechanical body, but the head and face of Albert Einstein. One scientist described it as “spookily cool... a giant step forward.” Hanson is also “an avant-garde artist” who puts together art shows. In the long line of artists doing self-portraits, for one show he made a robot modeled on himself. Only his self-portrait was a “large homeless robot figure in a box.” His intent was to use a robot to take viewers out of their “comfort zone.”
Hanson is proud that his robots pose “an identity challenge to the human being.... If you make it perfectly realistic, you trigger this body-snatcher fear in some people,” he says. “Making realistic robots is going to polarize the market, if you will. You will have some people who love it and some people who will really be disturbed.”
Inspired by Brian Aldiss’s short story “Supertoys Last All Summer Long” (the basis for the Steven Spielberg film AI), Hanson is currently at work on robotic “supertoys.” He explains that these robots will have evolving personalities and grow up with the child. Looking a little like robotic versions of the Oompa-Loompas from Willy Wonka’s Chocolate Factory, Hanson’s robots are two feet tall and have cartoonish faces. He named the first of these new robots Zeno, the same name as his eighteen-month-old son.
Hanson sees his work as “changing the expectations of machines,” and he ultimately hopes that the “social robotics” field will become so much bigger than the military robotics industry that “market forces will shape things toward friendlier robots.” This remains to be seen. But his work clearly illustrates how robots can be designed to influence the “attitudes, feelings, emotions, and ultimately the behavior” of those who see them. This quote tellingly comes not from Hanson or a science journal. It is the Pentagon’s definition of psychological operations.
History is filled with all sorts of ways that weapons and uniforms can be designed to create some sort of psychological reaction among the foe. For example, the famed British Redcoats of the Revolutionary War era wore that color so that blood wouldn’t show up on their uniforms from a distance. Among their units were grenadiers, especially tall soldiers who wore huge peaked hats to make them look even taller. The effect of seeing them on the battlefield was akin to watching a line of giants marching toward you, giants your bullets never seemed to hit.
The difference between the Redcoats’ psychological effect and those of the conquistadors is the difference between fright and fear. As Sigmund Freud explained, fright is the state one falls into “when confronted by a situation [for] which we are unprepared,” akin to what the Incas felt when seeing guns for the first time. Fright, though, can wear off quickly as one grows accustomed. Fear, by contrast, comes from “a definite object of which one is afraid.” It is something you can see and even understand, but it still evokes a state of terror that causes dread or panic. The patriots knew the Redcoats were men, but it didn’t make them any less fearsome.
Current robots on the battlefield tend to have a totally utilitarian look, but even that carries some psychological punch. Foster-Miller’s SWORDS robot, for example, was designed by simply mounting a machine gun on top of an older robot’s chassis. Even so, as one magazine quipped, the SWORDS “makes Robocop look like Officer Friendly.”
Strategic thinker Eliot Cohen thinks such an unintentional effect is all well and good, but something more may have to be done. “We will have to figure out how to maximize the psychological impact of it [a robot]. We will have to think not merely in terms of costs and benefits and how to get steel on target, but much more. How it gets that angry insurgent from being eager to fight to thinking that there is no point in it, there is no chance to win against a relentless foe.”
Not all robots are going to look like Disney-Pixar’s cute and cuddly WALL-E (though one of the British army’s robots is a dead ringer). The first and easiest step in “fearing up” a robot is to equip it with effectors that can play a role in psyching out the enemy. If history is any guide, we can anticipate that this won’t just be about giving them a scary look, as with the Redcoats, but also a scary sound. The ancient Chinese set off fireworks to spook enemies’ horses, while the Nazis mounted sirens on the wings of their Stuka dive bombers during World War II; often the high-pitched noise of the diving plane created even more chaos among the troops on the ground than the bomb itself.
The sound that the U.S. military will use to put chills down enemy soldiers’ spines most likely will come from the real experts, Hollywood. The military has long used Hollywood special effects for psychological operations. During the 2004 battle of Fallujah, for example, the marines set up loudspeakers around the city and broadcast the sinister laughter of the alien from the Predator movie. They were hoping to spook the insurgents, as well as drown out the sermons that the insurgents broadcast back at them from each of the city’s mosque towers. The noise was so constant that a marine joked that the siege should be called “Lala-Fallujah” (after the famous alternative rock festival Lollapalooza).
Given that the marines’ new ground systems like the Gladiator come with their own loudspeakers, there’s nothing to prevent them from doing the same with their robots, to create a more mobile fear factory. Of course, there are always downsides to these kinds of operations. After hearing the Predator’s evil laugh one too many times, a marine scout team on the front lines radioed back to base to tell them the noise was having more of a psychological effect on them than on the enemy. “That’s not funny anymore. You keep that shit up and we’re coming back in.”
The same kind of turn to Hollywood will likely take place with the overall design of unmanned systems. Says military robots pioneer Robert Finkelstein, if you want to truly have a psychological effect, “Make ’em look like Frankenstein’s monster. Or make them look like creatures from Star Wars. . . . Make ’em hideous.” Scientists at one military robotics firm similarly report that the military inquired whether they could make a system that looked like “the hunter-killer robot of Terminator.” Not much of a sci-fi buff, Eliot Cohen suggests that we instead turn to nature, that “we exploit the basic human fear of bugs.” Whatever the inspiration, as Finkelstein concludes, there are “infinite possibilities” for how the looks and design of systems might be manipulated to heighten an enemy’s fear. One day, specializing in scary designs “might even be a profession.”
But as David Hanson’s work illustrates, the creepiest robots of all may be the ones that look mostly human. He notes that the reactions to his robots vary. “Some people take it as a thrill, some think it is neat.... Others find it just creepy and threatening.” He explains that different parts of the brain deal with social relations versus identifying objects. So when the human brain sees “an object acting as a human, it sets off natural alarms, so to speak.”
Hanson is tapping into a phenomenon called the “uncanny valley.” Researchers are finding that the more humanlike a machine’s design, the more people seem to connect with it. As Hiroshi Ishiguro, the maker of such humanoid robots as the sexy Repliee android, explains, “The keyboard and the monitor are primitive. My brain was not designed to watch a display and my fingers were not designed to type on a keyboard. My body is best suited for communicating with other humans. The ideal medium for communicating with a computer is a humanoid robot, which is, of course, basically a computer with a humanlike interface.” But this doesn’t mean that we are entirely comfortable with ever more lifelike robots. “People’s empathy increases until a sudden point at which the machine seems like the living dead, like a frightening imposter.”
This is the “uncanny valley”: the point at which a robot’s appearance is close to human, but not close enough, and thus most disturbing. The far end of the “valley,” where the fear goes away, is when the robot becomes so human in its appearance that it’s hard to tell the difference. The “valley,” then, is the span in which robots freak you out. Explains AI expert and psychologist Robert Epstein, “If a human can’t tell it isn’t human, no worry.... Humans are also okay with it if it doesn’t look like a human at all, like Johnny 5 [the robot from the movie Short Circuit that looks a bit like a PackBot].” It’s that part in the middle of the uncanny valley that is so disturbing. “It’s like interacting with a corpse, a moving corpse. It makes you uncomfortable.”
THE OTHER SIDE
Psychologists like Epstein, however, are discovering that an encounter with a robot, whether it’s a sexy android, a machine-gun-carrying lawn mower, or even one that seems straight out of Night of the Living Dead, is not just straight “shock and awe.” “It’s not just that a certain type of machine or robot makes us uncomfortable. It very much depends on who we are.” Just as with the Incas and those early guns and armor, the effect greatly depends on one’s prior experience with similar technology. “The more familiar with technology, the shallower the uncanny valley; the less familiar, the greater the effect.” David Hanson similarly notes that among people seeing his lifelike robots, “If they are not used to robots, the negative reaction is more likely.”
Age also can be a factor. Oddly, children up to the age of roughly three years care the least about a robot’s appearance. They accept almost any bizarre look matter-of-factly, good news for the robot-nanny industry. But around the age of four years, appearance becomes highly important to a child, with a wide “valley” that doesn’t tend to go away until the teenage years are over. Ishiguro, the maker of the Repliee, first witnessed this aspect of the valley with his original version of the android, which was shaped to look like his four-year-old daughter. “When my daughter first saw her android she began to cry.”
But as an iRobot scientist puts it, “The uncanny valley is definitely cultural as well. . . . The Japanese will put up with a robot that even freaks me out, but they are totally comfortable with that.”
Messaging across cultures has always been difficult, especially in war. In World War II, General Curtis LeMay ordered American bombers to use firebombs on Japanese cities, with the intent to terrorize the Japanese public into a realization that continuing the war was futile. The raids killed hundreds of thousands, but many in Japan instead took the “message” to be that it was dangerous to surrender unconditionally to an enemy willing to drop flaming napalm on civilians living in wooden houses. The United States tried similar messaging with its bombing during Vietnam, this time influenced by mathematical models and strategic game theory. As army colonel H. R. McMaster explains, such approaches proved “fundamentally flawed.... The strategy ignored the uncertainty of war and the unpredictable psychology of an activity that involves killing, death, and destruction. Human sacrifice in war evokes strong emotions, creating a dynamic that defies systems analysis quantification.” In short, the message you think you are sending is not always the one that the other side actually receives.
This same phenomenon may be playing out with unmanned systems as well. Much in line with his sense that robotics can help the U.S. military push its enemies’ psychological buttons, Eliot Cohen describes what he believes an insurgent in Iraq thinks about such systems. “They are likely asking, ‘What tricks are the Americans going to pull out of their bag next?’ ”
The troops who currently use unmanned systems are also generally hopeful about the psychological effect their robotics might be having on the other side in Iraq. As one drone pilot explains, “I think that it will discourage them more than anything. I know that if I was out on a future battlefield risking my life, my emotions would be out of whack knowing that I could be killed and the only damage I could inflict was to a robot. For today’s battlefield, the UAV is used largely as a deterrent. AI forces [‘Anti-Iraqi,’ the official term at the time for insurgents] know we are out there. They know they are constantly being watched. The fear of being caught in the act keeps a lot of would-be insurgents out of the fight.” Concluded one air force officer, “It must be daunting to an Iraqi or to an al-Qaeda seeing all our machines. It makes me think of the human guys in the opening to the Terminator movies, hiding out in the bunkers and caves.”
The irony is, of course, that the humans in that movie were the side the audience was supposed to root for, the side that overcame its fears to beat back the machines. There is no way to formally test this proposition, but in the summer of 2006 I was connected with two Iraqi insurgents through a trusted intermediary. Both were Sunnis who opposed the U.S. presence in their homeland and had decided to join the insurgency. Notably, one was a former engineering student. Even with this background, he described the various unmanned systems his American foes were using as a bit bewildering. “I didn’t really imagine that military industry reached such levels of imagination.”
It may have been a bit of posturing, but the two also discussed how they were not all that intimidated by the technologies, as strategists like Cohen might have hoped. “It is not really a matter of how sophisticated you [sic] weapons are,” one said. Instead, they expressed confidence that they would find ways to adapt, and would soon take advantage of the technologies themselves. Sounding almost like an Iraqi version of Ray Kurzweil, the former engineer expressed his sense that this trend would likely continue, as “the modern age is also marked by increasing trends towards automation.”
What they expressed in the limited interviews I was able to carry out squared very much with the findings of experts who have far more experience among insurgents. Nir Rosen is a reporter and the author of In the Belly of the Green Bird, a study of the early days of the Iraq insurgency. Born in New York City, but having learned in his youth to speak Arabic with an Iraqi accent, Rosen was able to gain the trust of local civilians and insurgents in a way that few other journalists could. Indeed, he was the only Western journalist to spend time inside Fallujah among the insurgents before the major battles there in 2004. When we spoke in 2006, Rosen was just back from Somalia, having gained a meeting with the armed Islamist faction that had taken over Mogadishu.
Rosen described how, during his time in Fallujah, the insurgents were “definitely aware of UAVs and other American technology, but not always aware of their full capabilities. They wouldn’t understand the things they could and could not do.” They would sometimes credit the systems with things not yet technologically possible; at other times they would underestimate what had been possible for decades.
As far as the supposed psychological effect, Rosen responded that “you have to remember that insurgents only have their own weapons, whereas they are fighting a force of F-16s [fighter jets] to tanks, up-armored Humvees to platoons of troops in helmets, flak vests, knee pads, boots, etc.... For insurgents, it already feels like they are fighting robots of a sort.”
Rosen sensed that fighting more and more unmanned systems would “not be a huge quantum leap” for the insurgent psychology. “With things like F-16s, it’s not like they are fighting face-to-face now anyway.” Instead, he saw that what seemed like an overreliance on these systems was even backfiring psychologically on the Americans. “In their rhetoric, they’ll make fun of the Americans for not being man enough to fight face-to-face.” Ultimately, though, he felt the insurgents in the field would understand why the United States was using them and might even follow suit with whatever technology they could. “They will adapt very readily. . . . It’s just about achieving your ends.”
So, at least within the psychological war of ideas, unmanned systems may not convey the messages we desire. Instead, they may send rather undesirable and unintended signals about our intentions and even our character.
For instance, unmanned systems are intended to reduce casualties. But as Peter Feaver, a Duke University professor turned Bush administration National Security Council adviser, asks, “What is Osama bin Laden’s fundamental premise if not the belief that killing some Americans will drive our country to its knees?” Indeed, robots’ very rationale of limiting human risks runs counter to the local values in many of the most important theaters of the war against terrorist groups. As one marine general explained, in places like Afghanistan, especially among the Pashtu tribes in the mountainous south, “Courage is the coin of the realm.” Showing personal bravery, which you cannot do with a robot, builds trust and alliance in a way that money or power never can.
Finally, the systems are meant to limit the number of “boots on the ground.” But this, too, can send an unintended message, blunting the psychological and even tactical effects of defeat on a foe. As Bevin Alexander, the author of How Wars Are Won, explains, “Victory comes from human beings moving into enemy territory and taking charge.” Otherwise, you repeat the experience of the Sunni Triangle in Iraq. The future hotbed of rebellion wasn’t occupied until weeks after Baghdad fell in 2003, and local would-be insurgents instead got the signal that they had never been defeated.
Rami Khouri is well placed to evaluate the effect of our new technologies in the particularly important area of the Middle East. The director of the Issam Fares Institute of Public Policy and International Affairs at the American University of Beirut, Khouri is also the editor-at-large of the Beirut-based Daily Star newspaper. When we spoke in 2006, the electricity in his Beirut home was still cutting in and out, the effect of the Israeli bombardment (coordinated by a near-constant flyover of Israeli UAVs) during the war between Israel and Hezbollah.
Khouri described how it felt to be on the receiving end of unmanned targeting and an all-seeing eye in the sky. The kind of depression that Cohen had hypothesized was certainly present, as the normally ebullient Khouri fretted over whether he would have enough food for the week ahead if the electricity went out again. But so was the defiance. Khouri is a leading voice of moderation in the region and is so much of an admirer of the United States that he is an avid baseball fan. Yet even he described how, instead of cowing the populace, these sorts of attacks were reinforcing the position of radical groups like Hezbollah. The use of such technologies was “spurring mass identity politics.... The new combination of Islamist, Arab nationalist and resistance mentality is seen as an antidote to the technology discrepancy.”
Instead of receiving a message that they were overmatched, “it is enhancing the spirit of defiance.” Khouri explained how both the Hezbollah fighters in the field and the broader Lebanese populace saw that “the enemy is using machines to fight from afar. Your defiance in the face of it shows your heroism, your humanity.... Steadfastness is the new watchword. Take the beating and keep fighting back.”
As an Arab moderate, Khouri was not happy about such reactions. But then again, he was also not happy about having spent the last few weeks watching UAVs fly over as his city was bombed. Indeed, he talked about how the unmanned drones somehow made him “even more angry” than the manned F-16s.
Khouri’s explanation of how those on the ground viewed unmanned systems in the Lebanon war was very much like the reactions of the insurgents in Iraq. Rather than creating just fear, fright, and depression, such systems were also unintentionally sending messages of weakness, and even vulnerability. As he concluded, “The average person sees it as just another sign of coldhearted, cruel Israelis and Americans, who are also cowards because they send out machines to fight us, . . . that they don’t want to fight us like real men, but are afraid to fight. So we just have to kill a few of their soldiers to defeat them.”
THE EVIL EMPIRE
When people talk about the psychological war of ideas that takes place in conflict, they are often not talking merely about the effects on the field of battle, but also among the broader populace. Geopolitics is not a popularity contest, but it is dangerous to disregard international public opinion to such a degree as to assist the recruitment and growth of radical, anti-American groups. If you lose your credibility and reputation, you alienate your allies, reinforce your foes, and shoot your own ideas and policies in the foot. General David Petraeus, the commander in Iraq, once described these aspects as 80 percent of the fight.
Unfortunately, by most metrics, the United States is losing this war. In a few short years, America went from being viewed as the beacon on the hill of freedom, Coca-Cola, and blue jeans that won the cold war to the dark home of Abu Ghraib, Gitmo, and orange jumpsuits. Already at the bottom of a deep hole, we can’t afford to dig much deeper.
Hence, former assistant secretary of defense Larry Korb argues, “Unless you are refighting some form of World War II, your warfighting must include some part of trying to sway people. . . . If the U.S. doesn’t handle robotics right, it will undermine [our] moral standing, and the U.S. can’t be a global leader without such standing.” John Pike of GlobalSecurity.org concurs. “This [the robotics revolution] opens up great vistas, some quite pleasant, others quite nightmarish. On the one hand, this could make our flesh-and-blood soldiers so hard to get to that traditional war—a match of relatively evenly matched peers—could become a thing of the past. But this might also rob us of our humanity. We could be the ones that wind up looking like Terminators in the world’s eyes.” Noah Shachtman sums it up with another sci-fi reference. “The optics of the situation could look really freaking bad. It makes us look like the Evil Empire [from Star Wars] and the other guys like the Rebel Alliance, defending themselves versus robot invaders.”
A concern is that the uncanny valley may also have a cultural dimension, widened by a lack of familiarity with technology. When much of the Christian world was burning down libraries during the Dark Ages, the Muslim world was the home and protector of much of modern science and mathematics, flourishing in places like Córdoba and the House of Wisdom in Baghdad. But today, the popular penetration of science in the Muslim world has been stifled by a combination of backward-looking fundamentalists who fear anything new and corrupt regimes that look at science as simply something to buy but not understand. Spending on science and technology in the region is 17 percent of the global average, with the region falling behind not just the West, but also the poorest states in Africa and Asia.
The region’s media doesn’t help much either. Rather than celebrating the only two Muslims to have won a Nobel Prize in the sciences, for example, a show on Al Jazeera in 2006 argued that they should be shunned, since the Nobel Prize “encourages heresy. It encourages attacks against the heritage, and encourages those who scorn their people and their culture.” The show went on to describe the world’s highest scientific honor as part of a conspiracy stemming from “the Elders of Zion.” Given this kind of message, it’s not surprising that the journal Nature lamented that science in the region lacks “a cultural base.”
As a result, differing interpretations of the technology certainly could reinforce an already growing chasm. Retired Pakistani lieutenant general Talat Masood is uniquely qualified to assess both the technology and the gaps in understanding that could play out on the “street” in the Muslim world. Masood, who served in the Pakistani army for thirty-nine years, including as the man in charge of military technologies, characterizes the region’s impression of American strategy and doctrine as that of “distant war.” That is, the United States has a great willingness to use force, but only if it can do it from afar with high technology, limiting as much as possible its human exposure on the ground.
Masood, whose former colleagues trained up the Taliban in the 1990s, described the technology that the U.S. military was using as “amazing,” but also as causing “great anger” in the region. “This type of warfare seldom involves distinct front lines. Fighting has taken place in a confusing mix of friend and adversary, usually directed from afar with occasional failure in communications systems bringing death and destruction to civilians. There is a lack of understanding by the U.S. of the human realities and a marked insensitivity about the casualties of the opponent, and at times even of their own forces. Implications of the RMA are thus broad and profound and a frequent cause of creating a major rift between the U.S. and the Islamic world.”
Similarly, he described how people in the region felt that “distance warfare, due to its relative safety, acts as a ready incentive for the U.S. to use military force in pursuing its foreign policy objectives.” But this comes at a cost, he found. “Overreliance on the military instrument has brought under sharp scrutiny the great values and political principles of the U.S. that many in the Islamic world admired and respected.... The advent of ‘distance warfare’ has profound implications for the battlefield and for America’s global strategy. It is fast transforming the relationship with its allies in the Islamic world. Undoubtedly, the U.S. has been able to militarily overwhelm its adversaries, but in every case, whether it is Afghanistan or Iraq, it has vastly complicated the prerequisite of building the structures of peace.” In short, warned Masood, “The concept of ‘shock and awe’ could drive moderate and uncommitted civilians toward anti-Americanism.”
Other regional observers agreed strongly with this view. As a security expert in Qatar summed up, “How you conduct war is important. It gives you dignity or not.” Their reactions also appeared to confirm a sense that America was coming across as a menace, using its high technology to pick on the little guy. As one Pakistani observer commented on a 2006 Predator strike that just missed al-Qaeda leader Ayman al-Zawahiri, “The mythology surrounding Mr. Zawahiri’s ability to survive all attempts to capture or kill him will dramatically enhance his political power to raise funds, along with his moral suasion to rally dormant cells of al-Qaeda followers around the world.”
Even pop culture in the region echoes the experts. In 2007, for instance, one of the most popular songs in Pakistan, where there are as many as ten Predator strikes a month, was “Chacha Wardi Lahnda Kyo Nahen?” (“Uncle, Lose the Uniform Why Don’t You?”). The song was played at street protests and even became a popular ringtone for cell phones. Its lyrics give a hint at how what Masood described as America’s “distance war” is being portrayed: “America’s heartless terrorism, Killing people like insects, But honor does not fear power.”
Someone who has given serious thought to the unintended psychological consequences of using robots in war is Mubashar Jawed “M.J.” Akbar. Akbar is an Indian Muslim who is the founding editor of the Asian Age, India’s first global news daily. He is the author of eight books, most notably In the Shade of Swords, which came out just before the Iraq war and warned America not to underestimate the brewing anger in the region. Also a columnist read by millions in newspapers across South Asia and the Middle East, Akbar mixes smart analysis with a finger on the pulse of the region.
Akbar expects the future media coverage in his region of robotic systems to be “frightful.” He explains, “It will be like when tanks were first used in World War I. When they were introduced, they were described like a weapon of horror, like a large monster, not a weapon.... It will excite lots of references to movie horrors. It will be seen as evil, by the way.” Indeed, given regional distrust of America, if any mistakes do occur, “then the region will assume you meant for it to happen.”
In talking about the Israeli use of UAVs, Rami Khouri in Lebanon had observed that “the general reaction is of an evil, brutal enemy that will use any means to accomplish its goals. Some might say, ‘It’s too tough. Just give up.’ But for a lot of people it will spark a greater desire to fight back.” M.J., living in South Asia, sees a similar message going out from American use of unmanned systems to the broader Muslim world. “It will be seen as American cowardice. In war terms, if you are not willing to sacrifice blood, you are essentially a coward.” He continues, “These systems will show the pathway to your defeat unintentionally. They create a subtext that shows that you don’t want to die. . . . That all we need to win is to frighten them.”
Clearly, conflicts in places like Iraq and Afghanistan are bringing together combatants with vastly different understandings of war, the role of the warrior, and the meaning of sacrifice. One side looks at war instrumentally, as a means to an end, while the other sees it metaphysically, placing great meaning on the very act of dying for a cause. It is for this reason that completely different interpretations are made of the same act. A person who blows himself up can either be a martyr and shaheed or a murderer and fanatic. There is no in-between.
Unmanned systems take this collision of human psychologies to the next level. They are the ultimate means of avoiding sacrifice. But what seems so logical and reasonable to the side using them may strike other societies as weak and contemptible. Using robots in war can create fear, but also unintentionally reveal it.
It is this link that leads Akbar to warn of another unintended effect. The greater the use of unmanned systems, the more likely they are to motivate terrorist strikes on America’s homeland. “It will be seen as a sign of American unwillingness to face death. Therefore, new ways to hit America will have to be devised.... The rest of the world is learning that the only way to defeat America is to bleed her on both ends. The [American] public responds to casualties and to bleeding of the treasury, so if something goes on long enough they get tired.”
Disturbingly, I heard the same conclusion time and again from other regional experts. Speaking from his experience in Iraq, Nir Rosen expects that the continuing trend will “encourage terrorism,” maybe especially among those not fighting that way now. As he explains, it is important to understand that in places in Iraq, not every fighter is an al-Qaeda terrorist intent on attacking the United States. “The insurgents are defending their area and focusing on troops they see as occupiers. But if they can’t kill soldiers on the battlefield, they will have to do it somewhere else.” He predicts that the more we take American soldiers off the battlefields, the more it will “drive them to hit back home.”
Rami Khouri similarly anticipates that for many in the Middle East, the sense will be, “If they play by these rules, which are completely unequal, then we’ll play by our own rules.... Whether it’s in the U.S., the U.K., or Malaysia.” As the United States uses more unmanned systems, terrorists “will find much more devious ways to cause panic and harm. They’ll say, ‘If they are going to use these machines, we should get some chemicals and use them.’ Put them in air-conditioning ducts in shopping centers or university dorms.... They might go after soft targets, shopping centers, sports stadiums, and so forth.”
The same observers are all realistic, however. They see terrorism occurring regardless of unmanned systems. Moreover, they foresee the same sort of adaptation that the Iraqi insurgents hinted at to me. Despite all the negative coverage such systems might receive in the region’s press and public opinion, they anticipate a quick willingness to acquire and use them as well. As Akbar explains, “When they first come out, the very first reaction from the defense establishment will be, ‘Where can we order these fucking things?’ ”
Akbar (like many of the other regional experts I spoke with) believes that nongovernmental groups like insurgents and terrorists will also be quite willing to use them. They will even have a ready explanation to bolster their own psychological operations. “In fact, they will likely cite a verse in the Koran that you do not start jihad until you have the latest weapons, armor, and steeds.”
The verse reads, “Against them make ready your strength to the utmost of your power, including steeds of war, to strike terror into (the hearts of) the enemies, of Allah and your enemies, and others besides, whom ye may not know, but whom Allah doth know. Whatever ye shall spend in the cause of Allah, shall be repaid unto you, and ye shall not be treated unjustly.” As Akbar explains, “The ‘steeds of war’ is translated today as ‘the best equipment.’ . . . Basically, it tells that you should not go unprepared into war. Valor is good, but not enough.... Even David had a stone.” Or, as Rami Khouri in Beirut puts it, “The response to drones is to get your own drones. They are just tools of war. Every tool generates a counterreaction.”
DO ROBOT SOLDIERS DREAM OF ELECTRIC SHEEP?
As both sides of a conflict begin to use unmanned systems more and more, a larger question of psychology comes to the fore. For all the differences in war through the ages, human psychology has always been at its center. Napoleon said, “In war, moral considerations account for three-quarters, the actual balance of forces only for the other quarter.” What happens when that three-quarters is replaced by something else? What happens when the forces feel no fear, no fright, no anger, no shock, no awe, nor any other element of human psychology, but are guided only by software of 0s and 1s?
The effect on future history will be immense. Imagine how different the world would be today if at the battle of Hastings, the English hadn’t lost heart when their king was killed, or if at Waterloo, Napoleon’s Old Guard hadn’t grown weary of war and had instead fought to the last machine.
The historian John Keegan wrote that “the study of battle is therefore always a study of fear and usually of courage; always of leadership, usually of obedience; always of compulsion, sometimes insubordination; always of anxiety, sometimes of elation or catharsis; always of uncertainty and doubt, misinformation and misapprehension, usually also of faith and sometimes of vision; always of violence, sometimes also of cruelty, self-sacrifice, compassion; above all it is always a study of solidarity and usually of disintegration—for it is towards the disintegration of human groups that battle is directed. It is necessarily a social and psychological study.”
This was the truth of the last five thousand years of war. A human army needed some “vision, a dream, a nightmare, or some mixture of the three if it is to be electrified into headlong advance.” A robot just needs an electric charge.