Susan expected a grilling, particularly after Lawrence’s “robopsychiatrist” comment; but, after only a request for her to remain in the building, Detective Diondre Riviera dismissed her to a waiting area with coffee, tea, and pastries. Too keyed up to eat, Susan sat on a folding chair, hands interlaced in her lap, and waited for the arrival of Dr. Lawrence Robertson.
Alone in the quiet of the small, institutionally two-toned room, Susan found her thoughts straying to Ari Goldman. Though she had joined him on the nanorobot project, she did not know him all that well, at least not personally. He and Cody Peters had worked together for more than twenty years, long enough for rumors to surface about their sexuality. It seemed odd to Susan that society extolled the virtues of friendships and partnerships; however, if one actually lasted, romantic ideas seemed to automatically enter the minds of all who knew them, whether casually or as family and close associates.
Both men were married to women. Goldman’s wife was a teacher at an elementary school and Peters’ was a microbiologist. Both couples reportedly had children, though Susan had never met them. Had the men not worked so well together on research projects that spanned the psychiatric gamut, Susan suspected they would never have become friends. Their personalities clashed: Goldman was gruff and no-nonsense; Peters loquacious and social, often to the point of silliness. He enjoyed seeing how far he could push his partner on a daily basis, it seemed to Susan. She had always enjoyed their oddball interactions; Peters clearly did, and, though he hid it well, she believed Goldman had, too. He pretended to ignore his partner’s antics, but on occasion Susan saw a ghost of a smile slip through his façade. Susan could not help wondering if Peters had heard the news yet; and, if so, how he handled it. They had discovered two new forms of schizophrenia, helped uncover a genetic defect in a common familial type of bipolar illness, and paved the way for what was once a new class of antipsychotics, now in common usage. The few studies of robotechnology in psychiatric medicine had all been undertaken by them, and they had enlisted Nate as an assistant multiple times. Which explains why he was at the murder scene.
Discomfort accompanied the intrusive thought. Susan had attempted to direct her mind solely to Ari Goldman and the sorrow she felt over his premature and horrible death, yet she found herself at least equally concerned about Nate. His emotional state, his disposition, intruded on any thoughts she tried to focus fully on the murdered researcher. What does it say about me as a human being if I’m more concerned with a frightened robot than a slaughtered man?
Sooner than Susan expected, the door opened. Detective Riviera ushered Lawrence Robertson inside. “I apologize for the delay. Would you two mind waiting ten or fifteen minutes while I speak with the officers returning from the scene?”
“No problem,” Lawrence said, with a nod toward Susan. “We want to get to the bottom of this as well.”
The detective continued to hold the door open. “Help yourselves to refreshments.” Without awaiting a reply, he stepped backward, allowing the door to shut itself behind him.
Susan recalled something Detective Jake Carson had said after he had abruptly, and wholly unexpectedly, asked her for a date in the middle of a conversation about her father’s murder: “Nothing said or done in a police station is private anymore.” She felt certain she and Lawrence were being watched, if not through one-way walls, then with a camera, and everything they said to each other would be recorded and considered. Susan knew she ought to rise from her chair and embrace the president of United States Robots, but she made no move to do so. She felt drained and cold.
Lawrence took the bottom thermocup from a stack and filled it from the coffee spout. With his other arm, he dragged a folding chair in front of Susan’s, shook it open, then sat and transferred his drink to both hands. “So, how is Dr. Susan Calvin?”
“Under the circumstances, not great.”
Lawrence nodded. “I share your pain.”
“Most of it.” Susan sighed and pulled her chair around to face Lawrence squarely. “I hope you have room for a full-time robotherapist on your staff, because I just annihilated my residency.”
Lawrence’s lips twitched into a frown. “But you’ve come so far, worked so hard. Why would you do that?”
Susan did not want to discuss the details with a precinct of police officers. “Maybe, deep down, I prefer to be a full-time robotherapist.” She was kidding, but if Lawrence realized it, he did not show it.
Lawrence’s features remained sober, his coffee clutched in his lap. “That would suit me, but I still think you’re being unfair to yourself with that label. You have more than enough credentials for a psychologist, and you are a psychiatrist.”
“But not a robopsychiatrist.” Susan found herself adding without intention, “Not yet.”
Lawrence seized on the last two words. “When, then?”
Susan could no longer pretend it was all a joke. Lawrence obviously wanted her on staff, though whether because he felt responsible for her parents’ deaths or because he really believed USR would benefit from a robopsychologist, she did not know. No such degree existed, of course, but she knew she could not just laugh off an answer. So, she considered the question and answered appropriately. “When I’ve earned my PhD in robotics.”
Leaving his coffee between his knees, Lawrence threw up his hands. “So, maybe a year’s worth of classes and a thesis.”
Susan mulled his words. Her MD counted for more than a master’s, and the two years of residency fully prepared her for psychiatry or psychology. She held a bachelor’s degree from Columbia University, double majoring in math and physics. Additionally, she had taken multiple postgraduate-level classes when the standard ones had proven so easy she had spent more time tutoring than learning. “That’s about right.” She smiled. “I hear Columbia has a decent robotics program.” Her father and Lawrence had met while attending that program, becoming friends and roommates before starting USR.
“The best.” Lawrence smiled and reclaimed his coffee, taking a sip. “And I’m paying.”
“I can’t let you—”
“I’m paying,” Lawrence said with finality. “I created the position, I want you in it, and I’m going to make it happen.”
Susan stopped arguing. Unemployed and out of living relatives, she really could not afford to argue.
“It’s done all the time in business,” Lawrence pointed out. “All you have to do is apply, get accepted, ace all your classes, and write a killer thesis.”
Susan almost managed a smile. “Easy peasy.” Though said facetiously, it was not far from the truth. She had taken top grades in medical school, besting most of the finest minds of her generation. That had earned her a place in the national honor society, AOA, as well as her school’s own.
Lawrence took several more sips of coffee while Susan considered her future. Always before, she had pictured a thriving, hospital-based practice that included face time with patients as well as research and teaching. She saw herself married to a loving, intelligent man with a career of his own. That dream had wholly crumbled, and it would take some time to build something in its place. But when she thought about including USR in the restructuring, it seemed the one and only certainty.
Lawrence broke the silence. “So, as a robotherapist, do you believe it’s possible that Nate—”
“No.”
The swiftness and force of her answer seemed to catch Lawrence off guard. “Don’t you want to think about it a bit?”
“Not necessary,” Susan said, just as swiftly. “I’ve spent more than three hundred nights tossing and turning, thinking about it. It’s like the stories of evil genies and wishes. A man asks for eternal life, and the genie grants the wish. But by the time the man reaches seventy, he has terrible pains in every part of his body. By ninety, he’s wheelchair-bound, and by one hundred and thirty, he’s basically powdered carbon begging for death. Or he asks for wealth beyond his wildest imaginings and winds up crushed to death beneath piles of platinum bricks.” Susan looked up to find Lawrence staring at her.
“I’m missing your point.”
Susan realized she had not yet made the connection between her examples and the Three Laws of Robotics. “My point is that computers are absolutely literal. The simplest programming typo renders them useless, and a tiny mistake can require a specialist to spend hours attempting to find it. Machines can’t intuit or extrapolate. Unlike children and dogs, they can’t do what you mean. They can only do exactly what you tell them to do.”
Lawrence started to contradict. “Except the positronic brain—”
Susan interrupted again. “The positronic brain is an exception in that it allows for the development of contemplation and emotion; it literally learns. It’s plastic, in the developmental sense, not the inanimate sense. More important, it makes judgments based on knowledge and experience. So, for example, it is capable of interpreting what it sees, hears, and learns as well as the commands given to it.”
Lawrence pressed. “I’m aware of that, Susan. I created it.”
“Yes.” Susan did not dispute the claim. She had already marveled over Lawrence’s genius and an invention that few people could have conceived of, let alone accomplished. “So the production of the positronic brain takes a lot of steps. No?”
Lawrence bobbed his head. “Alfred counted them once. It requires more than seventy-five thousand operations to manufacture one positronic brain, each with hundreds of factors. If any of those goes wrong, the brain is ruined and we have to restart.”
Susan had never witnessed the creation of one, but she assumed the most complicated and fascinating invention in the solar system would require a lengthy and difficult process. “And one of the earliest of those factors is the Three Laws of Robotics, correct?”
“Correct,” Lawrence confirmed. “If anyone tried to remove or deactivate them, the entire brain would be destroyed. A positronic brain cannot function without the Three Laws in place.”
Susan continued as if he had not spoken. She had not really needed his affirmation. “So, at the time the Three Laws are inserted, we’re still talking about a computer. A literal interpretation.”
“Your point,” Lawrence coaxed again.
Susan leaned forward, clasping her hands between her knees. “My point is that eventually a robot with a positronic brain learns to reason. In that respect, their thought processes are, essentially, human. Humans murder other humans because they find a way to justify it.” As a psychiatrist, she had to add, “Unless they’re psychopaths. In which case, they don’t need a justification.” She returned to her original point. “Since the positronic brain doesn’t rely on neurotransmitters, and damage to the neural pathways would render them inoperable . . .” Susan paused and rolled an eye to Lawrence.
“Correct,” he encouraged. “And I believe I know where you’re going, too. You’re saying that, since the Three Laws are placed into the positronic brain while it’s still in the literal phase, they can’t be countermanded. Robots can’t develop neuropathic psychiatric problems, such as psychopath . . . ologicalness.”
“Psychopathology.” Susan fixed the word absentmindedly, then spoke her own sudden realization aloud in awed tones. “If we could inject the Three Laws into every developing embryonic human brain, we’d finally have the ‘world peace’ everyone claims should be our highest priority.”
Lawrence stiffened and took another sip of coffee. “Well, I wouldn’t go that far. . . .”
Susan would not, either, but she could see the positive side of it. “I’m not saying we should. I’m just presenting a hypothetical scenario. I’m also not saying the Three Laws of Robotics have no wiggle room. Beautifully crafted, yes. Perfect, no.”
“So, you’re saying . . .”
“Nate could not have killed Ari Goldman. In fact, he could not even have witnessed it happening without intervening, because the First Law doesn’t just state that ‘a robot may not injure a human being.’ It adds, ‘or, through inaction, allow a human being to come to harm.’” Lawrence already knew this; Susan continued for the sake of any law enforcement officers who might be listening, now or in the future. “Obviously, you have to educate and supply parameters along with the Laws. What, exactly, is ‘harm’? Humans can drown, but that doesn’t mean robots should bar anyone from entering a bathtub or a swimming pool. However, if I were about to step into a vat of boiling lava, I certainly hope my robotic companion would stop me.”
Lawrence swallowed his coffee. “He would have to. And, yes, there’s a threshold for potential harm. Any of our robots can instantly calculate the odds in almost any situation to about twenty subdecimal digits.” He considered a moment longer. “However, as we discovered with the Mercury expedition, there are some situations in which a robot might inadvertently disobey the second part of the First Law. Out of ignorance.”
That caught Susan wholly by surprise, though she tried not to show it. “What happened?”
“Suit malfunction.” Lawrence simplified, probably more for listeners than Susan. “Robots don’t need oxygen, and they can work in extremes of temperature, so they didn’t realize the need for assistance. Of course, once they were commanded, the Second Law kicked in. The astronaut was whisked to safety, the robots were better educated, and we had no further difficulties.” He added carefully, “Of that type, at least.”
Susan knew the Mercury mission had been plagued by problems, most of which had nothing to do with the robots. With temperatures ranging from 280 degrees below zero to 800 degrees above, it was hardly the most hospitable planet for human exploration. She took the logical leap. “So, it might just be possible that Nate witnessed the murder if he did not realize the killer’s motive. In other words, he saw no danger in a human carrying a tool; and, by the time it was used to crush Dr. Goldman’s skull, it was too late to intervene.” Susan continued thinking aloud. “And, at that point, the Three Laws would force him to do three things: one, try to prevent the victim from dying; two, protect any other persons in the room; and, three, protect the killer himself from harm. Because the First Law doesn’t discriminate between human beings?” It was as much a question as an observation.
“Of course not,” Lawrence pointed out amicably, “because it’s the job of a robot to remain subservient to all of humanity.” He added, “Although, that’s not to say it wouldn’t rank those three priorities in order based on . . . knowledge and experience. The fact that one person in the room had already shown himself willing and able to kill would become a part of any decision making.”
Susan thought back to when two Cadmium agents had broken into USR, threatening her, Lawrence, Detective Jake Carson, and Kendall Stevens, a fellow psychiatry resident. Both sides had fired many gunshots, placing Nate in multiple, untenable positions. Susan knew Jake and the agents were too well trained for anyone or anything to disarm them. Is it possible Nate could have taken the gun from me at some point but realized, if he intervened, the agent would have lived at the expense of Lawrence and me, at least? Was there a point at which Nate could have thrown himself in front of a bullet but realized that sacrificing himself would have negatively changed the outcome of the battle? Susan could not think of a specific moment when Nate could have acted differently and saved a life without forfeiting more. However, she also knew that at no point did she ever worry about his loyalties. Should I have? It was a question she could spend a long time pondering in the future, but she knew she would not need to. Nate posed no harm to her under any circumstances. She felt more sure of that than she did about any human in the universe.
Lawrence glanced around, as if worried someone might burst in on them in a delicate moment. Susan hoped he realized they were probably being unobtrusively observed. “Susan, do you think it’s possible someone tricked Nate?”
Susan did not understand the question. “Tricked him?” She pondered further. “You mean into committing murder without realizing it?”
Lawrence bobbed his head, but only once, and leaned forward conspiratorially, still clutching his coffee. “For example, what if someone befriended Nate? Every day for months, they played a game where he swung a hammer through the air or at a nail, always safely. Then, a blindfold became part of the game.”
The suggestion was so ludicrous, Susan laughed out loud until Lawrence’s wounded look stopped her. “Lawrence, that’s insane. There’s no way Nate would allow himself to be blindfolded with a heavy implement in his hands, even if only because he might crush the finger of someone holding the nail. He cannot harm a human being. Period.”
“Maybe if someone convinced him he was hammering a nail, then switched the situation somehow . . .” Lawrence was fishing.
Susan gave him that much. Perhaps there was some twisted scenario in which a brilliant, thinking machine could be duped into performing an action without realizing the consequences, but this did not seem like a possible one. “Too complicated and perverse, Lawrence. Anything that required weeks or months of preparation would risk Nate telling one of us about it. I can’t see Dr. Goldman going along with it, especially if it was presented as a prank or a game. Peters, maybe, but not Goldman.”
“Yeah.”
Susan continued. “Far more likely a human being murdered Dr. Goldman and framed Nate. The robot’s constraints and naïveté would make it an easy target.”
“Yeah,” Lawrence repeated. He glanced at his coffee, probably gone cold. “Which brings us to the only logical reality: Nate witnessed the murder but couldn’t stop it.”
Now it was Susan’s turn to say, “Yeah.” She already knew the Frankenstein Complex had forced USR to make its robots wholly subservient, to make the protection of their own existence, their own survival, a distant third to defending and slavishly serving their human masters. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” Then, “a robot must obey orders given it by human beings except where such orders would conflict with the First Law.” Only after those conditions were met could a robot consider the Third Law: “A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.” It might have rendered Nate a sitting duck; yet it made him wholly safe to people, exactly what the human race wanted and insisted upon.
“So why won’t Nate tell the police what he witnessed?”
Susan had to admit she had no idea.
• • •
The single occupant of the six-by-eight-foot cell sat cross-legged on the concrete floor, face buried in his hands. He still wore the casual khakis Susan had noticed in the laboratory, and his blue polo had splatters of blood. She felt certain the police had emptied every pocket. Had she not known his identity, she could never have guessed it. Unlike the brisk and friendly robot who had become her best friend, the figure in the cell appeared dejected and eerily still. Though their footsteps echoed through the hallway, and Detective Riviera guided Susan and Lawrence verbally, Nate did not even raise his head at their approach.
“Nate,” Lawrence said.
There was no response from the cell. Nate could just as well have been a statue.
Lawrence glanced at Susan, who tried, “Nate, it’s me, Susan.”
Still, the figure in the cell did not move.
A thought struck Susan, and she wondered why she had never considered it before. “Does Nate have an Off switch?” She had never seen him in a dormant state, but she did not always find him when she looked for him, either. Lawrence had once mentioned that positronic robots required an energy source in order to maintain their function and memories.
“Not a switch per se.” Lawrence studied the robot, apparently assessing his physical status. “But the battery does need occasional updating.” He stepped closer to the bars and spoke in a commanding tone. “N8-C, look at me!”
In slow increments, the robot’s head lifted until the eyes appeared just above the fingers. They were not the sweet brown orbs with which Susan had become so familiar. Now they looked drawn and haunted.
Lawrence addressed the detective. “You say he hasn’t been answering your questions.”
“That’s right.” The detective kept his attention fixed on Nate. “In fact, that’s the most he’s moved in at least an hour.”
“How have you been asking?”
The detective went on the defensive. “The usual way. We read him his rights first.”
Lawrence shook his head impatiently. “That’s not my concern. I’m not a lawyer, and robots don’t have rights anyway. No matter how human one might appear, it’s still an object, a tool. You no more have to Mirandize him than you do a bridge that collapsed or a piano that fell on someone’s head. He does only what he’s been told or programmed to do.” He returned to his original point. “The Second Law states that a robot must obey orders given it by human beings. I’m just wondering if you simply asked him or actually commanded him.”
Susan considered Lawrence’s words. Her first reaction, dismay, quickly faded as she realized the truth behind his statements. So many videos and shows had contemplated the humanity of cyborgs, their entitlements and privileges, that she could not help thinking of those entitlements as just, as civil rights. Yet a line needed to be drawn before positronic robots became commonplace, to prevent a “conception” debate from dividing and crippling the country in the same way abortion had in the twentieth century.
Putting aside her love for Nate and John Calvin, Susan had to submit to Lawrence’s point that the dividing line was the brain itself. Anything with a human brain was human; anything with an artificial brain, including the wonder that was the positronic brain, was not. To do otherwise might permanently destroy the robotics industry. United States Robots was not God, and what it created was not life, only a facsimile. Recycling a robot must never be considered murder. She could imagine larger hordes of protesters exhorting USR than Manhattan Hasbro. Most would demand its closure. Others would insist that every brain, no matter how damaged, must be salvaged and kept “alive.” The Frankenstein Complex, Susan realized, had an even more evil twin, the Belgar Complex, named for a sympathetic robot in a sappy movie. To define a robot as a legal person with standing in the justice system opened a box worse than anything Pandora could have dreamed about.
Several thoughts behind Susan, Detective Riviera addressed the point on the table. “I asked him questions he chose not to answer. The right to remain silent is well established.”
“For people,” Lawrence pointed out. “Do you ask a pedophile’s computer to please disgorge its child pornography and accept its uncommunicativeness as an answer?” Lawrence did not await an answer to his rhetorical question but turned his attention back to Nate. “N8-C, stand up.”
Nate climbed gingerly to his feet and looked forlornly at his master.
Lawrence continued. “Tell us what happened to Dr. Ari Goldman.”
Nate fairly mumbled, “He was killed.”
“Tell us how, N8-C.”
Nate remained quietly in place for several moments before replying, “It would appear he was bludgeoned with a Stanley 55-099 FatMax Xtreme FuBar Utility Bar.”
That startled Susan, not only for the technical jargon spewing from the mouth of someone she ordinarily considered a friend, but because she had believed the tool on the floor was a hammer or, possibly, a wrench.
Lawrence frowned. “Tell us more details.”
Nate obliged. “It’s a utility bar used for heavy demolition work, a four-in-one tool for prying, splitting, bending, and—”
Lawrence interrupted, “Not about the tool, Nate. Tell us more about the murder.”
Detective Riviera jumped in. “Tell us who killed Dr. Goldman.”
Nate’s gaze dropped to the floor, and he shuffled from foot to foot, looking for all the world like a chastised child. Then, he gave the last answer Susan expected. “It would appear . . . that I did.”
It was not exactly a confession, but was apparently near enough one for the detective. He turned to face Lawrence. “Dr. Lawrence Robertson, who designed this robot?”
“I did,” Lawrence said.
“And who programmed him?”
“Myself and my team,” Lawrence said, then added unnecessarily, “I have promised to take full responsibility for anything said or done by N8-C.”
Susan did not like where this was going, but she felt as helpless to stop it as if it were a speeding locomotive. She closed her eyes, though it seemed absurd and unnecessary. What she really wanted to block out were Detective Riviera’s next words.
“Dr. Lawrence Robertson, you’re under arrest for the murder of Ari Goldman. You have the right to remain silent. . . .”