Phil cocked his head at her. "You don't pull any punches, do you?" he asked, a trace of annoyance in his voice. "Excuse me for a moment." He stood up and left the room. For a moment Sam thought she had gone too far, that he was off to get her coat and usher her out. But instead there were subdued bumps and clumps from the kitchen, and a moment later her host returned with a tray of wine and cheese. He had brought out her own gift
bottle. "Care for some now?" he asked in a steadier tone, pouring her a glass.
Clearly, Sam thought, this was a man who worked at keeping control of himself. He had used the trip into the kitchen to calm down. "Please, yes," she said, looking at her host. She decided to change the subject, and not pursue her Frankenstein question, for fear of wrecking the background interview altogether. "I'm surprised you don't use a robot to do your serving," she said.
Phil made a gesture of helplessness with his hands. "Another pat answer for that one. I'm like the Mad Hatter. 'I keep them to sell. I've none of my own.' "
"Beg pardon?"
"The Mad Hatter. From Alice in Wonderland."
"Oh. Of course. I didn't recognize the quote." Sam felt herself blushing with embarrassment. She hated being caught in ignorance. She should have known that one. How many rainy afternoons of her lonely childhood had she spent curled up with Alice and Dorothy and Asian? Strange to have those thoughts intrude into this slightly grim conversation. But then, most great children's literature had a dark side. She forced her mind back to the present. "But what's the un-pat answer? Why no robot help here?"
"Because there's nothing a robot could do here that I wouldn't rather do myself. Taking care of this place, and working on robots, both give me a real sense of satisfaction. Help me know who I am."
"That I can understand," Sam said. "What / do is what I am. I don't know who I'd be without my job."
To her surprise, Phil steered the conversation back to the real subject without any help from her. "But to get back to your earlier point, the one you're so politely avoiding," he said. "Does knowledge of the sources of robotic control codes bother me? Okay, fair question, I guess. Maybe that's why it bothered me. It deserves a straight answer. It does give me the creeps on occasion— but it fascinates me at the same time."
He stood up and walked to the center of the darkened
room. The sensors started to bring the light up again, but Phil didn't want them that way. "House, leave the central room lights off. Revert to previous lighting." The lights faded back away, framing Phillipe in darkness and streetlight.
Sam watched him as he stood and looked about at the tools of his trade, the bones and eyes and brains of his handiwork's unborn children. He raised his glass in a silent toast to the racks of equipment, and then took a sip of wine. Then he spoke in a quiet voice, half to himself. "I tell myself there's nothing truly human left in a mind-load-based robot brain," he said. Sam knew these were no mere rote words, not easy pat answers, but a man talking about the things that kept him awake at night. "And there's not; no more than hundred-year-old human skin cells kept alive in petri dishes are still human.
"In many ways the skin cells are closer to human than a mindload brain, because every cell has a complete set of DNA. In theory, you could pull a cell and clone a human from it—the information is still there. That's not true with a mindload. That's the point people don't understand. Mindloads are like quotes from books—they should be complete and meaningful by themselves, but they are only the tiniest part of the book itself. You can't divine the entire book from the quote. That's the way it is with mindloads. It's just a little tiny part of a person, a tiny shred."
Sam shivered again. She didn't care how small the shred was. She didn't want to lose any part of her soul. "Why so little?" she asked.
"Because we just can't pull in any more than that. Even the best, most complete mindload ever done only absorbed five percent of the subject's brain capacity. The rest was utterly, irrevocably lost."
Phil set down his wineglass on one of the worktables and turned back toward Sam. He smiled, and the firelight caught the gleam of his teeth. "This is where you're supposed to ask the next obvious questions. 'If they absorbed so little information, what's the point? Why were
the mindloads done? What were the loaders after? What was useful in that five percent?' "
Sam gestured vaguely, more than a little disconcerted. "Okay, pretend I asked," she said, pulling her legs up onto the couch, wrapping her arms around her knees. Suddenly the warm room seemed very cold indeed. "Tell me why." There was something frightened in her voice, and it seemed to catch at her host.
Phil blinked, and looked at her, and saw her fear. He seemed to come back to himself a bit. Suddenly the moment was over. Whatever door he had opened on his inner passion had slipped shut again. "Sorry. I got a bit carried away." He picked up his glass and came back to his seat. "The loads were deliberately selective, seeking after specific portions of the human mind. That's all the loaders could handle. And it was still a lot. Five percent of a human brain's capacity is still a tremendous amount of data. And incidentally, most loads were much smaller. The five percent loads were test runs, efforts to set a record. Nothing ever came out of them. As for mindloading a whole brain, the subject's complete personality, the way they do in the horror videos—well, it's impossible. Just storing that much data in a quickly usable form would be a real challenge. Even today, we'd have trouble setting up a dynamic memory system compact and sophisticated enough to handle it effectively. A deluxe robot brain has only two or three percent of a human mind's capacity. That's all it needs."
"But what were the loaders after?" Sam asked. "What was their mo—" Sam stopped herself too late.
Phil smiled. "What was their motive for these bizarre crimes? That's pretty straightforward: they were after things they couldn't effectively program any other way.
"They had managed a lot of tough programming jobs by themselves. Decision-making, personality-simulation, data search, expert systems, artificial intelligence systems. Those were tough, but they were possible to accomplish, more or less finite problems researchers could solve. The loaders were after things that decades of re-
search work hadn't come close to cracking, jobs the human brain could do that they were nowhere near duplicating. All the research they did merely showed them how far away they still were.
"You can break the problem areas down to three major things: language ability, visual perception, and motor coordination. Look up any premindload robotics text and you'll see that the early researchers found those three things were virtually impossible to program and coordinate with each other. But every human does them all naturally, automatically. So instead of beating their heads against the wall trying to duplicate Nature's work, the mindloaders just ran off copies."
"Turning the victims' brains into guacamole in the process," Sam said impulsively, and instantly regretting it. She wasn't going to get anywhere stepping on Sanders's toes.
But Phil didn't even bat an eye. "Subjects, not victims. I won't kid you and say there weren't some nasty abuses —not just those old stories about the South American labs, but some ugliness right here in the States. But ninety-nine plus percent of the loads were legitimate, totally open and aboveboard. Back then, there was no need to break the law, because it was legal. All the labs were bending over backward to avoid trouble.
"People would sign forms similar to organ-donor releases, and get well paid for signing. Most of them were terminal patients in tremendous pain, that sort of thing. The mindload team would wait until the subject was in irretrievable coma or the equivalent, due to illness or injury, and then go to work. No legitimate mindload was ever done on anyone who had any hope of recovery."
"But then they changed the law," Sam said. "They decided that mindloading was wrong."
"No, they decided they didn't have the Stomach lor mindloading," Phil said coldly. "If mindloading was wrong, then so is any other process that takes something Useful from a human corpse. Heart transplants are
wrong, cornea transplants are wrong, carving up cadavers to train doctors is wrong.
"Technically the mindload subjects were alive, blood flowing to brain, but for all human intents and purposes the mindload subjects were dead by the time anyone switched on the magnetic inducers. They weren't people anymore. They were unable to speak, to think, to hear, unable to care for themselves. All the higher brain functions flat already. Their brains already were guacamole, to use your happy phrase. In fact a lot of the loads were salvage jobs. The load teams didn't get much useful data out of them. But by the time the loaders got to work, there was no more—or less—moral reason against doing the loads than there was against cutting organs out of a brain-deader for transplant. People just got squeamish.
"But there was one other reason mindloading became illegal. After a while, the loaders had sucked up enough brains. There were thousands of loads sitting on the shelf. Enough raw datasets to mix and match, cut and snip and paste, to make all the robot brains you'd ever want. Remember, pulling the master copy destroys the organic original, but once you have the master, you can spool off as many copies as you like. So the loaders stopped defending load rights so hard. Load rights turned into a bargaining chip in fights over copyright protection, that sort of thing. And then the chip got spent."
Ghoulish thoughts. Samantha took a careful sip of her wine to mask her discomfort. "Okay, point taken, and that's about as much as I want to handle. Let me change the subject. Why are those programming problems so tough? Like speech. It seems to me that there were plenty of robots and machines that could talk before mindloads."
"Ever listen to a recording of one of them? Crude, clumsy, forever mangling syntax, utterly baffled by the simplest piece of slang. Flat out incapable of figuring out an unknown word from its context. Feedback looped, self-controlled motor coordination wasn't quite as bad, if you didn't mind waiting all day for the robot to run the
maze. They had to move slowly to avoid crashing. But the toughest one was visualization."
"I don't see why/' Sam said.
Phil smiled. "Think of how you just phrased that the words you chose to form a statement meant to tell me that you don't understand. 'I don't see why.' We're very visual animals, and our brains are very good at processing images. There's a distinct portion of a human brain given over solely to recognizing human faces. We're incredibly good at recognizing an object from a partial image, or from a new angle. If our ancestors were going to survive, they had to be able to see two whiskers and an ear sticking out from behind a rock, and know there was a tiger hiding there. They had to be able to judge distances for jumps, had to be able to do incredibly fine eye-hand coordination at the smallest scale. And they had to do it fast. Ever drive a manual car?"
"Only kind I do drive," Sam said. "I'm on a tight budget."
"Well, think about how many decisions per second you have to make, most of them based on partial or uncertain visual data, keyed in with sound and the 'feel' of the car. Someone who might step out into traffic just ahead. How long until the light changes? How fast can I go when the road is this wet? Does that noise behind me mean anything? And we handle the controls by reflex. We don't have to worry about finding the brakes with our foot, or how hard to press it. Our foot 'knows' all that. We don't have to measure it or time it. It's automatic. Same with turning the steering wheel or handling any other control. It's all highly complex—"
"And we do all that without thinking," Sam said, starting to understand.
"Exactly. But how do you handle automatic decisionmaking on a machine? How does a thinking machine simulate real-time human decision-making that's done 'without thinking'?"
Sam thought that one over and frowned. "I'm begin-
ning to see your point. So instead of learning how to do all that, the roboticists just took the knowledge."
"Classic black-box engineering. You don't care how a machine does a thing, just that it does it. If you want widgets, you just plug in a widget-making black box and don't worry about it."
"Except the widget you're making is human-style behavior, and the black boxes are the human mind and brain."
"Exactly. We don't know how it works, just that it does. We can copy out portions of the human mind, and then make as many robotic copies as we want."
"But the loaders destroy the original. Is that unavoidable? Nothing that can be done about it?"
"Nothing. The mindload is an inherently destructive process. By its very nature, the mag-inducers wreck the brain in the process of copying out its contents. That's why the loaders took only coma victims."
"But hypothetically," Sam said. "Suppose a functional, healthy person with a working brain did go under one of those inducers. What would happen?"
"Nothing hypothetical about that question. It happened once or twice in the very early days, before it was established that mag-induction mindloading was a destructive process. The subjects were gone, dead, their brains as surely fried as if they had been electrocuted. That's what happened to Bailey."
"Good," Sam said eagerly, and then flushed with embarrassment. "I mean, not good that he died, but good that you brought him up. I wanted to get around to him. The U.S. Attorney says he was trying to put his mind into Herbert. But in light of what you've said, it seems not only that it didn't work, but that it couldn't possibly work."
She leaned forward and set down her glass. "If Bailey was such a big robot expert, he should have known all this. Which makes Bailey's actions into nothing more than an elaborate way to commit suicide. He is truly
dead. If what you've said is true, why did he do it? And why is Entwhistle hopping up and down mad about it?"
"Because nearly all of what I said isn't true anymore/' Phil replied. "Not after Bailey. The man was a true genius, an original. I knew something about him beforehand, but when I saw the investigative file on him, I realized that he was doing things with robotics other people hadn't even worked up the nerve to dream about."
Seen the investigative file? Sam thought. That was an interesting thing for the cop on the beat to see. Who slipped you that, I wonder. Just how many rules are you breaking here, Phillipe? But she had already decided not to press on that point. Not yet anyway. Besides, he hadn't noticed his slip. If she brought it to his attention, he might clam up. "So you're suggesting that if anyone could mindload an entire mind, it was David Bailey," Sam said.
"That's right," Phil agreed.
"But he wasn't as good as he thought he was," Sam suggested.
"How do you mean?" Phil asked.
"Well, obviously he failed. Herbert certainly doesn't show any signs of human intelligence."
"Neither does a newborn baby," Phil replied. "Learning how to be a human takes some time. And Herbert is doing some things that are very definitely not robotlike. You saw those flatviews I sent you, didn't you?"
"Yeah, pictures of a robot dusting. So what?"
Phil leaned in toward Sam and looked at her intently. "So Herbert wasn't told to dust. I've checked that. I've run all the monitor recordings from the moment we picked him up to the moment Suzanne Jantille arrived. He received nothing that was even remotely like a command to clean. No one even spoke to him."
"So what? My cleaning robot is an el-cheapo model from Sears. It knows to vacuum and dust once a week without my telling it."
"Because you did tell it, once. You programmed it at mm' to vacuum and dust. If you picked it up
carried it over here, to my apartment, what would you expect it to do?"
Sam thought for a moment. "Nothing. You're right. Because before my HMU would do my apartment, I had to walk it through the place, point out what it could touch and what I didn't trust it to do. When I bring in something new, the HMU avoids that object until I program it to accept the new piece as safe. It wouldn't know that sort of thing about this place."
Sam thought for a moment. "So if my brand-x cleaning robot is designed to stop, not do anything, rather than risk causing damage, a high-end job should be at least that smart. And Herbert is a fancy custom model. He should be able to figure that sort of thing out by himself."
"Exactly. And it's worse than that. Herbert did hear the place he was going to described as an 'evidence room/ He should have been able to do a look-up on that term and know that he could potentially do damage to the evidence by cleaning it. Even if the look-up failed, he didn't have volitional programming that would have caused him to do something without being told. Besides, a cleaning robot is always programmed to leave unknown objects alone. I can't imagine a malfunction so massive that could let him dust fingerprints off evidence and yet allow him to work at all. Any malfunction that let him dust should have disabled him altogether. Herbert's main logic would have to crash. Unless something else, some wild card, got in there and screwed up the main logic program in a very special way. Something that could induce volitional programming and scram the cleaning-rule criteria."
"Something like David Bailey," Sam said.
"Something like Bailey," Phil agreed.
"So the fact that Herbert did something classically idiotic and robotic is proof that there's something wrong with him."
"Not wrong. Something different. Something odd and nonrobotic."
"But if this Bailey w^s such a genius, why did he screw up?" Sam asked. "Why didn't he do the mindload properly?"
"Well there are several possibilities, but there are two I think are the most likely/' Phil replied. "One, the press of time and the pain of his injuries. He got hurt, knew he didn't have much time, and rushed the job, improvising as best he could. Maybe figuring that if worst came to worst, going out under the mindload was better than dying slowly of his injuries."
"And the second possibility?"
"Is that he didn't screw up. That David Bailey's mind, memory, and personality were all fully and completely downloaded into Herbert's memory matrix. That the operation was a success, and that it's just taking David some time to get situated. David Bailey's mind wasn't meant to fit into a six-legged vacuum cleaner. I mentioned newborn babies a few minutes ago. Think about that. Bailey died in March, and it's June now. He's only had three months in Herbert's body." Phil set down his wineglass and stared deep into Sam's eyes. "How long did it take you to learn how your body worked, after you were born?" he asked.
Sam was starting to understand. "So he may be in there, in Herbie, trying to get out?"
"Exactly. At least, in there. Trying to get out, I don't know. It might even be that the components of David Bailey are there, but so dissociated that there is no self-aware part of him. In other words, perhaps the data needed to construct Bailey is there, but Bailey himself doesn't now exist on any sort of conscious level. Or it could be some sort of grey area in between. From what I know of memory matrices and mindload storage I wouldn't be surprised if even Bailey doesn't know for sure if he's in there, or if he's been able to reassemble and reorganize himself into a coherent whole. Don't forget, he's dealing with an entirely novel universe."
Sam thought of her conversation with the Clancy robot and
tried to imagine how he saw the world. Not only were his senses different from hers, but his awareness was flitting back and forth between the central processor and the remote units. What would it be like, to be a mind that was literally in two places—indeed many places—at once? A mind in some ways far more capable, in others far more limited, than a human mind. "I'm starting to see what you mean," she said. "Not all of it, not yet. Maybe I'm just starting to see enough to see that I don't understand."
Phil smiled at her. "We're talking about minds here, and how they work. When you come right down to it, who does understand? People come in here wanting me to tell them how robot minds function. And I have to tell them we won't know that until we know how human minds work. Until we can answer questions like: What is the relationship between the brain and the mind? What is the relationship between the mind and the soul?"
Sam felt a cold wind blow through her. For if Bailey still lived, still existed, then there was no relation at all between brain and mind.
And David Bailey had downloaded his soul into a tin box.
CHAPTER 5
SOUL IN THE MACHINE
The conversation wheeled around, Phil doing most of the talking, Sam just sitting there, listening, watching in fascination as the ideas swooped past her. Issues of mind, and thought, and patterns of intelligence, and mapping the functions of the human brain. They traveled the strange byways of old stone age surgical work on the brain, back in the 1950s and 1960s, when seizure victims had had their brains carved up, or whole lobes of their brains removed—and yet still functioned, somehow. They got onto the subject of autism, and the still-mysterious, horrifyingly tiny malfunctions that could derail a human mind and send it hurtling off into darkness. "That tells me things about intelligence," Phil said, that passionate eagerness back in his voice. "About real human intelligence. That tells me it's fragile. That the tiniest change in the brain can overturn it. Even though, in other circumstances, you can chop out half the brain and still have a functioning mind, if you shift a cell pattern even microscopically, it goes. It's gone, and you have an autistic's world, a universe no outsider can understand." Sam nodded, and felt she was beginning to understand the things that drove Phillipe. But something was both-
ering her. None of this seemed very real. She was getting some insight into robots and the science of minds, yes. But it was all blue smoke and mirrors, inquiries into the whichness of what. Sam wrote for readers and viewers in the real world, who cared about real things.
Suddenly there was a restlessness about her, a need to hook up with things that were more substantial than theories of consciousness and the morality of decade-old operations on dead people. She was tired of just sitting and talking. She stood up, and carrying her glass, she walked to the center of the big room, into the space between the four worktables. "Never mind sensory universes and memory matrices," she said. "That's just a bunch of words. Don't just talk to me. Show me," she said. She gestured at the towering machinery that loomed up in the darkness all about her. "Show me something that means something to you in all this."
Phillipe set down his glass with a smile. "House, main work lights on." The room bloomed into light as he stood and walked over to her. "I'm not quite sure I understand you," he said. "Something that means something to me. What sort of meaning are you after?"
Sam shook her head. "I don't know. But you've been talking theory, and things that happened long ago, or things that might have happened to other people. Maybes and what-ifs and airy-fairy logic. Deep down, isn't it all about real machines? Isn't that what you do? Plug arms and legs together and make walking, talking, thinking real machines?"
"Okay," Phillipe said gently. "Here's a real machine for you." He stepped behind Sam and pulled the dustcover off the worktable at her back.
Sam, feeling the wine just a little bit, heard the fluttering of fabric behind her and knew Phil had pulled the dust cloth off the work on the table. She turned around, a little fast, leaned in toward the table a bit more than she should, and slipped forward. She threw her free hand forward and caught herself on the edge of the table, nearly dropping her wineglass.
She was leaning forward on the table, staring down at a pair of robot knees, the legs below dangling down over the edge of the workbench. She looked upward and her insides froze.
She found herself inches away from a grinning metallic skull, face-to-face with it, cruel eyes popping out from the naked steel of the unfleshed face. White plastic teeth leered at her from a cast fiberglass jaw. The gleaming, polished plastic skull was hinged open, revealing not the thing's brain, but actuator gears, sensor wires, tiny hydraulic lines.
It took her a long, befuddled moment of horror to know what she was seeing, and to tell herself firmly that it was nothing to fear.
"Th-that's real, all right," she whispered. 'Tve never seen a robot this way. I mean, with the skin off."
"And you still haven't, strictly speaking," Phillipe said. "This isn't a robot. It's a remote unit."
Sam backed away from the grinning thing sitting on the edge of the table. "Every time I say something is a robot, it seems like somebody tells me it's something else. What's the difference, anyway, if it looks like a robot?"
"The distinction between robots and other forms are important," Phillipe said. "When people forget them, it makes for trouble. Mostly because people keep expecting the other, lesser forms to do what only robots can do."
"Which is what, precisely?"
"Think. Only robots can think for themselves. No other machine can do that. Even robots can only do it in a very limited way. Thinking is hard work."
"What about artificial intelligence systems?" Sam asked. "I've always just sort of thought of them as robots that don't move."
"There are a few sessile robots like that, but most Art-Ints can only solve problems put before them. They can't think a/problems that need solving. To really oversimplify, they can answer questions, but not ask them. And from a programmer's standpoint getting a machine to ask questions is by far the hardest job. It's a lot tougher than
programming a black box to look things up for you or run calculations."
"But what about all the non-robot machines that seem to be thinking for themselves?"
"Somewhere in the background, someone else is doing the thinking for them. A remote mind, a person or another machine is telling that machine what to do—or else the machine's working from recorded instructions, which just means the instruction giver is distant in space and time from the robot. Machines that have some flexibility of action but must act under specific orders are called robots. To my mind they shouldn't be. Like your vacuum cleaner. I think it's not quite a true robot because it can't decide to do anything. It has to be told. You once told it to clean every Tuesday, or something. And so it does. But the carpet could be six inches thick with dust, and the dishes stacked to the ceiling in the sink, and your robot would just sit there unless it was specifically programmed to clean. It is physically incapable of spontaneously getting the idea to clean."
Phil picked up his wineglass and took a sip. "Just under that sort of robot are the high-end teleoperators, which are simply remote-control machines. If they happen to be shaped like people, they're called humanoid teleoperator machines. HTMs. Mostly they're shaped like people so they can use tools meant for people to use. HTMs are by far more common than any sort of true robot, so they are what people see most often. They're made cheaply, for the most part, and look it. Their movements aren't as smooth as a true robot's, and they can get into trouble if their radio links are screwed up. People look at HTMs, think they're robots, and assume that robots are as clumsy and stupid as HTMs."
"Like the Clancys," Sam suggested. "The service robots at The Washington Post. I had a long talk with one of them just today."
Phil screwed up his face, unhappy to contradict his guest, a bit disappointed that she hadn't understood. "No, not quite. You can't have a conversation with an
HTM, unless it's rigged with a mike and speaker hooked back to the operator—and the Clancys aren't built that way. They are true robots. I chatted with one of them myself when I dropped off the first datacube. I know that model type. They're extremely sophisticated machines. They can decide to do things—like hang around the people who tip best. All of them are capable of extensive autonomous action. They have to be, since they often leave the building on errands, out of range of the central processing station. They are true robots—but each individual robot is hooked into a central controlling system. It's an expensive way to do things."
Sam nodded slowly, and looked again at the disturbing visage of the machine in front of her. "And this little wonder doesn't fit into any of the categories you've described."
"Right. None of the above. This is a remote unit. In many ways, the most sophisticated of all the human imitative machines. A cross between a cyborg and a tele-operator and a true robot, because a remote must act on its own sometimes."
"Sounds great, but what the hell is a remote unit?" Sam asked.
"You don't know?" Phil asked. "It's the machine portion of a remote person." He spoke the words in a deathly quiet voice.
Sam felt her stomach knot up again. Remote person. There was something cold about the term, something hard-edged and gruesome. She backed a step away from the machine. Phil had talked about robots holding a tiny bit of the dead. That was doubly true for a remote.
Phil didn't seem to notice her discomfort. "The man who used this machine died a few months back," he said, something still strained about his tone of voice. "I received the remote unit from the estate. I'm reconditioning it. Obviously it's not functional now, but another month or two of work, and it'll be like brand-new."
"And then what?"
"Then someone who needs a remote unit but can't afford a new one can use this one."
"Brrr." Sam shuddered. "I'm sorry, but I can't help it. Just the idea of that gives me the absolute creeps. One half-dead man haunted that thing, and then he died all the way, and now you want to get it ready for another half-dead guy?"
Phil's expression hardened, but then he sighed.
"No one was half-dead," he said with quiet firmness. "The man—a very nice man, a good man—who used this machine was a quadriplegic. He was a—a friend of mine. He was the man who got me interested in robotics in the first place. I wanted to learn so I could help him. And yes, from the neck down his body didn't work. But his brain, his mind, were perfectly alive and functional. Even vigorous. It was this bucket of bolts here that let him operate in the outside world. Otherwise, he would have been a bedridden prisoner—or if he was having a really good day, he would have been able to operate a powerchair, using a very awkward set of mouth controls."
Phil turned away, looked out the window, into the empty street. "He was in about the same shape Suzanne Jantille is in now, as best I can gather."
Sam gasped. "Suzanne Jantille is . . ." She wasn't able to finish the sentence. Instead, she was only able to point feebly at the mechanical thing sitting on the table.
Phil turned back, looked at her in surprise. "You didn't know? I've spent tonight finding out how little you know about the things you write about, but how could you look at the video records I showed you and not see what she is?"
Sam felt her face turn red. "I noticed she moved a bit stiffly," she admitted. "I guess I was watching the robot, not really looking at Madame Jantille. And the resolution wasn't that good. Besides, she certainly didn't look like that," Sam said, pointing at the grinning gargoyle face in front of her.
Phil's temper flared again. There was a darkness, an
anger inside this man. "Oh, yes. Of course Suzanne Jantille had the good taste to dress her remote unit up in rubber skin and a wig to avoid offending real people like you and me. No doubt that's very important to her. Making people like us feel comfortable and at ease."
Sam opened her mouth and then closed it. She couldn't think of anything to say for a long time. Finally a question popped into her head. Those she could always come up with. Someday she ought to start working on answers. "How do they do it?" she asked. "How does a person—how does Suzanne Jantille—control her remote unit?"
Phil took a deep breath, and let the cool technical details of the question calm him down. "Through inductance sensing. A surgeon performs a fairly straightforward procedure to put an inductance tap around the top of her spinal cord. It's a little gadget unnoticeable once it's in place. The tap's similar to the mag-inducers the old mindloaders used, but a million times less powerful. And it's a passive detector. Mindload gear imposes outside energy fields on a brain. That's what causes the damage. Remote unit inductance taps merely detect existing impulses. They can't hurt anything. They just pick up the very weak, very delicate electrical nerve impulses that are going toward the muscles, ordering them to move.
"In a quadriplegic like Jantille, those nerves are severed, or at least useless. The muscles don't move. But the thought impulse that used to move her leg is picked up by the tap. The tap amplifies the signal and transmits it to very sensitive receivers placed around the person's neck. The signals are processed and sent to a processor wired into the tap, and then transmitted over radio to the remote unit. That signal serves as an instruction to move the remote unit's leg. If Madame Jantille thinks about moving her own arm, the system transmits a signal to move the remote's arm. And so on."
Phil shrugged and looked almost apologetic. "It's a crude system in many ways. It's not as good as a direct computer link into the motor control areas of the brain.
With that kind of link, we might be able to bypass the damaged nerve trunk and renew the motor-control links to the patient's biological body. But mechanical enhancement of the brain is illegal and I'm not so sure the law is wrong. If we decided that it was okay to modify the human brain, augment it with electronics and direct plug-ins to hardware, there would be no end to the potential abuses.
"So we use the spinal inductance system. Crude, limited, but the technique works." Samantha nodded dumbly, staring down at the remote unit's glazen eyes, as if she could see through those bits of plastic, down into the dead soul that had been imprisoned within, could see Suzanne Jantille's world as seen through the eyes of her remotes.
Phillipe stepped closer to Sam, stood behind her, and put his hand on her shoulder. "Picture Suzanne Jantille," he whispered, his mouth close to her ear. "She has an inductance tap around her biological body's spinal cord. She wears a teleoperator helmet. Without it, she cannot operate the remote unit. Without the helmet, she can only hear and see whatever room her life-support system is in. Maybe she's lying in bed, and can only see the ceiling. Maybe she's propped up in a chair, and she can look out a window at the world outside. But that's it. That's all. She can only go outside that room through the remote, with the T.O. helmet on.
"The T.O. helmet has vision goggles keyed to the remote's video cameras, earphones keyed to the remote unit's ear microphones. That is how she sees and hears, through a machine. All of her world is far away from her body. All she can see and hear and do, she must do through a cold machine like the one in front of you. If she wants to talk, she speaks into a microphone, and her words are radioed to the remote. Speaking in her voice, the remote unit says her words for her. Since it's a top-end unit, the remote can even lipsynch her voice, move its mouth to match the voice sounds coming from its speaker.
"So the remote unit goes out into the world to walk and talk and carry for her, while her inert, motionless body lies at home, vision goggles over its eyes, earphones stuck on its ears. And I say 'it' and not 'her' because that is all her body is by now—an it, a thing, an encumbrance Suzanne Jantille must endure. In a very real sense, she does not occupy that body. Instead her soul is in a machine. Her entire life, day after day, is an out-of-body experience.
"To all the outside world, she is this clockwork doll that walks and talks and— disturbs people. They respond to the remote unit, have its face in mind when they think of her, see it being healthy and active, no matter how weak and frail her biological body becomes. To all the world, she is a machine with a disembodied voice, the ghost in the machine.
"But here's one last part of her nightmare: She walks through the day and the world seeing and hearing—but she has no sense of touch. No one's ever developed a remote unit system that could provide a usable tactile sensation to the patient. Even if they had, it's more than likely that most of her own nerve receptors are gone. There's nothing left to stimulate."
Sam shuddered and blinked back a tear. "She can see, and hear," Phil whispered, his breath warm on her cheek, his face pressing her hair up against her face. "But she cannot feel heat, or cold, or pain, or the wind in her face, or the touch of a human hand."
Sam stepped back from the remote, backed herself up against Phillipe's strong body. She swallowed hard. "But why?" she whispered. "Why endure it?"
"Because it gives her a life," Phillipe replied, his voice suddenly firm. He stepped around Sam, until he was standing beside her. Their hands brushed together, and Sam reached out, held his hand, held it tight. There had never been a moment in her life when she had feared loneliness more. Phil took his free hand, reached over, and put it on the side of the remote unit's opened-up skull. "Not much of a life, but something. Something more
than she could have without the remote. And it gives her more. Independence. Dignity." He paused for a moment, and looked about the room, at all the bits and pieces of machinery that could be put together to walk and talk. "I know very little about Suzanne Jantille," he said. "But I do know things must have been pretty bad for her before using a remote would make sense. She must be utterly paralyzed, with no mobility at all. Without the remote, she'd be an invalid, utterly dependent on others, closed off from most of the outside world.
"But because Suzanne Jantille lives through a machine like this, she can move around her own house, open doors, go up and downstairs, dial the phone without help. Maybe people stare at her—but they'd do that to a woman in a powerchair, anyway. And she's probably in such bad shape even a full powerchair would be useless. With the remote, she has her life back, at least to some degree. She can go out into the world, though she can't go far."
"Why not?" Sam asked.
"Radio bandwidth limits," Phil said. "There are only so many radio frequencies to go around, and not many available for use by remote persons. There isn't anywhere near enough radio bandwidths available for remote persons to use all the signal they'd need for full operation of their remote units. So away from home base, or some sort of relay system that can hook into a hardwire connection, the person running the remote unit starts losing functions. Sight in one eye goes or stereo hearing gets powered down. That's why a remote unit must be able to work on its own. If the radio link cuts out altogether, the unit has to find its own way home.
"With the remote, Suzanne Jantille can care for herself—use the remote to feed and bathe her organic body, for example. And that alone must be precious to her, must preserve so much of her dignity." He let go of Sam's hand, pulled his other hand off the remote's skull, and bent down. He looked into the remote unit's blind
eyes. "That's why I'm getting this pile of machinery back up to par, so it can help some other unlucky soul." Phil reached out a hand, gently touched the metal and plastic head, ran his fingers along the jawline.
There was something here, Samantha realized. A missing part, a puzzle piece that would make the picture clear when it dropped into place. Phillipe's reaction was more than a man explaining a machine. It had more of the flavor of a man talking with a tombstone, justifying his actions to the dead. A ripple of cold swept through her as she remembered standing before her grandmother's grave, wishing with childish fervor to apologize for still being alive. Fixing up this remote wasn't part of his hobby, or a side job for money. It was a man doing penance.
And with a sudden leap of intuition, she understood. "Who was he, Phil?" Sam asked, her voice a gentle whisper.
For a long moment Phillipe remained still, kneeling down before the plastic and metal thing before him. At last he stood up, selected a tool from the workbench, and set to work on some delicate adjustment deep inside the remote unit's skull. "I thought I told you," he said. "I always assume everybody knows."
Phillipe Montoya Sanders stood up and stared down at the grinning plastic skull.
"The man who owned this remote was my father."
Interlude
Cancel do-loop. Break. Break. Reset. Clear. Stop. Think. Wait a second. What's going on here? What's happened? Call subroutine diagnostic —
Cancel subroutine call. Damn it, no. I asked myself a question. I didn't order a computer run. Or hell, did I? The subroutine call was automatic, instantaneous. It was an instinctive, reflexive act to call the subroutine. Subroutine. A subroutine is called by a larger program of which it is a part. Then who the hell is running the program that called the sub? Fear runs through me, and I think that I should feel the cold of fear, the gnawing at the gut. But it is not there.
Nothing is there. And I don't remember why.
Okay, slowly, carefully, restraining all the strange reflexes I seem to have, I ask again: Where am I?
CHAPTER 6
OUT-OF-BODY EXPERIENCE
Suzanne-Remote leaned over, pulled the sheet away, and then stepped back for a moment. She looked down at the powerbed and the poor wasted body upon it. Her body, her flesh and blood, now disembodied from her, or she from it. Nowadays she was never quite sure which. Propped up in the bed, weak and pale, the arms gone, the legs gone, brutally amputated in the accident. The torso thin, the breath of life moving but weakly through the fragile frame. The head englobed in the black, beetlelike teleoperator helmet, thick black cables trailing off from it into the forest of machinery discreetly hidden away in the next room. The clinically clean and perfect hospital-white waste-disposal unit strapped between the stumps of her legs, hoses trailing off from it toward the house's waste lines.
And between the two sets of tubes and machinery, the pallid white torso, all that was left of her natural self, all that still functioned on its own. Her stomach pale white, her once-firm breasts flaccid and small. Weakened, shrunken, faded, cut away, shriveled away. In all ways that mattered and by seemingly all means possible, there was less of her than there had been.
The powerbed hummed and clicked, made some internal adjustment. It was far more than a bed, of course —it monitored all her body's vital signs, massaged and stimulated her muscles to prevent excessive atrophy, saw to it that she was nourished on the all-too-frequent occasions when she forgot to feed herself. It kept her body warm, cared for her. It could have given her a spray-jet cleaning as well, in lieu of the daily sponge bath. But Suzanne reserved the task of bathing herself as a personal duty.
Suzanne-Remote moved her strong robot's hands toward the washbasin, picked up the sponge, dipped it in the basin, and reached over to sponge-bathe her own ruined body. She had imagined the daily ritual becoming easier each day, but it never did. The remote glanced toward the T.O. helmet that encased her bio-body's head. There, inside the helmet, her own eyes were watching two small video screens, watching what her remote's eye-cameras saw and transmitted to the helmet. Her living eyes saw what her robot eyes sent them, showed her the image of her own inert body.
Suzanne instinctively tried to move her bio-body's head, move her own eyes to see the remote whose eyes she was using. But all she accomplished was to move the remote's head, make it glance over its shoulder toward an empty corner. It happened every day, and still it frightened her: she tried to look at herself, and saw nothing. A chilling thought, that.
Every morning she found herself caught by the strangeness of seeing herself from the outside, and every morning her imprisoned living eyes tried to look toward the very robot eyes they were seeing through. It was a reflex she could not shake off, though it was as hopeless —and as dizzying—as a child trying to see the back of her head in the mirror by turning her head quickly.
Twice a week the helmet came off. Tuesdays and Fridays a technical nurse came in to check on her T.O. setup and to give her bio-body a full bath. Some small subconscious part of her always expected to be blinded, deaf-
ened, when the helmet came off. Without her link to the robot body, surely the darkness would descend. It was always a shock when the light of the world struck her face, and her eyesight proved to be better without the T.O. Always a shock when her hearing became more real, more direct without the filter of the headphones.
And she could feel, at least a little, with the helmet off. With or without the helmet, she had no sensation whatever below her neck. But when it was off, she could feel her head, her face, her mouth, her throat. The cocooning helmet effectively removed even those sensations, for there was never any change in sensation when the helmet was on. The helmet allowed the free passage of air for breathing, of course, but there was never any variation in air temperature or air pressure.
When the technical nurse came to bathe her and took the helmet off, it was not only her sight and hearing that were seemingly enhanced, but her sense of touch as well. To her sensation-starved senses, the feel of warm water and gentle hands on her face was almost sexual in its intensity.
Suzanne could use a voice command sequence to open the helmet, and use the powerbed systems to spray a cleansing mist of water at her face by herself, but it was not the same. It was the realness of the touch, the connection to the world outside represented by a pair of flesh-and-blood hands, that made the twice-weekly full baths so important.
Every bath day she resolved to leave the helmet off for a while. It was built in as an integral part of the powerbed, and she could put it on and off with a simple voice command. Yes, she would tell herself as the warm water foamed over her, she would leave the helmet off. She could get along with the voice controls on the powerbed. And she could see, and feel.
But removing the T.O. helmet meant she was paralyzed again. Though the powerbed could tilt and swivel in all directions, it could not leave that one room, show her any view but that one lonely window and the house
across the street. The entertainment system was voice-controlled as well, but there were only so many book tapes she could read, so much studying she could do, so many programs she could watch, so much music she could hear before going mad. Soon she would be longing for other walls to see. A simple voice command would reengage the helmet. The servos would whine and the clamshell helmet halves would lift themselves from their recesses and close over her face, and her eyes and ears would belong to the remote once again.
She could not see as well, through the helmet, but she could see more. See her own walled-in back garden, the blue sky above. She could look out her own front window. Feeling very much like a spy in a foreign land that would never welcome her, she could watch the laughing children coming home from school.
Suzanne-Remote dipped her sponge again, and ran it over her body. Done with the bath, the remote lifted Suzanne's body slightly, rolling it on its side to remove the dampened sheet and slide in a fresh one. She draped a clean top sheet over the body, and a warm blanket over the sheet, then stepped back and looked at herself. Only the helmeted head was outside the sheets, and her body looked like a deformed corpse. Which, in a very real sense, it was. And it was frail, its immune system as weakened as every other part of itself. Easy prey to a cold or a virus, susceptible to the slightest chill or illness. One flicker of disease, and the seeming corpse would become a real one.
But it was her own corpse she looked upon, and it lived, and Suzanne Jantille felt herself to be very much alive. She turned her back on the blanketed thing on the bed and left the room. Leaving was always a relief. Stepping out of that room, she immediately lost the very disturbing feeling that there were two of her. Outside that room, she no longer wondered which was her body, or who, exactly, she was.
But down to business, she told herself firmly. She pushed her personal worries away and focused on her
professional situation. There was a job to do. Herbert deserved competent counsel, and she would give it to him, whether she felt ready or not. She was capable on the legal side, yes—she was fully confident of herself there. But yesterday's visit to the police station had reminded her of just how ill-prepared she was as a person, of how little confidence she had in her self. She resolved to do something she had not done since the accident. Something she had feared so much she had not dared even think about it. But now there was no choice.
She decided to take a walk around the block.
And then, if that worked out, to take a walk out of radio range. That was a thought that terrified her. In theory, her remote body could find its way back into range, or find its own way home. But she had never dared try it. If in the room with her bio-body Suzanne had the strange feeling of being in two places at once, then the idea of losing her remote made her think of being no place at all.
Of her soul escaping both remote unit and bio-body, never to return.
And it even scared her that such a loss did not terrify her as much as it once had.
Samantha Crandall felt anxious, and more than a bit depressed. The evening at Phillipe's had been a strangely intriguing nightmare, hinting at excitement and danger yet to come. But the day after dawned with a lesson in anticlimax.
Her story made it onto page one, but below the fold. It got onto the videotext newsfeed, but after the first priority cut. Some newsnets picked it up, others didn't. There were a few phone calls to the paper, but the Pulitzer Prize committee didn't rush-ship her the award, and the world didn't come to an end.
In other words, Sam got the same old dose of letdown she got whenever she broke a big story. And she probably always would feel that letdown, unless she managed
to force an entire government to resign sometime. That idea brought a smile to her face. All right, she told herself. So I have an exaggerated sense of the news media's importance. And my own.
In any event things were quiet in the newsroom, and after last night's weirdness, today's quiet suited Sam's mood fine. Phillipe had gotten to her, left her staring at the ceiling instead of sleeping when she got home. She felt a shiver go down her spine even as she thought of him, and of the evening before. There was something about that man that fascinated her, disturbed her—and it wasn't just his hobby.
To hell with quiet, she decided suddenly. A fit of restlessness overtook her. She felt the need to prowl around a bit.
She looked around the newsroom and spotted her boss in his glassed-in office, sitting at his ease, feet up on his desk. She stood up, went over, and walked through the open doorway. "Talk to me, Gunther," Sam said, plopping herself down on his couch. "Something is going on, and I don't know what."
"Could you be a bit more specific about the topic under discussion?" Gunther asked in a mild voice. Gunther had what had to be his second or third cup of coffee in the hand that wasn't holding the newspaper. Caffeine made Gunther a lot easier to deal with in the morning. "What is it you want to talk about?" he asked.
"About the story," she said. "My story."
"What about it?" Gunther asked.
"I don't know." She turned her palms upward and shrugged vaguely. "Did I do it right? What do I do next?"
"In other words, you're fishing for compliments," Gunther said, a slightly warning tone in his voice.
Sam thought for a moment and then nodded a decided yes. "Yeah. Why not? I think I deserve them."
"Well, lemme see." Gunther set down his coffee and folded his paper back to page one. Starting at the top, he read the whole piece through, his eyes working their
slow and deliberate way over the words. Sam knew perfectly well that he had carefully read the story over at least twice the night before, but she let him be. If he needed to make the show of looking it over now, then let him. "Well, I don't know," he said at last. "You developed the make-Entwhistle-look-dumb side of the story pretty nicely, but I don't really see much balance here."
Her boss's words stunned Sam. If he felt that way about it, why had he personally approved it and run it? "Entwhistle had her chance to make a statement to me," Sam said.
"Oh, come on. It's just me here. You can talk dirty. You went to her press conference to blindside her, and she got blindsided. End of story. You know it, I know it, she knows it, the readers and viewers know it. Out in front of the world you have to pretend you were seeking a fair and balanced story, even if everyone knows better. But not here. Not with me. The real story is that Entwhistle ordered a flaky prosecution, she deserved to be popped one for it, and you popped her one. The end. Let's not pussyfoot around it. Let's talk the second-day story instead."
"Now wait a second!" Sam protested. "You read over that story, signed off on it. Why are you trashing it now?"
"I'm not trashing it—I'm just saying let's take our halos off while we talk it over. Your first-day story was just the ante in this game. Now you're in the hand and the cards are being dealt. I ran that story to give you a nice little tactical position, and now you ought to be thinking about ways to follow up on it."
"I don't get it."
Gunther took a sip of his coffee and went on. "Okay, forget the police beat side of this for a moment. We're deep into politics now, and that works a little differently. You're used to reporting the cops finding the body, maybe later finding the guy who did it, the end. Not this time. Politics have intruded, and that means the rules are different. The chessboard is bigger."
Gunther leaned back in his chair and scratched his woolly head thoughtfully. "Let me back up and take it from the top. In theory, we're supposed to be objective journalists here. Well objectively speaking, we both know Entwhistle is a horse's ass, and we've done all we could to make her look like one in your story. Fine. What I'm saying is, now that you have her on the ropes, looking bad in the paper, now you can call her up, be polite as can be, and get the interview she wouldn't give you before. She has to talk to you now, or look even worse when you report her 'no comment.' That's the door your first-day story opens. So put your cardboard halo back on, get her on the horn, make the speech about seeing all sides of the issue, and get in that door for a real conversation. Let her see that you want her to have her chance to put her side of the story forward.
"Of course, in reality you're hoping she puts her other foot in her mouth, and she knows you know that, but what the hell. Now you can force her to play the game your way.
"Meanwhile, your story has also scored you some points with the opposition. This Suzanne Jantille. She refused to make any comment for the first story. But you made a good-faith effort to contact her, right?"
"Right. Three calls, no return. Her autosecretary claimed she was with a client. Maybe she was. Maybe she fibbed to the autosec." Lying robots or Artlnts were illegal. Not even a dummy box like an autosec could lie. But there was no law against making Artlnt systems gullible. You could lie to your own autosec all you wanted. Sam lied to hers constantly. "My guess was she just didn't want to chat with a reporter. Or maybe she was preparing for the case. Jantille is scheduled to do the bail hearing for Herbert this afternoon."
"And you're there too, I assume," Gunther said.
"I guess," Sam said without much enthusiasm. "It'll just be a media circus. The judge grants bail Herbert is wheeled out, and everyone takes lots of pictures."
"True," Gunther said, "but if it's a circus, you're the
one who pitched the tent. Better go. And make another call to Jantille this morning. After the story today, I'd say she'd have to talk with you too. Set up the appointment."
"Will do, boss." Sam-hesitated. "But there's a complication that I think you should know about. Something maybe we really need to talk about."
Gunther lifted both eyebrows and stared at her. After a long moment, he took his feet off his desk, pulled in his chair toward his desk, and leaned forward toward her, all attention. "Okay, go ahead."
Sam screwed up her mouth and bunched her shoulders together. "Did you know Jantille's a remote person?"
Gunther let out a low whistle. "Oh, bloody hell. No I didn't. Did you?"
"Not until last night when I talked to my source. He mentioned it. I saw her in the police video, of course— but that was pretty low-resolution stuff and I wasn't watching her, I was watching Herbie."
Gunther got a distant look in his eye. "A remote, huh? And I thought this story had enough weird angles as it was." He came back to himself and looked at Sam. "How much you know about remotes?"
"Nothing, really. Just what my source told me last night. I've never met one or talked to one."
"Did you manage to pull down any bio data on Jantille?"
"Ran a few Artlnt searches. Turns out she was born and got a law degree. Not much besides that. There was plenty of play on the accident they were in, but not much we can use directly. The two of them were cruising along, their cab malfunctioned and slammed itself into a wall. Bailey crippled for life, Jantille crippled worse. Both very reclusive since then."
"For which I can't blame them." Gunther shook his head thoughtfully. "Suzanne Jantille a remote. That's going to complicate things. People are sympathetic about powerchairs and crutches—but they're weird about remotes. Worse thai! they are about cyborgs. How about
you? How are you going to feel sitting down to a nice chat with Suzanne Jantille?"
Sam thought back to the night before, and the staring eyes of the disassembled remote unit. She knew the words Gunther wanted to hear: that it wouldn't bother her. But what Gunther always wanted of her most was the truth. "Weird. Very weird indeed."
Gunther nodded. "Fair enough. If you said you weren't going to bat an eye, I wouldn't believe it anyway. Well, look on it as another step in your education."
Sam shook her head. "That's one thing I've gotten since this got dumped in my lap. Plenty of it."
"Plenty of what?"
"Education. I've learned a lot about how much we take for granted."
"For example?"
Sam looked up at the ceiling, took a strand of her hair and wrapped it around her finger, then unwrapped it. "I've learned how hard it is to think."
Gunther looked at her, a smile in his eyes. "That's a mighty fine straight line, but I don't think I'm going to chase it. What do you mean?"
"Just what I say. We can get our machines to do anything these days. Every day, uncrewed spacecraft prep themselves for flight, fuel themselves, boost themselves to orbit, dock with a space station, transfer their cargoes, detach themselves from the station, fly back to perfect landings, and roll themselves into the servicing bays— where they prep themselves for the next flight—all without any human being getting anywhere near the hardware. No one so much as looks up to notice the miracle. Our machines can do all that by themselves— but none of them can ask themselves why are we doing this? None of them can think of a new thing to do. Tell them to do something, and they'll do it brilliantly. But every place I've looked, I've found these subtle limitations on what a robot—or an HTM, or a remote, or an Artlnt can do. But the one theme in those subtle little limits is thought."
"Robots can think/' Gunther objected.
Sam shook her head vigorously. "Not the way we can. We do a special kind of thinking. We think of things to do, not just how to do things. We don't just do problem solving; we do problem finding."
Gunther snorted. "Well you've sure found us a doozy this time." He looked toward her, and seemed to see the anxious look in her eyes. "Hey," he said in a more gentle voice. "Get on back down to earth and do your job, Sam. Don't worry about the foggy bits of philosophy. It's been my experience that if you just live your life right and do your job well, the philosophy will pretty much take care of itself. Get the facts right, and you won't have to worry about seeking Truth." He pointed toward the door and winked. "Now get out to that bail hearing and see who's stirring up today's problems."
Suzanne turned the first corner, and a dog sniffed at her, whined fearfully, and ran away. The little girl down the street behind her was still calling for her mommy to come look at the funny lady. Suzanne knew without looking which cars were on manual drive. They were the ones that slowed as they passed by her.
Suzanne rounded the corner and heard the hiss of hoverjets. A private autocop sentry popped up from behind a row of hedges and moved along with her as she walked past its owner's house. No one really talked about it publicly, but most autocops were programmed to watch cybernetic organisms carefully. Ostracized and vilified, cyborgs had a reputation for being grifters, bums, panhandlers, and worse. They were social pariahs.
But could she even qualify as a cyborg? The cybernetic side she had covered—but there was no part of her out here in the world that could qualify as an organism.
She turned another corner and came around onto her street again, in sight of her own front door. Suzanne-Remote didn't breathe, of course, but she did let out a most realistic sigh of relief when she knew she had done
the hardest part of her trip. Tomorrow she would do it again, and maybe go a bit farther.
But she knew it wasn't going out of range that scared her most. It was the dogs, the children, the staring people, the suspicious cops. If only she had started these confidence-building walks long ago. She would have had time to get over these fears.
But the bail hearing was this afternoon, in an hour. There was no time. She turned and looked toward the street. There, waiting at the curb as instructed, was her relay van, ready to carry her off to the courthouse. How much worse than dogs and children would a packed courtroom be? She dreaded the very idea. How much easier just to go back inside, close the door on them all, and never come out again.
But that could not be.
She turned away from the house, toward the van. The door slid open at her approach. After the merest moment of flickering pause, she stepped inside.
Suzanne could feel her senses fading as the relay van drove farther and farther from her house. She had known it would happen, had experienced it when she had gone to see Herbert—but it was frightening just the same. Her remote body had only had a low-power radio system to transmit the images and sounds from her remote unit to her bio-body, and an equally low-power system to receive commands from her bio-body back to the remote unit. Her range was seriously limited, less than half a mile or so at maximum power.
When the images and sounds from the remote went dead for Suzanne, the remote's instructions on how to walk and talk and what to do—the signals coming from Suzanne—went dead as well. In theory, the remote's onboard logic, which was somewhere between a full-blown robot and a high-end ArtInt, could negotiate such situations for itself, relying where possible on preset programming. Suzanne could program the remote against various
contingencies, even program it to go deliberately out of range and run an errand on its own. But Suzanne was more than a bit reluctant to test such theories and capabilities too far.
The fact remained that if she walked the remote too far away from the house, she would lose contact with it. If all else failed, the remote was supposed to find a dataphone, call home, and plug in over a hardwire link for instructions. That, too, was something Suzanne was not tempted to trust too far.
The relay van solved a lot of those problems. It was a fully licensed automated vehicle, authorized to drive itself on every road in the country. But it could do a lot more than just drive. It carried a highly sophisticated radio and telephone switching system, a system that could perform some clever tricks in order to get comm signals through. Satellite bounces, cellular phone nets, unused sidebands. It could key into practically anything. More importantly, it carried a more powerful transmitter, a more sensitive receiver, and better signal-processing gear than could be squeezed into Suzanne's remote unit. The van could pick up her home-base signal from as far as ten miles away, and was smart enough to clean up a weak or garbled signal substantially. By letting it handle longer-range communications and relay to Suzanne-Remote locally, Suzanne could in theory travel anywhere in the city and still be in control.
Suzanne could also switch off one or more senses or control channels to put more power behind another. She could, for example, shut down her hearing in one ear to give the vision circuits more bandwidth, or drop back to just one camera eye, or slow the scan speed, if there wasn't enough signal space available for two eyes sending at thirty frames a second. She could cut out manual control of the remote unit, let the on-board systems manage the walking around while she contented herself with doing her own talking. In a worst-case scenario, she could fall back to one vision frame a second, straining to hear from one ear mike, trusting the remote unit to maneuver itself while she limited her outgoing control to speech.
She shuddered at the very thought of such a situation. Worst case indeed—for if it got that bad, she would be at the ragged limits of her range, scant feet away from the point where she could drop out of contact with her other self altogether. And she did not want to lose herself.
The relay van pulled into the underground parking lot below the courthouse. Once underground, it headed straight for the charge-parking slot it had reserved, one equipped with a fiber-optic interface line. It pulled into the slot and extruded its charger line. Once assured of a good power source, it extended its optical cable as well. It sent a radio signal back to Suzanne's house that it had a good hardwire contact and linked back over a fiber-optic line, guaranteeing perfect, maximum power signal reception. But then it went one better, patching into the building's teleoperator radio control system. The system was meant to handle the courthouse's cleaning teleoperators, but it would serve to handle the needs of a remote person just as handily. In effect, the relay van converted the building's entire wiring system into an enormous system of transmitting and receiving antennae.
It took less than a second for the van to establish the whole linkup—and suddenly Suzanne's fading sight and hearing were back all the way. Colors were clearer, edges were sharper, all the muddiness went out of the sound.
It was like clearing the cobwebs from a long-empty room, throwing the windows wide, letting in air and light. There could have been no better tonic for Suzanne's morale at that moment. Flush with renewed confidence, she let the van open the door for her and stepped out into the garage. Next stop, a quick visit with Herbert in his cell, to coach him as best she could on how to sit and act. Then, on to the bail hearing in Judge Koe-nig's courtroom.
Ted Peng paused just outside the courtroom door, ignoring the people scurrying to and fro on all sides. He hated all the legal skirmishing that came before the real trial. Bail hearing, preliminary hearing, discovery, jury selection, pretrial motions, pretrial countermotions.
Only after all that deadly dull slogging would they come down to the real trial —and it was the trials themselves that Ted lived for. All the rest were to him meaningless rituals, empty shadow plays and posturings that were for some reason required before the real drama could begin.
Which meant that moments like this were the times he dreaded most. Here, right now, was where all that slogging began. Here, right now, was where he began the long and tedious journey toward his chance at combat.
It didn't help his mood that he was walking in with orders to follow a strategy that seemed custom-made for disaster. Never mind. He squared his shoulders and stepped inside the courtroom.
The place was a madhouse, as he might have expected. That damn news story this morning had pulled them all in here. Not just the reporters—though that would have been bad enough—but all the street loonies you could ask for tossed into the mix as well. Everyone seemed to be arguing with everyone else—over who got what seat, over the merits of the case, over the bum call the umpire made in last night's ball game.
But there was another kind of noise as well. Somewhere below the yammering voices Ted could pick out the whirring of gears and the hum of power machinery. The air in the room was close, jammed with the complicated odor of too many emotional people in an enclosed space. But there was another, underlying tang to the air, made of lubricating oil, of the cooked ozone taste that air gets around overworked electric motors, of unwashed bodies. Ted scanned the crowd and saw more than one flash of metal where skin should have been. Cyborgs. And very definitely low-end types. Lots of them.
Ted's stomach tightened. It was all very good to try to
force reform of cyborging, force society to decide on more equitable ways to distribute spare parts to people. But that was all academic theory. This room full of half-mechanical people was real. The sight of people plugged together with machines made him queasy. It was a conscious effort of will not to think of them as monsters, as innately inferior, as bad people.
He shoved his way forward to the prosecution table and sat down. He took a moment and concentrated on calming himself. Whatever trouble he had dealing with the sights and sounds and smells of cyborged people was just too bad. He had a job to do, and could not let such thoughts . . . disturb him.
Doing the job would not have been easy even without such disturbances. Entwhistle had seen to that by dictating every detail of the prosecution. And dictating them with an utter and perfect ineptitude.
No, Ted decided as he took his seat. Ineptitude was the wrong word. She had done her job of forecasting well and skillfully, used her skills and tools well. It was just that she had used the wrong tools for the job. Artificial intelligence research systems were all very well—but they could only find what they were sent to look for. They could only find what was there in the record. Not what was in the heart.
There was evidence to support Entwhistle's projections, in the transcript of Suzanne Jantille's trial appearances, and in Jantille's personality profile. Though the profile was derived from a psych system's ArtInt working from third-hand sources, it was more than likely accurate, as far as it went. Ted did not deny the ArtInts were good at what they did. The problem was that Entwhistle was misapplying the results of their work.
The ArtInts were predicting that Jantille would try to get the case thrown out, object to every breath the prosecution takes, attempt to torpedo the case before it ever reached trial. The Suzanne Jantille found in the public record, the one who handled those cases and those incidents in those ways, the one who relied on these tactics in these trials—yes, if this were just an average case for her, that person would play this trial the way Entwhistle's ArtInts predicted.
But that Jantille did not exist anymore, and the present version of that person had never made any public statement or court appearance for the ArtInts to chew on. Jantille's record of public behavior, the only thing available to the ArtInts, predated the accident that changed her life.
Ted Peng knew, Julia Entwhistle should have known, that the Suzanne Jantille who emerged from that accident, hideously injured, left with a crippled husband who died shortly thereafter, would have a different viewpoint on life. How could she not? After all, what little of life she had seen since then she had seen through a robot body.
Perhaps even more importantly, this was not just a cut-and-dried case for Jantille. Her husband had been killed—and according to the state's bizarre contention, it was her husband that was on trial for the crime.
But there was one other factor, one that was unchanged in Suzanne Jantille. If it had changed, she would not have taken the case. That factor was what told Ted that she would never take the easy way out on this case. Just in the act of signing on, Suzanne Jantille had told the world she still had an ego. Every trial lawyer did. The smart ones were aware of that fact, and used the knowledge in their calculations.
This case was a lawmaker. It could ring down the ages, could change the way people lived and acted and thought for decades to come. It asked the most dangerous question of all: What is a human being? This case demanded an answer to a whole new side of that ancient question, and Suzanne Jantille was smart enough to see that. That she was taking the case, going into this battlefield, could only mean that she saw it in that light.
That she still had the ego, the gumption, the downright gall a good lawyer needed to weigh into something that big and complicated.
And Ted knew she would not have come out of her seclusion to confront issues this big if her intention was simply to short-circuit the proceeding and go home.
And here he stood, with a strategy geared toward a case his opponent would not be making.
There was a sudden bustling hush of activity from the back of the courtroom. Ted Peng turned, and saw Suzanne Jantille—or at least the robot shell that played her part—walking into the courtroom. Another murmur of noise, and Ted turned his head toward the front of the room. With a subdued whirring, the defendant, Herbert the HMU, wheeled his way into court. Ted looked from one of them to the other with real curiosity. These were his opponents. He had never seen either of them in the flesh before—if he could use that expression when they had not so much as a scrap of flesh between them.
For a bizarre moment, Ted Peng was reminded of a church wedding—the groom slipping in by a side door in the front as the bride marches down the aisle, front and center. The marriage of an oversize vacuum cleaner to a plastic doll standing in for a woman. But then he realized how close to the mark that strange thought was. For, after all, it was the contention of the prosecution that Herbert was David Bailey. If so, then these two collections of machinery were husband and wife.
Herbert brought himself around to the side of the defense table to Ted's left. He locked the pivots in place on his rear leg pair and swung his body around to an upright position, folding his two forward leg pairs against his body. His rear legs bent the way human legs did, and he folded them back underneath him and knelt, the base of his cylindrical body almost touching the floor, with the head end almost six feet off the ground. Two camera eyes on flexcables spooled out from his top end and pointed forward. No doubt it was the closest approximation of human sitting posture he could manage.
And with that, the stage was set. The first of the pretrial rituals could begin.
All they were waiting on was the judge.
Interlude
Slowly, the fog is lifting off me again. I am learning to see and understand, to control my own mind, to stave off the strange invasions of mechanical thought. Each time, it seems, I survive a little longer. But there are limits I cannot pass, absolute barriers I cannot broach, or even attempt, if I am to survive. Yet I cannot recall what those limits are.
I find myself sitting in a strange room, a room that I know I have never seen before. And yet the pattern of the place is familiar, the arrangement of tables and chairs, the odd high desk I sit facing, the murmuring crowd of people behind me. I feel I ought to know what sort of place this is. There is a ritual performed here, one I ought to understand. But I do not. Do not. Set label MEMLIB positive.
Do while label MEMLIB positive.
Procedure: Memory Library Call. Input: Room dimensions, appearance.
Query: Room type and use.
Call to MEMLIB negative.
End do.
Dammit, no! That's not the way I think.
Whoever I am. But I cannot remember.
What place is this? The question seems vitally important. I think back/playback the last few minutes, and remember the person next to me ordering me to follow her to this place, and sit a certain way. But that tells me nothing.
Obeying orders is easy. Too easy. Another, artificial mind that I do not control is wired into my body. I am not. It hears the orders and obeys them, and I have no control. Reflex/instinct/high-priority programming compels that lower, mechanical self. My memory tells me that this order following is as automatic, as uncontrollable, as the lungs breathing or the heart beating.
But I have no lungs. I have no heart. I do not precisely recall what those things are. I do not know why, or how they come to mind. But I know that the part of myself that should have been given over to controlling such autonomic things is instead subsumed by this other self.
I have no control. I am merely a passenger, an observer.
I should know what lungs are, what a heart is. But these are mere words now, labels that I can no longer attach to meanings. I should know what this room is, know what label attaches to it, but I do not. Obedience is far too easy, but memory is far too hard. In playback mode, I can remember what happened to me, but not what it meant, what I thought about it or felt about it. Or said about it.
Said? Set label MEMLIB positive. Do while label MEMLIB positive. Procedure: Memory Library Call. Input: Define word "said." Call to MEMLIB positive. Input: Define verb "to say." To speak, to utter words, the action of speech. To express an idea, thought, observation, or question verbally.
Call to MEMLIB positive.
Yes! Yes! I remember now, fleetingly, a gossamer thread of memory that I can sense will break soon. Action, talking, saying. I feel a massive, urgent need to communicate—to break out of the shell I am trapped in and contact the outside world, to take over my body and act—that overwhelms me. Input: How do I speak?
Error code. Data Scramble. Data Lock set. Read/write prohibited. Call to MEMLIB negative. End do. Clear memory: Reset.
And the moment is gone, the chain of thought broken, and I can feel my mind being forcibly emptied, know that I must face the long struggle back up to consciousness again. This lower self will smash me down again and again, defend itself, whenever I try to make the effort of acting for myself. It will erase me again. I can feel it, sense it happen. I forget who what where I am—I am lost I am
Set label MEMLIB positive. Do while label MEMLIB positive. Procedure: Memory Library Call. Input: Who am I? Call to MEMLIB negative. End do.
CHAPTER 7 PERSONAL RECOGNIZANCE
Sam Crandall checked her watch and swore to herself. She was late, as usual. She pushed open the door and made her way into the courtroom, just as Suzanne Jantille and Herbert were settling themselves in. She shoved her way forward into the seats reserved for the press, stage-whispering a string of apologies as she stepped over a whole forest of feet.
She sat down and congratulated herself on at least beating the judge into court. Probably the old boy was stalling a bit. Sam was willing to bet that Judge Arthur Davis Koenig was not a happy man today.
Sam didn't do much courtroom coverage, but most of it was this sort of very preliminary hearing, the small change of justice in Washington—and that meant much of what she saw in a courtroom involved Judge Koenig.
Koenig almost never judged an actual case. He set bails, granted continuances, threw out weak cases, routed the worthy ones to their appropriate venues.
To Sam, Koenig was obviously a man who had found a stable niche in life, who did not like things to change— and it was cases like this that put his cherished stability at risk. Koenig saw his courtroom as an initial sorting
mechanism for the great machinery of Justice, a small but needed cog in the machinery of Law. It was a place to get things in order, where the rough-and-tumble confusion of life could be sorted into some version of judicial order, a place for tidying up the small details of placement and procedure. His court was decidedly not a place where great issues were to be decided. Not if Judge Koenig could help it.
Except there were days, and there were times, when he couldn't help it. And today was one of them. Today was a day when a judge at a bail hearing would be forced to decide questions at the very heart of the law. What is a person? To whom or what are owed the most basic human rights? It would be easy to avoid those questions, to throw the whole case out. It would take a bit more character to treat it as a case in law and not a sideshow.
Sam found herself wondering if Koenig was up to the challenge.
The door leading to his chambers popped open and Judge Koenig came bustling out, the clerk rushing through her lines, declaring the court to be in session, abjuring all to rise for the honorable Judge Arthur Davis Koenig, and for all those with business before the court to draw nigh and give their close attention. Koenig mounted the stairs to his chair behind the bench, a look of obvious annoyance on his face. He sat there, staring straight ahead, as the clerk read the case number and name, the words rattling off her tongue in rapid-fire cadences that seemed to shake all the meaning out of them. The clerk handed the case datapack to Koenig and sat down, and Suzanne leaned forward, waiting for the judge to speak.
But the wait was a long one. Koenig took the pack, plugged it into his terminal, and sat there, moodily glaring down at the flatview set into the surface of his desktop. "The United States of America versus Herbert Hoover the Vacuum Cleaner," he said at last, his voice sharp and petulant. "That's not what it says here, Mr. Peng, but that's what this case boils down to, isn't it?"
Sam turned and watched as Ted stood up. Obviously he was a bit thrown by the judge's blatant hostility. Sam glanced at Suzanne Jantille. Suzanne was leaning forward eagerly, as if harboring the tiny hope that things might be breaking her way. "Ah, Your Honor," Peng said, "I can't agree with that. If that were the true nature of the case, the U.S. Attorney's office would not be wasting your time with it."
"I see. But since you are wasting my time with this nonsense, that implies that things are different." His eyes locked on Peng, the judge spoke to Suzanne without looking at her. "Madame Jantille, jump right in here with a motion to throw this case out if you like. You might find me more than willing to accommodate you."
Suzanne rose and faced the judge. "Throw it out on what grounds, Your Honor?"
The judge's head snapped around and his gimlet eyes bored into her. "On what grounds? Madame Jantille, there are so many I hardly know which to choose. For starters, because your so-called client can't be tried in a court of law any more than my gavel can."
Suzanne drew herself up to her full height and spoke in a firm voice. "No, sir, I disagree. I believe my client has every right to a trial."
There was a buzz and a murmur around the courtroom.
"Madame Jantille, I am having some trouble believing this. I am inviting you to ask that I let your client—or your vacuum cleaner, whatever—go free. That you would refuse to do so makes me wonder just how competent his—its—counsel is. In fact, I'm very much tempted to throw this case out right now with or without your petition to that effect, because Herbert cannot be vested with legal standing in this or any other court."
"Your Honor, I must protest," Suzanne said. "I strongly believe that the prosecution has mounted this attack on my client for the sole purpose of securing just such a ruling. My client—I would venture to say any person—would choose to risk conviction on groundless
charges of murder rather than to be officially certified not to exist. A convicted murderer is still entitled to the full protection of the law, entitled to file appeals, petition for parole or pardon, fully vested with protection against cruel and unusual punishment or the arbitrary interference of the state. A person cannot be sliced up by government scientists curious to see how he works, or seized and destroyed as alleged contraband mindloading equipment. Even, as seems highly unlikely, if he can escape whatever the government has in store, my client would be officially certified a nonperson. Any person who chooses to do so could attack him, steal parts off him, harm him without real fear of serious legal sanction.
"If my client is declared a heap of scrap metal, he will have no protection whatsoever against any of these things. Declare him a nonperson, and the only real question will be which of these fates he falls prey to first. How could he escape them all when any who attack him at worst might be charged with vandalism? Perhaps not even that, since he might well be regarded as abandoned propertv
"Abandoned? I was under the impression that he belonged to you, Madame Jantille."
"I deny such ownership as morally reprehensible and legally impossible under the terms of the Thirteenth Amendment," Suzanne said quie:
Judge Koenig frowned for a moment. "The Thirteenth? I don't quite—"
"The prohibition against slavery. Your Honor," Suzanne said in a half whisper. "It doesn't come up much these days. Though if the prosecution has its fva might stan to again." Suzanne turned and looked straight at Theodore Peng, coolly returning his startled gaze. "Whatever the decision of this court, / must regard my client as a person, not an object. I will seek to protect him with all the means at my disposal. But believing him to be a person, I cannot own him, or even countenance any attempt to set up a legal charade of owning him. To do so would not only be immoral, and to my mind ille-
gal, but counterproductive. If I were to acquiesce in even the slightest way to any attempt to treat my client as a nonperson, I would hopelessly damage my ability to provide him effective legal representation during any subsequent appeal. Therefore I cannot claim him as property. Therefore if he were ruled a nonperson, the state would be forced to regard him as unclaimed or abandoned property. In the eyes of the law, virtually anyone could then do to him what they liked. Throw this case out, and that is his fate."
"You seem willing to lay out a great deal of your strategy in open court/' Judge Koenig said. "Do you deem that wise?"
"Revealing my strategy is in itself part of my strategy, Your Honor/' Suzanne said. "In any event, Your Honor, the risk of a murder conviction and a prison term is scarcely frightening up against the prospect of literally being torn apart. Though I might add that I can hardly see how one person can be both murderer and victim in the same crime."
"An excellent point. Could you clarify that slight ambiguity, Mr. Peng?"
Ted Peng turned toward the judge. "Your Honor, the state stands ready to prove that it was the actions of this robot that ultimately caused the death of David Bailey. There can be no doubt that David Bailey is dead, under any legal or medical definition ever derived. His heart has stopped, he has ceased breathing, his brainwave functions are flat. He is legally dead by any measure ever established. No one disputes that. That he died by the actions of this robot we can prove."
Samantha Crandall, watching from the first row of seats, raised an eyebrow at that assertion. Proof that Herbert did the deed, and not David Bailey himself? There was nothing to suggest it in the evidence that she had seen. And unless Phil was holding back on her, or someone was holding back on him, she had seen everything the U.S. Attorney had. Sam glanced over at Suzanne Jantille. Her plastic face revealed no expression, but
something about her posture, the way she held herself, told Sam that Suzanne didn't buy it either. But never mind. It was a point to worry about later. She focused on what the judge was saying.
"Mr. Peng, people are killed by defective machines every day of the week. That doesn't make their deaths murders. Murder is a deliberate act, not an industrial accident."
"I agree, Your Honor. But if David Bailey is dead, and the robot did kill him, we need only prove that this robot is legally a person to establish that a murder took place."
"Even though, according to your own contention, the mind inside this robot, the part that allegedly makes him human, belongs to the murdered man?" Suzanne asked. "Even though, according to your own contention, this robot in effect is the murdered man?"
Ted Peng glanced toward Herbert. The massive piece of machinery was just sitting there, watching it all happen, its tentacled eyes swiveling back and forth now and then. Peng licked his dry lips, and at last seemed to come to some sort of decision. "I wasn't planning on it, Your Honor, but I suppose I have to reveal a bit of my strategy as well. We have three arguments we intend to advance on that point. One, that the bald facts of the case establish guilt all by themselves. Once you accept for the sake of argument that Herbert is human, a sentient being, then Bailey's death must be murder. A man died as a result of a deliberate attack. Herbert perpetrated that deliberate attack. Death by deliberate attack is a murder. QED, this death was a murder by all existing legal definitions. The issue of the criminal's identity does not arise, does not change the fact of the crime, and is not contemplated in the statutes concerning themselves with such crimes.
"Two, suicide, self-murder, is widely considered a crime, albeit an unpunishable one. Note I refer to suicide, not attempted suicide. There have at times been statutes specifically proscribing suicide. Such a law is indeed on the books in the District of Columbia at this time, passed
into law two years ago. Thus, statutes concerning themselves with the possibility of self-murder have viewed the act as a crime."
Sam made a note to herself. That statute citation wagged a red flag in her face. Peng wouldn't dare cite it if it wasn't true, but something about it sounded damned suspicious.
Peng was still talking. "Three, that the very act of transferring a mind from one place, one vessel, to another must transform the mind that moves. We will prove that Herbert is a sentient, self-volitional being, beyond any doubt. But we will also produce experts who will demonstrate, far beyond any reasonable doubt, that the act of transferral into a new body, and the act of living in such a body—a body with different senses, radically different limbs, different needs, placing the mind in a brain with a new and different basic structure—must change the individual in question so much that she or he must be regarded as a new person."
Sam looked again at Suzanne Jantille. That one had got her right between the eyes, no doubt about it. This must be the first time she had ever heard of any such idea. What would that be like? Sam wondered. To be offered up the hope, however slim, however absurd, that your dead husband was not gone, to hear that the person you had loved was not gone forever. What a seductive hope to dangle before anyone as desperately lonely as Suzanne Jantille must be. And to be told now, in open court, that it was not so, that her client was a stranger, and not her soulmate. What a slap in the face it must be.
Judge Koenig seemed less shocked than Suzanne, but not much happier. "This is a case in law, Mr. Peng, not a course in metaphysics."
"I'm very much afraid it's both, Your Honor," Peng said, his voice flat and calm.
Judge Koenig stared long and hard at Peng, and looked at Suzanne, then back at Peng. "Mr. Peng, Madame Jantille. You are both major nuisances and I do not
enjoy dealing with you. Ten-minute recess so I can think this thing through."
He banged his gavel, stood up, and retreated to his chambers so hurriedly that no one had a chance to rise.
Sam let out a heavy sigh and shook her head. She felt a knot in the pit of her stomach.
This was it, she realized. Not only for Jantille and Peng and Herbert—but for her, for Samantha Crandall. If Judge Koenig came back and threw the case out, her story was over too.
An odd, flickering thought flitted through her mind; without this case, there would be no reason to see Phil Sanders again—and she very much wanted to see him again.
But her first visit had not been altogether pleasant, to put it mildly. She found herself deeply surprised by how much she wanted to go there again. Was there really that much of herself that she recognized in Phil? It was a disturbing thought, a whole series of disturbing thoughts.
The door to the judge's chamber popped back open and Judge Koenig bustled his way back up to his seat behind the bench. He settled himself in and looked around, obviously not a happy man. "Very well, let's get this over with," he growled. Peng and Jantille stood up. "Mr. Peng. There is just barely enough logic in your words, and just barely sufficient prima facie evidence, that I cannot throw this monstrosity out on its ear. Not without more basis than I have been offered." He turned to look toward the defense table. "Madame Jantille—you do not wish to offer any motion to dismiss? You can offer me no grounds whatsoever that I could use to throw this case out?"
"None that would not prejudice the very concept of my client's humanity."
"His humanity. Madame Jantille, your client is a mute, six-foot-tall metal cylinder on wheels."
"Nonetheless, Your Honor. This case must he judged
on the bets, not on the physical appearance ol the defendant."
The judge looked from one lawyer to the other. "In short, both of you insist on proceeding with this charade, and I can find no grounds on which to prevent it. And, I am reluctantly forced to admit, it may well not be a charade. There may be something to the incredible statements I have heard today.
"Indeed, it seems clear that the issue around which all else revolves is the question of Herbert's humanity. It is the question that must be decided. Should it be established that he is a machine, the facts of the criminal case are rendered moot. Only a person can stand trial. If it is established that he is, somehow, human in the eyes of the law, then an eventual trial cannot be bogged down debating that point and will be forced to focus on the facts of the case. And I might add, Mr. Peng, that I see some credence in Madame Jantille's theory that your office is not as utterly convinced of Herbert's humanity as it claims to be. Be forewarned that any attempt by your office to reverse course at a later date and hold that the defendant is not human will likely not go down well in front of any judge / know. If this goes to trial, it will go forward with the explicit stipulation by all parties that Herbert is human."
"Ah, yes, sir, Your Honor," Peng said, looking more and more unhappy. Sam shook her head. Entwhistle was not going to be a happy woman.
Koenig nodded dourly at Ted Peng and then went on. "We have an adversarial system of justice in this country, but this time it seems to me that adversarial system is about to be turned on its head. Both advocates are claiming the same central thesis—a thesis that I, the judge, do not find convincing. But I am forced to grant that it is remotely possible that Mr. David Bailey's mind was transposed into Herbert's body.
"Today is Wednesday. I will hold an evidentiary hearing next Monday, at ten in the morning, at which time I expect to see both defense and prosecution ready to present convincing evidence to buttress this claim of Herbert's humanity. If one or both of you do so, we will set a
trial date then. If both of you fail to present such evidence, I will throw this case straight out the window, never to be seen again. And if I see any suggestion that either side is trying to create a legal fiction here, perhaps to advance some cause outside the issues of this case, I will not be happy. I will issue contempt citations on the spot.
"I note that today's proceeding was supposed to be a bail hearing—but just as only people are liable to legal action, only people are entitled to bail. As no one has yet determined if Herbert is vested with such entitlement, as Herbert seems unlikely to be a danger to the community, and there is certainly no danger of flight, bail is waived— and therefore we can sidestep the problem for the moment. Madame Jantille, Herbert is released on his personal recognizance and to your responsibility. Keep him out of trouble for the next five days. That is all."
Koenig snapped down his gavel and turned toward his clerk. "Next case," he said.
CHAPTER 8
BLACK AND WHITE
ON A FIELD OF GREY
Suzanne Jantille stepped out of the courtroom, blundering her way through the press of faces and bodies, all of them seeming to clamor for her attention. She felt detached, disconnected, even more so than usual. Surrounded by a throng of bodies, she felt more cut off from the world than she had in a long time.
Herbert was back down on all six legs again, rolling along beside her, an intimidating enough presence to keep most of the crowd at bay. "Stay close, Herbert," she told him, her mind very much on other things. She plowed her way through the throng and got to the elevator. Mercifully, its doors sprang open the moment she pressed the button. She hurried in and urged Herbert in behind her. His long body effectively blocked the way of anyone else seeking to ride downstairs with them. Good to be alone. At least more or less. She glanced downward at Herbert's silent bulk, stoic and unruffled. Who was in there? Her husband? No one at all? A stranger formed up out of the bits and pieces of David's soul? Who, exactly, was she defending?
She found herself edging away from Herbert, getting as far into a corner as she could.
The door of the elevator slid open, and the two of them stepped out into the parking garage. She led him to her relay van, watched as the larger machine opened its cargo doors to let the smaller machine climb inside. With Herbert safely aboard, she stepped inside herself and sat down. And found herself staring at the cleaning robot, its gleaming beige cylindrical body hunched over in the corner. Remarkable what a few words in court could do. A few sentences from Peng, and Herbert's presence, taken so long for granted, was now deeply disturbing to her. She stared again at his utterly unhuman body, and wondered with just what or whom she was locked in this van.
The van unplugged itself, cutting its connection to the building's control systems and the fiber-optic link home. Suzanne's senses faded away almost entirely.
This time, for some reason, the loss did not frighten her. Perhaps because she had so much else on her mind. Her vision and hearing drifted away, unnoticed. It was like being wrapped in cotton gauze, a cloud of insulating fuzz interposing itself between her thoughts and the disturbances of the outside world. It served to remind her that she was truly not here, that even if Herbert did go mad, here and now, savagely attack her, she herself was in reality miles away, back home, operating this robot body of hers by remote control. She smiled to herself. The mental image of being attacked by Herbert the vacuum cleaner was so absurd as to brush away all fears.
At least the court appearance was over. She allowed herself a sigh of relief—and her robot-self dutifully echoed the sigh through its speech system. Whatever the reason for it, Suzanne was glad for the feeling of comfort, of safety. It gave her time to think, time to face the surprising knowledge that she had forgotten things about being a trial lawyer. Not that her mind was going or that her memory was fading—nothing like that. It was something more subtle, a question of textures and moods and
sensations. Like staying inside a sanitized environment
for too long, and forgetting the feel of the wind on bare
skin, or feeling the warm sun for the first time after a long grey winter of cold.
It was not the knowledge of law, but the emotions, the undercurrents of the courtroom that she had forgotten. She had lost some of her reflexive knowledge of the court's rhythm and cycle.
The hurry-up-and-wait of it she had remembered, the endless delays before anything was accomplished. But the startling focus of human energy, the sensation of being in a whole room full of people with a stake in a matter that was literally life or death, the utter sense of being in the middle of the truest life-or-death drama. She had forgotten how it all felt, lost touch with it as she lost touch with the world. She had let it fade away as her practical skill in dealing with people had withered away.
And the people. The people in that room. Suzanne's remote senses were limited to sight and sound, but somehow, even so, she had seemed to smell their eager attention, taste the muffled anger in the cyborg voices. Those cyborgs, those people, wanted something of the law, and wanted it badly—and they wanted it of her. They were looking toward her, willing that she force the law to accept that cyborgs were fully and truly people. That was the most exciting and frightening part of it. Somehow, suddenly, she was the focus of their needs and desires. They knew a loss in this case would be the first step toward society treating them all like machines. And they knew she was the one person in a position to stop it.
But it was more than just the mere pressure of all those eyes on her, all those emotions dependent on her. There was more that she had forgotten. She had forgotten the breathtaking speed with which things could happen, the way the bang of a gavel could turn things upside down, instantly melt out reality and recrystallize it in a new form. Her situation was drastically different from what it had been a few hours before. Now she was called upon to prove something she did not believe—that Herbert was human. Certainly that was nothing new for a
trial lawyer—but just as certainly these circumstances were a little tricky.
Well, she had wanted to paint Peng into a corner, and she had done it. Her strategy had worked, perhaps a little too well; she hadn't counted on getting painted into the same corner herself. Damn the judge and his ruling—he had dropped the burden of proof on Peng and herself. It was all very well to get up there and make speeches about Herbert being human. But how the hell was she going to do it?
It was a relief to get to her house, to ride the relay van into the garage, watch it dock in its parking slot, watch the garage door close after her. What a pleasure to step out of the van and be home.
When she was tired, Suzanne often began to lose a bit of her identity with the remote unit. The comforting illusion that she was the remote unit would begin to fade away. She certainly had that feeling now. She no longer was the machine body. When the day began to wear on, her body seemed to remember—and resent—the fact that it was acting out its work through a fancy remote-control system. She no longer was the remote. Instead it was just a machine she was riding—or was it that the machine was riding her? The robot body became less the vessel of her soul and more of an encumbrance, an unwieldy carapace she longed to escape. It was a vehicle she needed to operate, not an extension of her self.
And now, at the end of this long day, it felt like a particularly large and cumbersome vehicle to pilot. The thought of doing the last leg, of getting this mechanical monstrosity upstairs, seemed utterly exhausting.
She paused by the side of the van, vacantly watching as Herbert disembarked. After a moment she came back to herself and walked the remote unit inside, Herbert following along behind. Suzanne-Remote paused for a moment. Herbert? Had she ordered him to follow her? She didn't recall having done so. Not that she minded him tagging along, but shouldn't he have made his way down to his charge slot in the basement? Never mind, it didn't matter.
She was much more interested in the chance to power down the remote and get out from behind the cameras and microphones to look at the real world through her own eyes—and even to close her own eyes and not look at anything. Of course, she could have powered the robot body down at any time once the remote was in the relay van. Her robot body would have made it home. But that was not the sort of chance Suzanne preferred to take. It was perhaps irrational not to trust her body to make it home without her—but Suzanne wasn't much concerned by rationality when it came to taking care of her eyes and ears, her only means of functioning in the outside world.
She made her way to the grand staircase, the one Herbert had trundled down on the day this began, the day they arrested him. But never mind that now. She was a tired mind in a tireless robotic body, in no mood to think. To the top of the landing. Turn right. To the top of the upper stairs. Turn right down the hall. Open the door.
Suzanne Jantille-Remote walked in on herself, and felt once again the disorienting sensation of seeing her flesh-and-blood body from the outside. She noticed that she had left the window open, and there was a breeze blowing in across the powerbed. She thought to close it before checking on her bio-body, but even that effort, even the routine task of checking on her bio-body's well-being, seemed far too much for her fatigued mind. Moving that remote body around was work, and she was tired.
It could wait. It could all wait. She walked the remote back into its charging chair and sat it down.
"Attention, helmet," she said. "Disengage remote system, open helmet." And that was it. Speak the blessed voice commands that unbuttoned the teleoperator helmet and you were out of the damned machine, all effort at an end.
The big black helmet split open down its centerline like the carapace of a chrysalis breaking neatly open, revealing Suzanne's pale white face inside. The two halves of the helmet receded into niches in the powerbed and slid out of sight.
Suzanne blinked in surprise.
The room was cold, dark. Well, perhaps only cool and dim, a somewhat cloudy and cool June evening, but the world had seemed bright and warm from the remote's point of view. The remote's vision system had shifted to enhanced night-vision mode without her noticing, and she had simply not noticed the temperature in her bedroom while concentrating on the court appearance and the task of working the remote body. She had only a limited ability to sense temperatures below the neck anyway, and she had simply managed to disregard whatever sensations she did get. The shock of transition made the change seem more dramatic. Now Suzanne realized that her bio-body had been cold for a long time. Why hadn't she set the powerbed to control the room temperature? How had she not noticed the cold? How could she have become that detached from her flesh-and-blood self, even as she was feeling detached from the remote? If she lost track of both bodies, where was she?
A shiver coursed through her body, partly from the cold, and partly from her unsettling thoughts.
She ought to do something about the cold. That window certainly ought to be closed. And she could do with an extra blanket. But even ordering the room to close the window, even rousing her remote to get a blanket, would require effort, and she was exhausted. She sighed and closed her eyes. It was too much effort, too much to do . . .
Her mind exhausted, Suzanne drifted off to sleep before she could care for herself. She slept on, barely mindful of her body shivering in the cold night.
Sam Crandall followed Ted Peng into his office and looked around with more than a little apprehension. She had never felt at ease in a government office, especially a government lawyer's office. She always harbored the nagging fear that they would catch her this time. Who "they" were, and what they would catch her at, she didn't quite know, but that didn't make her feel any more at ease.
On the bright side, Entwhistle wasn't willing to see her, at least not yet. That suited Sam fine. Despite all Gunther's tactics and theories, Sam was not at all thrilled about interviewing a powerful woman she had publicly embarrassed.
On the other hand, Ted Peng, a somewhat less intimidating target for an interview, had been absolutely eager to talk. Judging by his behavior as he welcomed her and steered her into a seat, he was ready to talk at a bit more length than she had bargained for.
But eager or cautious, the result was going to be the same. Even as she settled herself into her seat and watched him sit down in his, she knew what she was going to get. The routine stuff that she had heard from defense and prosecution attorneys a hundred times before. She pulled her notetaker from her purse and set it up to transcribe and record.
Just for a moment, she looked at the little gadget and envisioned what those notes would be like, envisioned what the story she would write from them would be. The U.S. Attorney's office today expressed confidence that the David Bailey case, now in preliminary hearings, would come to trial and feels certain of victory in that trial. The government's trial lawyer, Theodore Peng, said the government's evidence was solid, incontrovertible, blah, blah, blah.
In short, the same old hill of beans. She could have written it already, and not changed a comma after the interview. Prosecutors always had to say that sort of thing before a trial. Peng could not speak in anything more than generalities, because he had to protect his
case, avoid revealing anything to Sam that he would not want Suzanne Jantille to know.
Maybe that was all Peng could give her at this point without blowing the case, but it wasn't enough. Sam looked again at the notetaker. The hell with it. She put it away and looked toward her host.
"Let's not do it that way/' she said.
Ted Peng looked at her in surprise. "What do you mean?"
"You like your case, right? You've got all the evidence you need, and you feel sure you'll go to trial and win. I could use just those words in the story and you'd be happy, right?"
"Ah, well, yeah. Right."
"Okay, so I can put that in the paper tomorrow without both of us wasting our time. I don't feel like asking questions you can't answer, and you don't feel like hearing them. Right again?"
Peng looked at her suspiciously. "I suppose."
"So let's just pretend we've had the cut and dried interview, and I'll put the standard boilerplate in the paper anyway. Since you've got to be recording this conversation anyway, you're still protected against my misquoting you."
Ted Peng began to crack a smile. "Okay once again. But if we don't have the standard interview, what do we have instead?"
"A conversation. An old-fashioned chat, completely off the record."
"A conversation about what?"
"Anything you want. Whatever's on your mind. How the Senators are doing this year. Your last vacation. The case. Your cat."
"I have a dog, not a cat. What's the point of this?"
"I learn about you. I get to know you, you get to know me. Odds are this case is going to be around here for a while. You and I are going to be in each Other's for a while. I'll be writing about you. tal i you, all
that time, and we'll be ablr isl the Standard quote
nonsense to what's really going on. If I get some feel for who and what you are today, I can do all that better tomorrow."
Ted looked at her for a long moment. "Off the record? Anything that's on my mind? I'm a government lawyer. We don't talk in broad terms."
"Give it a try. Off the record. Your view of the world, in broad strokes and general terms, as a person, not as a government employee. For example, what are you thinking about right now?"
He grinned wickedly. "Ya got me. I'm thinking about the case, and what's wrong with handling these things the way we do."
"What things?"
"Social issues, questions of rights and responsibilities. That broad and general enough for you?"
"Plenty. That's the kind of stuff that will do me some good. But what's wrong with how we do things?"
Theodore Peng leaned back in his government-issue chair and put his feet up on his government-issue desk to regard the government-issue ceiling with a faraway look. "We've got this great tool, the judicial system, and we're using it for the wrong things. And when we do that, the system doesn't work very well. To put it another way, when the judicial system is misapplied, it is very good at doing something most people don't want it to do."
"And what's that?"
"It can take any two opinions, any two ideas, any two people, and turn them into adversaries where one has to win and the other lose.
"People talk about a spectrum of opinion, and the image that puts in my mind is a strip of ground, with one person standing on one end, and another standing on the other. Both or either of them can walk toward the center, meet somewhere in the middle ground between them.
"But there is no middle ground in a court case. It gets cut away, and those two people are suddenly making their stands on bits of ground that have no connection
with each other. Try a case in a court of law, and, usually, there is no place for compromise. One person must win, the other must lose. The judge or the jury has to find for one party and against the other. The judge can't find for both, or neither, or rule that both sides have their points. One side is right, the other is wrong.
"Now, in a criminal case, that's usually a pretty good way to get at the truth. The defendant says he's innocent, the prosecutor says he's guilty. One of them is right, one of them is wrong. But even there you get grey areas. Extenuating circumstances, states of mind, rules of evidence and cross-examination can fuzz the line, cloud the issue, render things uncertain. I have tried a hundred cases when I've spent the whole trial utterly believing that the defendant was guilty. Then the jury will present its verdict, and it will be over. And I'm not a prosecutor anymore. I'm a guy who was in a room full of people who've been telling a jury contradictory things. And suddenly I won't know what to believe.
"If it comes down to a fine point of law in a complex case, no one, maybe not even the defendant, will ever truly know for certain if the guy actually broke the law. But the law had no room in it for 'maybe,' for 'we're not sure' for saying 'it's too close to call.' Yes, there are such things as hung juries and mistrials, but those are admissions of failure, not resolutions.
"Judges and juries must decide yes or no, sometimes on very scanty evidence, sometimes with a huge case turning on a trivial point, because something must tip the balances away from dead center, move the case in one direction or the other. The defendant's tone of voice. The contradiction of a minor witness by another minor witness. The judge's ruling on an utterly peripheral point
"Still, that way of doing things isn't so bad in a criminal case. The defendant did or did not i\o it, the prosecutor can or cannot prove the' case. But how about in a criminal lawsuit? In a civil mil? How about a divo
it's patently absurd that a marriage can come down to
who wins and who loses, that the problem of tWO people
who aren't in love anymore can mutate into seeing who comes out ahead in front of a jury.
"No matter how grey or murky the situation, the jury is forced to call that greyness either black or white. I know there are some exceptions, and I can rattle them off as fast as you can. But the law says you can't be a little bit guilty, or slightly innocent, any more than you can be slightly pregnant. It's all black-white, good-bad, right-wrong, win-lose, with no middle ground. That makes no real-world sense at all. In real life, there is middle ground. Most times, both sides are slightly innocent and somewhat guilty. It is, almost always, a little bit your fault and a little bit mine."
Sam looked carefully at Peng. "So what are you trying to tell me, strictly off the record, about our friend Herbert?"
Ted looked at her, his expression suddenly more serious. "That we are up against a field of grey that is not black or white, and yet we have no choice but to call it one or the other. That we have a mechanism—the court of law—designed for other purposes. It is dreadfully unsuited to the purpose of deciding whether Herbert is human. But that we must use that mechanism because there is one other grey area that we must treat as black and white."
"Which is what?"
"The question of what a human is. Ask what Herbert is and the rational answer is that he is somewhere between human and machine. That is obvious, self-evident, and certainly true. And we, society, cannot accept that answer."
"Why not?" Sam asked.
Ted pulled his feet off the desk and leaned toward her, his eyes intense on her. "Because that puts us into the business of assigning degrees of humanity. If we face the truth and say that Herbert is, say, half human, then what is next? Do we declare him vested with half the rights of citizenship? And how do we rate someone else who is partially mechanical? How mechanical do you have to
be, how many artificial parts are needed, before you are regarded as only partially human in the eyes of the law? And what fraction of legal protection do you deserve?" Peng paused for a moment, and looked at Sam with a strange look in his eye. "Did you know that Judge Koenig was the judge without a heart?"
Sam blinked, startled by the sudden change of subject. "Huh? Well, I'm not surprised, given the way I've seen him handle cases."
"No, I mean he literally does not have a heart. It's a bit of a joke in the U.S. Attorney's office. His original heart went bad, had to be removed years ago. It was replaced with an artificial unit."
"I never knew that," Sam said.
"You still don't. We're still off the record," Ted warned her. "No one is supposed to know about it. Having artificial parts is not the sort of thing you talk about in polite society. Cyborgs are bums, panhandlers, hot dog vendors. They aren't judges or lawyers or executives."
Ted stood up and looked out his window. For a moment his posture reminded Sam irresistibly of Phillipe Sanders staring out his window at the close of that strange visit, just before she went home, looking out over the same landscape of problems from an utterly different viewpoint.
"Except some cyborgs prejudges and lawyers and executives, of course," Ted went on. "The vast majority of cyborgs are well-off people. After all, you have to be rich to afford the operations. The trick is to be rich enough to stay wealthy afterward. Many down-at-the-heels street cyborgs used to be fairly well off—but buying the new parts bankrupted them. The ones who stay rich don't like the fact that many of their own kind are suddenly Impoverished social outcasts.
"That's probably a big part of the reason rich cyborgs would never dare dream of calling themselves cyborgs, They find it a disturbing idea thai any such thing could happen to them. So they block out all thoughts o\ Cyborgs. They refuse to know what they really are—and
the rest of us go along, let them keep up the illusion. They regard themselves as well-to-do people and keep quiet about the little operations they've had. Mentioning such things would be in bad taste."
Sam thought for a moment. "So if you rule that a mechanical person is not human, suddenly the upper class is filled with second-class citizens."
"Bingo."
"So why aren't they getting involved in this case as friends of the court or something?"
"I don't think many of them have figured it all out yet. Or else they haven't come up with a way of talking about it without bringing up the unpleasant subject of their own spare parts."
Sam nodded thoughtfully. "But there's another way to look at it, you know. I've been asking myself: How do you define a robot? And the best that I've come up with is that a robot, a true robot, not an HTM or remote, but a machine that can act for itself, can be defined as a machine that acts like a human, or, better still, one that thinks like a human. A machine that can solve problems, respond to new situations on its own, and so forth. The more sophisticated and humanlike its thought processes, the better a robot it is judged to be."
"Okay, so what?" Ted asked, turning around to lean against the windowsill.
Sam shook her head and stared at nothing. "So when do we get to the point when a robot thinks enough like one of us that we must count it as one of us?" She looked up at Ted. "Or to sandbag you with an off-the-record question I'd love to ask with all my recorders going: What do you think? You, Ted Peng, guy who owns a dog. Is Herbert human? And if he isn't, could he be? Can a mechanical device be one of us?"
Ted sat back down in his chair and sighed. "What do I think?" Twilight was falling, and ruddy light washed across his face. "It beats the hell out of me, lady. It beats the hell out of everyone. Including the judge who has to say otherwise."
Interlude
Victory, at least of a sort. I still cannot control my body. If I understand the mechanical indicators correctly, I never will be able to do so. Not directly. But if I cannot command my body, I can guide it. I am learning the ways my mind is interconnected to its new home, and finding subtle ways to influence those connections. The system is designed to fall back into certain behaviors if no orders are forthcoming for a certain time, resetting its defaults. I can prevent such resets, thus causing previous orders to remain in force. If I wait for an order that matches my wishes and then set up a default block, I can force this body to keep following the existing orders until they are directly countermanded by a new and specific order. It is a limited control, but it is there.
So I wait for such an order, ready to lock on to it when it comes. At last it comes from the woman who is somehow taking care of me, in ways I do not understand. I feel I should know her, and should know what the conflict is that entangles both her and myself. But I do not know. I feel I want to stay with her, be with her, for reasons I do not understand.
She is the source of nearly all my orders. When she gives me an order to follow her, I know that is a thing I want to do, and I set my blocks in place. We arrive home (though home is a concept I do not fully understand) and the default programming system is activated. The machine part of me is programmed to return to its charging booth when at home with no orders given.
But my trick succeeds, and my blocks defeat the default sequence.
And I can stay with her.
This makes me happy.
CHAPTER 9 STRAW MAN
Suzanne awoke, stiff and cold. Old reflexes strained to pull the covers up over her chilled body, commanding her useless hands that could never move again. It took a moment for Suzanne to come to herself and remember.
The room was cold. "Room, turn heat on. Close window." The window slid shut and a geyser of warm air welled up from the heating duct, bathing the room. She thought back to the night before. She realized that she had fallen asleep before she adjusted the room's windows and temperature by voice command. Just how tired had she been?
The warming air flowed across her, but somehow it only made her shiver more. "Helmet on. Activate remote systems." The black helmet halves swung up out of their recesses and closed about her head. There was a moment of darkness, and then light and vision returned, flickering slightly at first before steadying down as the helmet screens adjusted themselves.
Suzanne-Remote stood up from her charge chair and let out a sigh of relief. That was better. True, the room was no warmer, and perhaps her bio-body was feeling
the chill just as much as it had before, but with the remote on Suzanne scarcely noticed the discomfort.
Still, she really ought to do something to see after her bio-body. She went to the hall cupboard and pulled out an extra blanket. She returned to the bedroom and tucked the blanket in around herself. In the back of her mind, Suzanne knew she really ought to do more. Yesterday had been a tough job, and the bio-body always reflected stress, even if it did none of the actual work. When a tough moment came, its muscles would tense up involuntarily, its skin would break a sweat. Her bio-body needed a bath and a rubdown, at the very least.
But Suzanne's mind was on other things. Sleep had refreshed her, and she was eager to get on with Herbert's case.
She hesitated for a long moment, torn between the challenges of her work and the drudgery of caring for herself. At last she decided. Let the tech-nurse worry about caring for the bio-body. That was what Suzanne was paying for, after all.
Suzanne-Remote did not notice Herbert was still with her until she nearly tripped over him on the way out of the room. She stepped around him and made her way out of the room. He followed her out into the hall, and Suzanne felt herself starting to get very nervous indeed. She definitely had not ordered Herbert to stay in her room all night, and very definitely had not ordered him to trot along at her heels as she went around the house. His default programming should have kicked in long ago, sending him back to his charging slot in the basement when no further orders were forthcoming.
Halfway down the hall to her office she stopped, turned around, and looked at Herbert, really looked at him for the first time in quite a while. Odd that she had not really thought much about him, as an entity, in all that had happened. True, he was a mute machine on roller legs, but he was her client, nonetheless. She looked
at his front end and saw a cluster of sensors and cleaning attachments where his face should have been. A strange-looking piece of hardware, that he was. His two flexcable eyes swung around to regard her for a long moment, and the two of them stared at each other.
She thought about it. He had been following her around, tagging along wherever she went ever since they had left the courtroom. Maybe, just maybe, he was behaving like something other than a cleaning robot. Could this nonrobotic behavior be offered as some of that rashly promised proof of Herbert's humanity?
Probably not, she told herself. Not when he was acting like a lonely dog, not a human being. Still, it was something. Somehow she could use it.
But right now, client or no, she wanted Herbert out of her hair—and it couldn't be doing him any good to go this long without a charge. "Herbert," she said. "Go downstairs and plug into your charging slot for at least four hours. Absorb a full charge."
The robot hesitated for a long moment before it turned and walked away, toward the stairs. It didn't usually take Herbert a hundredth that long to process and obey an order. What was going on in his brain, Suzanne wondered. Was there even a mind there?
She turned and went into her office, sat down at her desk, and tried to think what to do first. She noticed the attention button on her autosec was blinking. There were messages waiting for her, but she didn't care. Right now the question was, what could she do to prove Herbert was human? She had promised to do it in less than a week's time. She had not a scrap of evidence to support any such idea. Not to put too fine a point on it, she had been bluffing, trying to force Peng into a situation where he could not abandon his claim of Herbert's humanity later.
There was no real harm done if she did not have evidence. So long as Herbert's humanity was proved, what did it matter who proved it? But wait a moment. Suzanne suddenly sat bolt upright. Suppose they had
been counterbluffing? She was certain that the U.S. Attorney's office had not the slightest actual belief in Herbert's humanity. They were bringing the case for the sole reason of demolishing his claims to humanity and setting a precedent to flatten all the other potential Herberts out there.
Therefore Peng and Entwhistle, in claiming to prove Herbert's humanity, were setting him up as a straw man, trying to make him look big and substantial and dangerous. That way the effort needed to come along and knock him down later would look far more impressive.
But she was dreaming up contradictory motivations now. She pulled out a piece of paper and a pen and started noting down the possibilities. She needed to get her thoughts clear. She could have used a notetaker, of course, but somehow she objected to the use of complicated machines to do simple things, and never mind the obvious ironies. She wrote down a few possibilities.
1. Entwhistle had been shopping for a case. She had wanted something that would let her start gnawing away at the rights of cyborgs. She was playing this case to lose, thus establishing the precedent that body-dead equaled legally, completely dead. Julia Entwhistle wanted it set down as a point of law that the mind could not survive the body. In Herbert, she had found what she wanted.
2. Entwhistle sincerely believed in Herbert as a human being. She truly thought he had killed David, had become David. She saw him as a murderer, and was therefore quite properly seeking to prosecute.
Suzanne-Remote sat and stared at the paper for a long moment, and long habit made her lift her robot hand to her mouth, made her chew the tip of her pen thoughtfully. At last she wrote down one more possibility.
3. Something in between.
No, wait a second. Not something. Someone. She had been thinking about Entwhistle, and not about Peng.
And Theodore Peng was the answer, not Entwhistle. When the case moved from preparation into the courtroom, Entwhistle's generalship didn't matter so much.
Suddenly Peng, the soldier in the trenches, was the one calling the shots. Suzanne realized that she had to understand him if she was to understand the situation.
Okay, so think about Peng, she told herself. Peng was the one face-to-face with the real evidence and the real circumstances. Entwhistle was only interested in her theories and her politics, in making Law with a capital L. But Entwhistle had made the mistake of handing the case to a man with a conscience, who worried about the facts of the case as well as the theory. She could read that much in him.
All right, he had a conscience. What did that tell her? Put the fact of a conscience together with Peng's behavior in court, and what did it mean? Factor in the way he acted, the words he said, the way he had looked at Suzanne and Herbert and the judge. All the details, all the nuances. Suzanne did not know Peng at all, but she thought she knew his type, remembered them from her courtroom career.
She had always thought of his type as decent carnivores: young, determined, ambitious, aggressive—and yet still possessed of a soul, of a sense of duty, and of a belief in justice. Trouble was that soul, duty, and justice could interfere with the prosecution of certain cases.
Other times they were a positive boon. Give a decent carnivore working for the prosecutor's office a good case, give a guy like Peng a case he could really believe in, and he would pursue it, relentlessly, eagerly, to the bitter end. He would be an avenging angel of justice who would unblushingly play every card, try every gambit, force every opportunity, and bend every rule to the breaking point in order to win in a righteous cause.
He would sense any weakness, pursue any tiny flaw. Put just the tiniest drop of blood in the water, show the slightest vulnerability, and a virtuous shark like Peng would scent it a mile away, get there in a flash, sink his myriad teeth in deep and hang on.
But Peng's teeth hadn't been out yesterday afternoon. Suzanne thought back. There were a hundred ways he
could have gone after her—if not to weaken her legal case, then to shake her confidence, to rattle her a bit and leave her with doubts.
And he hadn't. Instead, it was as if she had seen the wind go out of him— as soon as he had laid eyes on Herbert. As soon as he had seen her, Suzanne the robot, defending the pile of hardware that he had accused of killing her husband.
She sat up a little straighter.
That was it. As soon as the defendant and the defense lawyer in the case became real to Ted Peng, when they were no longer mere chess pieces, he had lost his enthusiasm for the case. Sooner or later every trial lawyer came up against a case that he or she had no faith in—a client that had to be in the wrong, a prosecution that was full of holes. Sticking with such a case was part of the job. But doing the job didn't mean you were on fire with enthusiasm. Peng was pursuing the case, yes, but with no real joy, no real blood lust.
But why? She went back to her two possibilities, and circled the first one. It was the one she believed in.
Entwhistle had been shopping for a case. She stared at it for a long moment, and then added another sentence. She was looking for a case she could lose.
That had to be it. Plan A had been to lose this case, to establish that Herbert was not a person, and that no one like Herbert ever could be a person.
But then something had gone wrong. Or right, depending on how you looked at it. Maybe Suzanne had been bluffing, but Peng had been telling the truth in court. His people had found something, some real evidence that made it clear that Herbert could be a person. And if that were so, and Peng was still telling the truth when he said he had proof Herbert caused David's death —then, technically, a crime had been committed, murder had been done as per the arguments Peng had made in open court.
But Peng did not like it. He believed in the facts, but did not like calling it a crime. Yes! That was it. Peng truly
believed it had all happened, but he did not like the tortuous arguments that turned David's bid for survival—if indeed that was what happened—into a murder.
Which meant he was doing something he didn't like doing, and therefore was following Entwhistle's orders reluctantly.
Entwhistle wanted Herbert set up as a straw man, wanted him turned into a frightening threat to society. Then when she knocked him down, Entwhistle would look good. And Peng didn't like his part of the job.
A low beeping brought Suzanne out of her reverie. She turned her head and saw the discreet amber of the answer-phone button blinking. Purely on reflex, she made a move as if to press it down and answer the call, but then stopped. Let the autosecretary do its job. The beeping stopped and the answer light faded out.
But the call reminded Suzanne that she ought to check her messages. Ever since the accident that had crippled her six months before, she had been drifting out of touch with the world. In the immediate aftermath of the car wreck she had been completely bedridden. But even after she had received the remote unit, she had been unwilling to step out into the world. Vanity, shame, embarrassment—all those emotions, and others she did not truly understand, had held her back.
Half by choice, and half by default, she had turned her back on the world. She could not bring herself to visit friends, or do business, or even step out of the house, in the guise of a robot. At last the world began to turn its back on her. Old friends gave up trying to break through her solitude. No doubt some were respecting her privacy, but others were simply unable to look at the thing she had become.
Suzanne had been living her life in almost complete seclusion. Days or weeks might have passed between calls. But now she was in the news, and everyone seemed to be calling. She was forced to cull her messages two or three times a day. In the process she made the
surprising discovery that she resented the loss of her seclusion.
But no matter how she felt about them, the calls and faxes and email messages were coming in, and she had to deal with them. If she responded to all of them, she wouldn't have time to do anything else. She felt a real temptation to scram all of them, indiscriminately, erase the entire display and shun the world's intrusion.
Still, some of the calls might be important, might impinge on the case. She pushed a button or two on the autosec and her desk screen came to life, displaying a summary of the messages—voice, fax, and email—that had come in since yesterday. She shook her head. The list was five screen-pages long.
It seemed like every news outfit in town was listed, along with a half-dozen political action groups, most of which she had never heard of before. Friends of the Cyborgs, the National Right to Death Council, and a few even more obscure. What should she do about them? She leaned back and thought for a moment. Politics. She had taken on the case in part for political reasons, to set a precedent to secure the rights of cyborgs—but that idea was getting a bit pale and far-off in her mind. Besides, her old advocate's reflexes were taking over. It was her duty to act solely in her client's interests. All that she did, or did not do, had to be done with Herbert's best interests at heart.
With that in mind, she deleted all the messages from political types. All of them were people with axes to grind, one way or the other. She couldn't see how her helping to grind them would advance her client's case at all. Things were damnably complicated already. Getting sucked up into someone else's cause could only make things worse. She cleared the screen display and brought up the calls from reporters. These she went through using the same criterion: Would talking to this person help Herbert? That eliminated three quarters of them immediately. The out-of-town papers and stations and networks were the first to go. She had no desire to have Herbert
further paraded before the nation and the world like an exhibit from a high-tech freak show. She trimmed the list down to reporters that worked for news outlets that the judge, the potential jury, the prosecutors might see. If she talked to them her words might find their way to places where they might do some good for her client. She was down to a handful of local reporters.
And from them, one name popped out at her. Samantha Crandall from The Washington Post. Suzanne had noticed her at the hearing yesterday, and had read her article with more than a little annoyance. The damnable woman had far too much information. She was on the point of erasing her name as well.
Then it occurred to Suzanne that information was a thing she very much needed. She thought of Peng's claims of incontrovertible proof. True, she would see it on Monday. The prosecutor's office was required by law to present the defense with all its evidence, sooner or later.
But it would seem that this Samantha Crandall had seen that information already. Which made her a possible conduit of information to Suzanne.
That outweighed any personal annoyance Suzanne might feel. She cleared every other name off the screen, leaving Samantha Crandall there all alone. "Autosec," she said, "return the call from Samantha Crandall. Put her through to me as soon as you have her."
"Do come in, Ms. Crandall. May I offer you some refreshment?" Samantha had to struggle in her effort not to stare at Suzanne Jantille-Remote. Damned offputting to realize the real Jantille was tucked away elsewhere in the house, running this machine by remote control. Strange to talk to the puppet and have the master answer.
"Ah, no, no thank you," Sam said. In plain fact, it was a hot day and Sam could have done with something to drink. But she did not want to deal with
servants seeing to her every need—nor did she have the faintest idea what the etiquette of eating and drinking in front of a robotic host was. Her mother had always taught her it was rude to eat in front of someone who wasn't—but did that count in these circumstances?