CHAPTER 4
Tyler spotted Naomi Sumner in the Graduate Student Center the next morning. She wore a blue University of Pennsylvania sweatshirt and kept her head down, waiting for a turn at the coffee machine. The Center’s common room offered free coffee all day on a bring-your-own-cup basis, making it a popular morning location. A busy stream of students filed in and out of the doors, fueling up before classes or meeting up with friends.
Tyler jumped up from the overstuffed chair he was lounging in and went to meet her. “Hey,” he said. “You know that coffee’s for grad students only.”
He smiled to show he was joking, but she blushed and turned away. “I usually come in with my sister,” she murmured.
“I know. I was just . . . never mind. Bad start. I’m Tyler Daniels. You were at my autocar test yesterday.”
She just looked at him without saying anything.
“Which, of course, you know,” Tyler said, feeling like he was rambling. “You were there when the venture capitalist stopped by too, weren’t you?”
Naomi nodded, one hesitant bob of her head that stopped as soon as it started.
“Well, she’s investing in our project, at least for a little while. She wants us to give a demo in a month. It’s not a lot of time, and there’s a lot of programming to do. I checked you out online . . .” She gave him a quick look, and he hastened to explain. “Your open source contributions. Lots of machine learning applications: voice recognition, handwriting identification. Good stuff. Everybody I talk to says you’re the best in your class. I was wondering . . .” The girl ahead of Naomi stepped aside, and Naomi sidled in to fill her travel mug with coffee. Tyler stood uncomfortably next to her while the coffee poured. When she straightened, he said, “I was wondering if you wanted to join us. Help us get the software ready for prime time. It’s not a paying gig, or anything, but if our company takes off like we hope it will, you’d be right in on the ground floor.”
She met his gaze briefly, and then her eyes slid off to the side. Her shoulders lifted slightly, as if she were a turtle trying to pull her head into her shell. “Okay,” she said.
“Okay? You’ll do it?”
Another awkward head bob. “I have to go to the library,” she said.
“All right. I’ll walk with you.”
A brief look of panic flitted across her face, and then vanished. “Okay,” she said again.
Naomi headed out the door, and Tyler followed her. “You saw the trouble we had when a tire went flat,” he said. “We need to raise the software to the next level. Its training has been superficial, just enough to implement some basic scenarios. We need to widen its experience, cover a lot more cases.”
They crossed Thirty-Sixth Street and headed off across the green. The trees overhanging the crisscrossing brick paths were budding, and the fresh-washed scent of spring was in the air. Tyler loved this time of year, even though it often brought rain. With the bitter cold gone, students tossed Frisbees across the lawn, and others sat on benches to do their homework.
“What do you think of the Gomez bill?” Naomi asked.
Tyler turned, surprised to hear her speak. Gomez was a bill before Congress that would require autocar manufacturers to include loopholes for law enforcement, such as an override signal a policeman could send to force a car to stop. It addressed the public fear of an autocar gone berserk, with no way for a human to gain control. “I think it’s a terrible idea,” he said. “How long do you think it will be before people other than law enforcement get a hold of the key or find some way to hack the signal? Do you want someone to be able to force your car to a stop at the side of the road at night? If that bill passes, it’ll be a disaster.”
“It’s unlikely to pass,” Naomi said. “It’s getting some traction in the House, but the Senate is 60–40 against.”
“People are afraid of all the wrong things,” Tyler said. “They imagine a robot apocalypse run by malevolent AIs bent on murder, and they want protection against that. But they don’t fear the much more likely dangers that AIs protect them from every day.”
“A recent poll showed that thirty-seven percent of people think artificial intelligence will be a threat to humanity,” Naomi said.
“It’s not just about AIs, either. If a plane crashes, it makes big news and sends people into a furor calling for measures to make sure it never happens again, no matter the cost. But more people in this country are killed in car accidents every single day than died in that crash. The plane crash is rarer—and somehow scarier—and so it gets more attention than the thing that’s actually likely to kill them.”
“You’re right about that. People are eighty-six times more likely to die in a car crash than in a plane crash,” Naomi said.
Tyler gave her a suspicious look. Her last few responses had been oddly stilted, and a bit heavy on the random statistics. “Are you using a conversation bot?” he asked. He regretted the question as soon as it left his lips. If he was right, he would have embarrassed her, and worse, if he was wrong, he would have insulted her.
She blushed. “I’m sorry.” She looked as though she wanted to dissolve and soak away into the grass. “I’m not very good at conversation, and Jane—I mean the bot—helps. Otherwise I just don’t say anything.”
“No, I don’t mind. It’s really good,” he said, backpedaling and cursing himself. “I know they exist, but I never heard of one being that good before. Did you write it yourself?”
She nodded, but looked away.
“Honestly, I’d like to check it out. Have you open-sourced it? Is it out on GitHub?”
She shook her head and mumbled something too soft for him to hear. They reached the library doors, and Tyler pulled one open, letting her walk in ahead of him. He considered just saying goodbye right there, hoping he hadn’t screwed things up so bad with her that she wouldn’t program for them, when something she’d said fell into place in his brain, and he followed her inside.
“Jane,” he said. “You named your conversation bot Jane? Like in Speaker for the Dead?”
She whirled to face him, this time with a genuine smile on her face. “You know it?”
“Of course I know it. It’s Card’s best work.” It had been written well before he was born, but it was an important part of the SF canon.
“No,” she said. “Nothing beats Ender’s Game.”
They argued about that briefly, just standing there in the atrium, until Tyler realized he was blocking the entrance. “Sorry,” he said. “Where were you headed?”
“Um.” She twisted her hair around one finger. “I have some books I need to check out for one of my cognitive science classes. I spend a lot of time here, actually.”
Tyler grinned. “You sound like someone who spends a lot of time in libraries, which are the best sorts of people.”
Naomi clapped her hands. “Catherynne M. Valente!” she said. “From The Girl Who Circumnavigated Fairyland in a Ship of Her Own Making! I love that quote. I had it taped inside my locker in middle school.”
“Well, if you’re not in a hurry”—Tyler spotted a cluster of unoccupied reading chairs—“I could show you around our code. If you’re really going to help us out, that is.”
“Okay.”
Tyler led the way. They sat on two comfortably stuffed chairs arranged around a half-moon coffee table, decorated with a metal vase of faux dogwood branches and some kind of generic white blooms. The library was new, designed in a sparse, modern style that preferred brushed steel and abstract art over wood paneling and portraiture. High on elegance, but short on mystery.
“Ever been to the J.P. Morgan Library in New York?” Tyler asked. “When I’m a billionaire, that’s the kind of library I’m going to build. It’s like the one in Beauty and the Beast—a huge room, three stories high, with balconies, murals, a domed ceiling. Only mine will have secret walls that open up and teleportation circles to get around. And carnivorous shadows.” He eyed her for a reaction—he’d been referring to a library in a Doctor Who episode, but if she didn’t recognize it, then that last part would make him sound like an idiot.
He needn’t have worried. “So big it doesn’t need a name, just a great big ‘The,’” she said, smiling and brushing a lock of hair back behind one ear.
They synced glasses, and he started walking her through the code, showing her how it was organized, the training data they were using, and their build process. She picked it up quickly, often understanding the intent behind a function before he explained it. Instead of worrying if she’d be good enough to help, Tyler found himself worrying about her opinion of his code. Did she find it amateurish? Was she laughing at him behind that shy reserve? He got the feeling that there was a lot more going on in her mind than showed on her face or came out of her mouth.
“The real problem is the edge cases,” she said. “These days, it’s easy enough to train an AI to do simple recognition tasks—identifying faces, voices, cyber threats, suspicious behavior. But it’s only ninety-nine percent accurate. When people’s lives are on the line, that’s not good enough. You need an algorithm that can use good judgment with incomplete or conflicting data.”
It was the most he’d ever heard her say at one time, but Tyler just went with it. “What does good judgment even mean in this situation? We call what we use ‘AI,’ but it’s not really intelligent. It doesn’t think, not really. We train a sophisticated mathematical configuration to filter out bad choices and select good ones, but that’s not the same as having creativity, or making leaps of intuition, or showing common sense. And there’s no way to test every possible situation.”
“We need an AI whose highest motivation is to keep human beings safe, with the judgment to evaluate its own decisions on that merit,” Naomi said.
Tyler grinned. “Three Laws Safe.”
She took it seriously. “Exactly. What’s the modern equivalent of Asimov’s Three Laws? How can we make autocars inherently safe?”
“The problem is, cars aren’t safe,” Tyler said. “You’re flying along in a two-ton steel box with lots of other two-ton steel boxes. Asimov’s robots just wouldn’t drive at all. They might even prevent a human from driving, if they could. ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’”
“Except in ‘Little Lost Robot,’” Naomi said. “In that story, they intentionally modified the First Law, so robots could work with humans doing a somewhat dangerous job without preventing them from doing it.”
“I remember that story. It was radiation, right? The humans would get a small dose of radiation, and the robots had to be able to allow that to happen.”
Naomi nodded. Tyler noticed that she still focused her eyes inside her glasses, not at him. He wondered if she was still reviewing the code while she talked, or if it just made her more comfortable to pretend he was an online contact instead of a person in real life. “So that’s what we need,” she said. “A root-level, built-in inability to harm humans directly.”
“Directly? So, in that case in Seattle, the woman’s car wouldn’t swerve, because hitting the motorcycle would be directly causing harm to humans? Whereas plowing straight into the tree would be inaction—it might kill more people, but not actively on the part of the AI?”
“It sounds kind of stupid when you put it like that.”
“Well, not necessarily. This is the kind of question moral philosophers argue about into the wee hours of the night. Is there a difference between doing and allowing? Between allowing harm to happen and doing the harm myself?” Tyler realized he was grinning. This was the kind of conversation he wanted to have with Brandon, but Brandon always resisted it. He cared about practicalities, not morals.
She thought about it. “I don’t think there is a difference. If I truly have the power to stop it, and I don’t, that’s just as bad as doing it. Neglecting a child is just as wrong as actively hurting her—in both cases, you’re causing harm, even though in the first case, you’re technically doing nothing.”
Tyler was enjoying this. She had relaxed in the chair opposite him, and although she still wasn’t meeting his eyes, she at least wasn’t browsing her glasses anymore. “Do you know the trolley problem?” he asked.
She shook her head.
“Really? It’s something they talked about a lot, back when the first autocars came out. It’s an ethical thought experiment. Here . . .” Tyler pulled a straight dogwood branch out of the vase on the coffee table. He laid it flat on the table. “This is a train track.” He gestured at her travel coffee mug. “May I?” She nodded, and he placed the mug at one end of the branch. “This is a runaway train, brakes not working, and you’re the driver. Down the track a ways, five people are working and don’t see you coming. You’re about to plow through and kill them all. But!” He pulled another branch out of the vase and laid it across the first, creating an alternate, forking path for the train to take. “On this track, there’s only one worker. You have a choice. You can switch tracks, intentionally and actively killing the one person, or you can do nothing, and let the five die.”
“That’s easy,” Naomi said. “Of course you choose the one. It’s not your fault either way—you don’t intend for anyone to die. You’re just minimizing the loss of life.”
“Fair enough,” Tyler said. “Most people say the same. Not all, but most. Try this variation, though. Instead of driving the train, you’re on a bridge above the tracks, watching the drama unfold.” He removed the second dogwood branch. “There’s no fork in the track, just five people about to be killed. You realize the train can’t stop, but you’re a railway engineer, and you know that if you could drop a weight of at least three hundred pounds on the track, you could stop the train before it reaches the workers. You don’t have a weight, but there happens to be a fat man on the bridge in front of you, right over the tracks. If you push him over the edge, the train will hit him instead and the workers will be saved. Should you do it?”
Naomi didn’t hesitate. “Of course. Five for one, the same as before.”
Tyler opened his mouth and closed it again. He had been expecting her to say no, of course not, you couldn’t push someone off a bridge—that was murder, even for a good cause. Then he could point out how this indicated there must be a difference between actively causing harm and just allowing harm to happen, because of the difference between these two cases. What did her answer say about her—that she was callous? Or just more consistent than most?
“But what if you were the fat man?” he blurted. “Would you still make the same choice?”
This time she had to consider. “That’s a very different question,” she said finally. “But the answer is the same. I should throw myself off to save the others, assuming I could know for sure that the others would be saved. It’s the right thing to do. But in real life, would I? What if my sister was the fat man . . . would I then? Probably not. But that’s because my sister is worth more to me than any five strangers.”
“It’s not a philosophical question, then; it’s a personal one,” Tyler said. “Which is exactly the problem we have with autocars. What people want to happen in general, to strangers, is different from what they want to happen when their own loved ones are involved. We somehow need people to agree on what choices are fair and correct before personal considerations get in the way.”
“It’ll never happen,” Naomi said, and for the first time she met his gaze directly. “Everything in life is personal.”
Her eyes were a deep brown, and while her face often seemed to hide her feelings, the eyes expressed them. She had none of Abby’s vivacious charm, but Tyler thought she might just be the prettier of the two.
“Hey, are you free tonight?” Tyler said. “Brandon and I are going car shopping. We need to add a few more vehicles to the fleet, now that we can afford it. We’re spending somebody else’s money. It’ll be fun.”
Her gaze dropped to the floor again. “I don’t think so. I’m busy.”
“Okay,” he said. “Maybe we could catch dinner together sometime. You free tomorrow?”
She stood hastily and picked up her travel mug. “I should go.”
Tyler studied her face, but she showed nothing. He had thought they were hitting it off together, but maybe not. “Okay,” he said. “I’ll send you the link to our code repository, so you can get started.”
“Great,” she said, so softly he could barely hear her. She jostled the coffee table on her way out, so that one of the dogwood branches slid onto the floor. She pushed through the library doors and out into the sun. Tyler watched her go, a little stunned. She had never even checked out the books she said she needed for her class.