CHAPTER 5
Naomi felt uncomfortable, so she did what she always did in those circumstances. She shut herself away with her software. She couldn’t go directly to her secret library nook, because Tyler might see her. Instead, she stood behind a statue on the green, waited until she saw him leave, and then slipped back into the library and up to the second floor.
She didn’t have anything against Tyler Daniels. She had actually enjoyed their conversation, at least a little, but the effort of talking with a stranger exhausted her. The idea of going out again, in a situation she couldn’t escape by just walking away, was more than she could handle.
Besides, she needed to check on her Mikes again. That morning, before leaving the library, she had reviewed the history of the competitive world. She found that most frequently, the Mikes who died did so by attrition, one at a time. The Mikes who contributed least to the survival of the group were denied food when it ran short. The world wasn’t big enough for any of them to strike out on their own and hope to survive; all available resources were co-opted by the group. That made it unlikely for rival groups to grow and war against each other. Each Mike, however, was rewarded for individual survival, not for the survival of the group. The scheme was evolutionary, but unlike in the biological world, the ability to produce offspring came at the end of life, not in the middle, so every Mike had incentive to live as long as possible. Every millisecond counted. As long as an individual Mike’s survival was linked to the survival of the group, he would work toward that end. However, once a Mike was marked for death, or predicted that future for himself, his actions would change. Some stole food and fled, and were then hunted down by the rest. Others attempted to destroy the entire village to buy themselves some small amount of extra time. The most individually successful, though, were those who found ways to preserve a large number of Mikes in their world.
In the most recent versions, the strategies for claiming sunlight had become more sophisticated, using what amounted to a series of giant rectangular mirrors to reflect the sunlight away from the rocky terrain and onto fertile ground, making the grazing grass or crops grow at prodigious rates. In this world, the Mikes had dug, too, sending sunlight down onto underground yam fields to produce more food. They always lost some percentage of their population toward the beginning, before they could establish their infrastructure, but the later generations were increasingly able to survive harsh winters or even the occasional devastating storm. It reminded Naomi of a Dyson sphere—the hypothetical sphere a planetary civilization might build entirely surrounding its sun, exploiting the entire energy output for its own purposes. She wondered if the Mikes might eventually find a way to accomplish the equivalent feat in their own world.
However, there was still no sign of emergent creative behavior. No art, no sports, no activities that didn’t directly support survival. The Mikes didn’t seem to communicate in any way beyond simple reactions to each other’s actions. To each of them, the other Mikes were nothing more than a part of their environment, to be manipulated however possible to achieve the desired outcome. One could argue that they weren’t so much coordinating as independently discovering strategies that jointly enabled them to survive.
It was enough. Enough for publication, enough to attract the attention of graduate schools, enough to land her a good job in the industry. But it wasn’t enough for her.
Naomi selected the best one thousand Mikes from the most successful versions of their worlds, and started building a new Realplanet simulation for them to inhabit. She made this new world a harsher place, scattered with hidden traps, like nests of giant wasps that would attack and injure, and pits with lava that would cause burns. Nothing that would kill by surprise, at least not directly—she wanted to see if the Mikes would communicate to warn each other about the traps. She felt a little bit evil, like a game master in The Hunger Games, setting traps to catch unwary innocents, but it seemed as though competition and danger were critical to the development of intelligence.
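A rough sketch of the kind of world setup described above, for concreteness: hidden, non-lethal hazards scattered across a square tile grid. The specific trap list, damage values, grid size, and function names are assumptions made for illustration; the chapter specifies only that the traps (wasp nests, lava pits) injure rather than kill outright.

```python
import random

# Hypothetical hazard table; the story names wasp nests and lava pits but
# gives no numbers, so these damage values are placeholders.
TRAP_TYPES = {
    "wasp_nest": {"damage": 20, "hidden": True},   # attacks and injures
    "lava_pit":  {"damage": 35, "hidden": True},   # causes burns, not instant death
}

def scatter_traps(width, height, trap_count, seed=None):
    """Place traps on randomly chosen, distinct tiles of a width x height grid."""
    rng = random.Random(seed)
    tiles = [(x, y) for x in range(width) for y in range(height)]
    chosen = rng.sample(tiles, trap_count)
    return {pos: rng.choice(list(TRAP_TYPES)) for pos in chosen}

# Example: a 64 x 64 world seeded with 40 hidden hazards.
traps = scatter_traps(64, 64, 40, seed=7)
```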
Naomi stood and stretched. She had skipped lunch, and she was hungry. It was time to find some dinner. The quickest option would be one of the on-campus cafés, which really weren’t too bad. The only problem there was that she might run into someone she knew, but she would just have to risk it. She emerged from her secret lair and went out into the real world.
When Tyler returned to the apartment he shared with Brandon, he found Abby Sumner there, reclining on their sagging, secondhand couch with Brandon, laughing. The apartment was one story of an old Philadelphia townhouse, with tall ceilings, narrow rooms, and warped wooden flooring that threatened splinters to unwary bare feet. Monty Python and the Holy Grail played on the wall, the scene where King Arthur’s company is attacked by the Legendary Black Beast of Aaaaarrrrrrggghhh. The two were sitting very close, and Tyler got the idea that the movie wasn’t what they were laughing about. He considered making some excuse and leaving the two of them alone, but it was his apartment, too.
“I thought we were going car shopping,” Tyler said.
“We are,” Brandon said. “I invited Abby to go with us.”
“I invited Naomi, but she turned me down.”
Abby laughed. “Poor Naomi.”
“Poor Naomi?”
“Yeah. She enjoys being with people,” Abby said. “She really does. She just doesn’t admit it to herself.”
“I thought I had said something that offended her,” Tyler said.
“I don’t know if she’s ever been offended in her life. But if she were, she probably wouldn’t tell you. She certainly wouldn’t walk out in a huff.”
“Maybe she was just busy tonight,” Brandon suggested.
“Yeah, busy hiding in the library and working on her software,” Abby said. “Just like every night. I’ll take care of this.” She touched the side of her glasses. “Call Naomi.” They waited while the glasses made the connection. “Hey, girl, it’s Abby. Come on out with us tonight. It’ll be fun.” A pause. “No, you don’t. No, you’re coming out with us. It’s final. Okay, see you soon.” She looked at Tyler. “Problem solved. She’s on her way.”
When Naomi arrived, smiling shyly, the four of them headed to the row of car dealerships on Grays Ferry Avenue. Brandon was the force behind the outing, peering skeptically under hoods and negotiating hard. The rest of them were just along for the ride, keeping it fun and pushing Brandon to test-drive the most expensive cars in the lot. Eventually, they decided on a pair of electric Honda Alexis. The Alexis weren’t any fancier than the Accords, but as Tyler and Brandon had discussed many times, it only made sense to go electric for a self-driving fleet. An electric car could be designed to return to base and recharge itself far more easily than a gasoline car could refuel itself. The only problem with electric cars was the infrastructure, and offering cars as a service solved that problem nicely.
After a boring round of paperwork, they signed over Aisha’s money, and the cars were theirs.
“Time to celebrate!” Brandon announced.
“What you need,” Abby said over their first round of drinks, “is a showgirl.”
They sat in a booth, Brandon and Abby on one side and Tyler and Naomi on the other, drinking bottles of Yuengling and munching on a plate of wings.
Brandon coughed. “A what?”
“You’re going to have this big demo for investors, right? You can’t just have it work right. You need to put on a show. Paint your cars all the same, something flashy, with a racing stripe and a company logo.”
“Hondas aren’t very flashy,” Tyler said.
“Hush. You need a flashy paint job, and you need a beautiful showgirl, somebody charismatic, to point at the cars and flash her winning smile and announce each bit as you perform it. You can’t just sit there with your tablet and say, ‘Now we’re going to execute scenario number five.’ You need some sex appeal. You’ve got to sell it.”
“And where would we find such a goddess?” Brandon asked.
Abby threw her arms above her head like a circus performer. “You’re looking at her, baby.”
Brandon frowned. “You want to be a car show bimbo?”
Abby dropped her arms and narrowed her eyes at him. “Be nice. I’m not going to wear a bikini or do a little dance. But if you want investors to rain down millions on you, you can’t just let the technology speak for itself. You’ve got to create a spectacle.”
“She’s right,” Tyler said. “We need a public face. I’m not great in front of an audience, and you’re uglier than an Ood.” He knew Brandon wouldn’t get the Doctor Who reference, but he glanced at Naomi, whose mouth twitched into a shy smile. “For the demo, we need to have a script, perfect timing, showmanship. But even after the demo, we’ll need to be raising public awareness, advertising, and establishing trust that our cars will keep people safe.”
“How many millions do you need?” Abby asked.
“What?” Brandon said.
“Money. What’s your goal? How much do you need to start your company?”
“I . . . uh . . .”
“Whatever we can get,” Tyler said. “We’ll start as small or as big as we have the means for.”
Abby shook her head, her expression scolding. Tyler remembered she was a business major, working on her MBA. “That won’t do at all,” she said. “You need a plan, and you need a make-or-break minimum threshold. More than one threshold, if you like, to designate different levels. But when an investor asks you how much you need to get started, you should have an answer. You can’t just say, ‘Gimme everything you’ve got.’”
“I figured they would have an amount they were willing to give.”
“It’s an investment with considerable risk,” Abby said. “Sorry, but it is. They want to give the right amount—not so little that it doesn’t help, but not so much that they’re throwing money away. You need to be ready with an answer, and know why it’s the right one. Also, are they going to want convertible debt or ownership equity in return for their investment? And are you willing to offer either? They’re going to expect a very large return on investment, if the company is profitable—have you thought about how high you’re willing to go?”
Brandon and Tyler looked at each other. “I think we might need a business major on the team,” Brandon said.
Abby smirked. “You think? Do you know what kind of insurance you’ll need? What taxes you expect to pay? Are you starting a ‘C’ corporation or an LLC?”
“Okay, you’re hired,” Brandon said. “Your salary starts at zero, but with great potential for advancement. Unless . . . how much do we have to pay to get the bikini and the dance?”
She punched him hard in the thigh, and he yelped. “You do it first, and I’ll think about it,” she said.
Later, the evening finally over, Naomi stretched out on her bed with her glasses on and checked the progress of her latest Realplanet world. She was pleased to discover that, despite the harsher environment, as many as seventy of her Mikes were surviving in each iteration. Not only that, but they were communicating through a rudimentary kind of language, mostly to warn each other about the locations of the traps she’d set for them. They didn’t speak by moving vocal cords to create sound waves; the simulation wasn’t sophisticated enough for that. Instead, they used the game’s “action” command in sequences, like a kind of Morse code.
To call it a language, however, was generous. The “action” command provided three different behaviors and their opposites: pick up/put down, build/break, and activate/deactivate. That meant it was a three-bit system, with a total of eight possible meanings. There was no way for them to increase the number of “words” without finding some different mechanism for communication.
She watched them for hours, deciphering the meanings of the signals from how the Mikes reacted. The eight words seemed to be the equivalent of: yes, no, straight, right, left, danger, food, and grass. They never had conversations, per se—they just passed information. For instance, “straight straight right straight right right danger” indicated the presence of a trap in a certain place. Since the world was broken into square tiles, the directions served as a sufficient indicator of location. Or, they might say “straight right food, straight straight grass” to indicate that one square should be used for planting food and another for grazing their sheep herds. It was no more sophisticated than a bee wiggling its backside to communicate the location of discovered nectar to the rest of the hive.
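For concreteness, here is a minimal sketch of the signalling scheme Naomi is decoding, assuming each behavior/opposite pair acts as a single bit (so a combined signal has eight possible forms) and that “right” and “left” are sideways steps on the tile grid rather than turns. The bit-to-word assignment, the grid conventions, and all names below are illustrative assumptions; the chapter specifies only the three action pairs, the eight meanings, and the square tiles.

```python
# The three action pairs; picking one side of each pair contributes one bit,
# so a combined signal has 2 * 2 * 2 = 8 possible forms.
ACTION_PAIRS = [("put down", "pick up"),
                ("break", "build"),
                ("deactivate", "activate")]

# Hypothetical assignment of the eight 3-bit values to the observed meanings.
WORDS = ["yes", "no", "straight", "right", "left", "danger", "food", "grass"]

def decode_word(bits):
    """Map a 3-bit signal value (0-7) to one of the eight meanings."""
    return WORDS[bits & 0b111]

def encode_word(word):
    """Map a meaning back to its 3-bit signal value."""
    return WORDS.index(word)

# Interpreting a direction sequence as steps across the square tile grid:
# "straight" is one tile forward, "right"/"left" one tile sideways.
STEPS = {"straight": (0, 1), "right": (1, 0), "left": (-1, 0)}

def locate(sequence, start=(0, 0)):
    """Walk a word sequence and report the flagged meaning and its tile."""
    x, y = start
    for word in sequence:
        if word in STEPS:
            dx, dy = STEPS[word]
            x, y = x + dx, y + dy
        else:
            return word, (x, y)
    return None, (x, y)

# "straight straight right straight right right danger" -> ("danger", (3, 3))
print(locate("straight straight right straight right right danger".split()))
```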
As Naomi studied the data from these new worlds, however, she started to notice patterns that made a chill creep up her spine. In previous worlds, there had been no leaders, just independent Mikes stumbling upon effective ways to survive. They didn’t collaborate as much as discover the same things at the same times. Each Mike worked for his own survival. Arguably, they didn’t even differentiate between fellow Mikes and other features of the world around them.
Now, however, things were different. Hierarchies had developed, with some Mikes at the bottom, doing most of the work and taking most of the risks. At the top, a single Mike ruled as a kind of king, doing no work, but taking a larger portion of the better food. The kings in each world had some of the best survival rates, however, since they had devised ways to manipulate and control the others, keeping most at subsistence levels while they themselves had plenty. They weren’t just kings. They were tyrants.
Studying the early logs from each world, she could see how it was happening. Mikes lucky enough to find a trap without it killing them learned to exploit that advantage—they would tell others where to find it or how to avoid it, for a price. The most successful of them leveraged that price into more knowledge and more control, until the whole survival pattern of their world revolved around them.
It was the beginning of a society. Not a good one, perhaps, but it felt very human. Just about all human groups had self-organized in similar ways from the very beginning. Many mammals and birds did the same, developing hierarchies, pecking orders, rituals of subservience or challenge. So far, she had seen no evidence that the tyrant Mikes were ever challenged, but perhaps that was due to the basic simplicity of the world. Her Mikes never grew old, never had a bad winter, never had children who grew up and became strong enough to test their elders. Once king, they could stay king forever, or so it seemed.
Not that any of this disproved Searle. Just because they behaved in roughly human ways didn’t make the Mikes intelligent in their own right. She could have written software to do this directly. In fact, she could have written software to imitate humans better than this. But that would have been straightforward if-then-else logic, and thus obviously a simulation, however realistic. What made this seem different was that it was emergent behavior. The deep learning algorithms were trained, not directly written, which made them more mysterious. She didn’t know why they made one choice over another, and so they felt more human to her.
But was it really different? Did her ignorance of their decision-making really mean the Mikes had human-level, self-aware intelligence? Or had she just moved to the outside of the Chinese room, assuming intelligence simply because she couldn’t see what was really happening inside? Humans were very good at anthropomorphizing things they didn’t understand. Or could it be that all intelligence worked this way—even human intelligence—and it was only our inability to follow the complexity of the firings of neurons and synapses in our brains that made it seem like something magical?
Regardless, she knew she couldn’t stop. She no longer worried about publishing; she had more than enough to do that. In fact, even now, if she revealed what she had to the larger world, she would attract a lot of attention. This wasn’t just a senior project; this was a whole field of study. She could see dozens of researchers across various disciplines wanting to study her Mikes in different settings and draw conclusions about the nature of humanity and intelligence. She had no doubt the media would run with the idea as well. She could be famous.
The thought terrified her. She didn’t like people noticing her across the room, never mind the kind of attention even a modest amount of fame would bring. People would look at her. Eventually, she would have to let the secret out, at least enough to get a passing grade and publish a paper. But not yet. She wasn’t ready for that. Besides, the Mikes were hers. And she had a feeling she had only begun to see what they could do.
She decided to make one more change. Instead of selecting the longest-living Mikes after a simulation had ended to populate the next iteration, she baked the iterations into the world itself. Whenever a Mike survived for twenty years of game time, it would spawn a copy of itself. Not an exact copy—although ninety-eight percent of the weights and biases that made up the layers of its neural net remained the same, the remaining two percent were randomly generated. If it survived another twenty years, it would spawn another copy. This more directly mirrored the natural process of evolution, where the most successful variants produced the most offspring, and those in command ultimately had to make way for the younger.
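A minimal sketch of the spawning rule just described, assuming the parent’s weights and biases have been flattened into a single array. The use of NumPy, the function names, and the choice of a standard normal distribution for the regenerated weights are assumptions; the chapter specifies only the twenty-year interval and the ninety-eight/two percent split.

```python
import numpy as np

SPAWN_INTERVAL_YEARS = 20   # a Mike spawns after each twenty game-years survived
MUTATION_FRACTION = 0.02    # two percent of the weights are regenerated in the copy

def spawn_child(parent_weights, rng=None):
    """Return a child weight vector: 98% inherited, 2% randomly regenerated.

    Assumes the weights and biases of the parent's neural net have been
    flattened into a single 1-D NumPy array.
    """
    if rng is None:
        rng = np.random.default_rng()
    child = parent_weights.copy()
    k = max(1, round(child.size * MUTATION_FRACTION))
    reroll = rng.choice(child.size, size=k, replace=False)   # positions to mutate
    child[reroll] = rng.standard_normal(k)                   # fresh random values
    return child

def maybe_spawn(age_years, parent_weights, rng=None):
    """On each spawn anniversary, produce a mutated copy; otherwise return None."""
    if age_years > 0 and age_years % SPAWN_INTERVAL_YEARS == 0:
        return spawn_child(parent_weights, rng)
    return None
```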
More importantly, however, it meant a single game world could continue indefinitely. As new Mikes were spawned, they would compete against older versions for available resources. Children would inhabit the same world as their parents, making it possible for improved variants of the genome to overthrow entrenched tyrants. Perhaps later generations would learn to work together to achieve such a coup.
Physically, all Mikes were identical. None could evolve to be stronger or faster than the others, or able to survive with less food, or to eat something new. The only thing that changed was their behavior. A Mike could theoretically live forever in such a world, but the genome randomization should inevitably yield children that were smarter than their parents. And grandchildren that were smarter still. Eventually, the older generations, unable to compete, would be left behind to die.