9

Agent Nick Vernon found the blog posts written by Kestrel Hathaway nine years ago to be quite informative. And, though he tried to keep an open mind, he had to admit that they expressed views that were sympathetic to Purist ideology.

Purist-leaning.

Yes.

That was certainly how she came across.

She wrote of the “disquieting” nature of advancements in genetic algorithms that allowed machines to reprogram themselves and of the “breathtaking tragedy” of people handing over the autonomy of their lives to machines.

In his research, Nick discovered what was most likely the event that triggered her anti-technology views: the tragic, senseless death of her parents.

The two of them had been killed when a law enforcement Artificial fired live ammunition at them. The Artificial identified them as a threat, authorized the use of deadly force, and then took what it believed to be the most appropriate and responsible action. All of those choices were made with no human being in the loop to verify the decision.

The Artificial misidentified them as suspects who’d committed a multiple homicide at an airport security checkpoint and were considered armed and dangerous.

Before they could flee, it mowed them down with automatic fire, nearly ripping them apart with bullets.

More than a hundred shots fired in less than six seconds, all from a machine that had been designed to not miss its target.

And Kestrel had been present, traveling with them. She saw it all.

Her parents ended up becoming two of the early casualties of autonomous weapons before the glitches were worked out.

At the time, the Artificials would evaluate the situation, assess the threat, and decide who should live and who should die, but not always accurately.

Since then, advances in programming had almost completely eliminated false positives, and the risks of Artificials making their own decisions in the pursuit of the common good had been deemed by society to be acceptable—even preferable—to having humans inserted into the equation of making the final decision regarding the use of lethal force. It was more efficient, more cost effective, saved time, and was now the norm.

So, yes, Kestrel’s posts took a decidedly Purist turn after her parents were veritably shredded to pieces by dozens of bullets.

And Nick could understand why.

She wrote:

Human nature being what it is, technological advances, as benign as they might initially appear, will eventually be used in destructive ways. This isn’t to say that those developments should be stopped, but it is to say that mechanisms should be put in place to protect us, not from the machines or technology, but from ourselves.

Three conclusions naturally follow: (1) When machines are given autonomy, they will eventually make decisions that humans find undesirable. (2) The more we advance in areas of technology without ethical constraints or consideration, the harder it will be to one day rein in the undesired consequences of our past decisions. (3) If there’s a way to abuse a certain branch of technology, it will eventually be abused.

Constructing machines that emulate human nature or are imbued with “human values” will not make them more moral, but may very well make them more violent, harder to control, and less apt to take responsibility for immoral or unjustified actions—just as humans are.

When developing advanced AI, it’s just as important to understand human nature as it is to understand computer processes, because we are, in essence, faced with the daunting task of creating machines that are more virtuous than their creators.

Nick was aware of the ongoing debate among religious leaders and ethicists about which “values” to instill in Artificials. Muslims had quite different views on the matter than atheists did. The Buddhists didn’t agree with the Latter-day Saints. Hindus and Catholics were at odds—and how do you teach a machine to act in a moral manner when you can’t even agree on what morality is?

Should robots assist with abortions and suicides?

Should they carry out death sentences?

Should they be taught to break laws or use purposeful deception when there’s a greater good to be obtained by doing so? And how will they be programmed to know the difference?

When should they be forced to go against their conscience in the service of society as a whole? Or should that ever happen at all?

He agreed with Kestrel that designing machines to emulate human nature wasn’t the answer. Over the last twelve years in his job he’d seen the worst aspects of humanity exhibited on clear and constant display. Some people claimed that humans were naturally good, but that wasn’t the conclusion he’d come to.

The evidence just didn’t support it.

And in this job, evidence rather than wishful thinking needed to rule the day.

After all, children don’t need to be taught how to lie, cheat, steal, or be selfish or greedy. We all naturally know how to act in those ways, and we spend the majority of our lives unlearning those behaviors, trying to overcome our natural bent. Just study history and you’ll see: whatever good humans might sometimes do can’t even hold a candle to how bad we’ve been.

He read on.

Given autonomy, do human beings act altruistically, or don’t they?

Well, in truth, they do both.

Which is something many programmers don’t seem to acknowledge.

Since the earliest days of recorded history, human beings have been violent, proud, rebellious, broken people. Humans obscure the truth when it’s to their advantage, spread reputation-destroying gossip without a second thought, and judge others for the very things they themselves do when no one is looking. Despite the negative consequences to others, they exhibit varying degrees of psychopathy by trying to put themselves first in competitions of all sorts in all areas of life.

Technology makes our capacity to be human more possible, not only in the best ways but also in the worst. We can heal more diseases, relieve more pain, bring food and hope to the starving; and we can also torture people more exquisitely, exterminate them more efficiently, drive them insane more quickly, and even target viruses to wipe out entire races through focusing on a specific genetic code found only in people of that ancestry. We have honed our capacity for both good and for evil. This is human nature. This is who we are.

No, it’s not a brave new world we live in. It’s the same old world, as old as the human race, but more frightening than ever because by giving machines the ability to make moral decisions, we’re allowing them the chance to act just as nobly and as abhorrently as we do—both.

Nick had no way of knowing if Kestrel Hathaway’s views had evolved since she wrote those posts, but from his experience, a person’s beliefs in matters like these rarely remained static. They matured and softened, or they hardened and led to radicalism.

After more than a decade of tracking fascists and violent extremists, he’d learned that religious devotion will usually remain pretty harmless and benign as long as people assume they’re members of a certain religion because of their upbringing or their country of birth.

If they believe it’s simply their heritage or tradition, one that they’re part of by default, you have little to worry about. However, the matter becomes something else entirely when they choose their beliefs of their own free will despite their culture and formative experiences. Then it can, and often does, lead to radical ideology.

And those who abandon religion altogether could be the most dangerous of all, especially if they adhere to a naturalistic set of beliefs. After all, why show compassion to children dying of AIDS in Africa when it makes no evolutionary sense to do so? Why care for the elderly? Why not abort or euthanize those who are impotent or genetically inferior? Though few like to admit it, the logical consequence of humanistic thinking inevitably leads to eugenics or genocide in one form or another.

He scrolled to Kestrel’s final blog entry.

We fear silence and we fear suffering. Consequently, we fill our lives with constant diversions to slay the solitude that so graciously pursues us.

Noise and comfort—these we spend our lives obsessively seeking, though we already know that they do not lead to enlightenment.

So many of our technological advances are simply a means of fighting against those two things that we fear most—silence and suffering. And yet, those are the very things that the wisest among us have, over the years, pointed to as the most trustworthy pathways to wisdom.

New avenues of virtual and augmented reality and new breakthroughs in music and video streaming directly into your brain through implants offer a constant diversion from having to ask yourself the big questions about life and meaning and hope and existence and God.

We fight what we know makes us wise in order to distract ourselves from the inevitable death that awaits us and that we cannot bear to face.

This is a form of insanity. And it has infected us all.

Nick considered her words.

Purist-leaning or not, she did have a point. It’s not crazy or illogical to spend time thinking about and preparing for your death; it would actually be crazy not to.

However, he had a case to solve here, and though her views were striking, they were not necessarily pertinent.

But they might be.

He refocused his attention on the case and her possible involvement in it.

He and his team had been given the assignment of locating the individuals who’d orchestrated the attack against the Terabyne plant that had left five dead, and that’s what he was going to do, regardless of who ended up being responsible.

Even if it was her.

And it didn’t look like this was going to be an isolated event.

His unit had uncovered chatter referring to another attack this weekend, one that, if the information was accurate, would be much larger.

There wasn’t any more specific information yet, but there was enough data to raise red flags and bump up the threat level at the Bureau.

Solving this current crime gave them the best chance at stopping that future one.

So then, Kestrel Hathaway.

Who was she, really, and what was her involvement? Innocent bystander, or radical Purist sent to ensure that the attack went as planned?

This he knew: Murderers often return to the scenes of their crimes.

Arsonists return to watch their fires burn.

And terrorists? They also like to watch, and often prefer being present when their bombings occur.

Just like Miss Hathaway had been.

Yes, her blog posts and the views they espoused were nearly ten years old, but maybe she still had ties to the groups that’d posted them. If nothing else, even if she didn’t still harbor those beliefs, she might have contacts that could help him find out who did.

Nick didn’t want to get ahead of himself or make unfounded assumptions that would prejudice his investigation, but he did want to speak to the woman and to follow the evidence no matter where it led.

He called Ripley and asked how it had gone with Ethan Bolderson.

“Unfortunately, he passed away before I had the chance to speak with him.”

“I thought he was in stable condition.”

“From what I understand, it was a rather sudden downturn.”

Nick took that in. “I’m sorry to hear that.”

“I’m having the team follow up on the contents of his computer,” Ripley said. “Apparently there were some files that might implicate him in the attack.”

“I’ll want to take a look at those myself as soon as possible.”

“Of course. With him dead, that just leaves us with the woman.”

“Where are you now?”

“On my way to speak with her.”

“Swing by here.” Nick gathered his notes. “We’ll talk to her together.”