François Jaquet and Florian Cova
Philosophers rarely discuss our moral obligations towards artificial entities. When it comes to robots and ethics, most writings focus on what morality should be implemented in the machine’s “mind” (something Westworld alludes to when mentioning the hosts’ “prime directives,” which prevent them from hurting the newcomers). Luckily, another well‐developed part of philosophy is concerned with our attitudes to entities we are not indifferent to but still treat as less important than human beings: Animal ethics.
Our attitude to animals is similar to the attitude Westworld has us adopt vis‐à‐vis the hosts: We often deem animal suffering acceptable because it improves our well‐being but still feel upset when an animal is mistreated just for the sake of it. The show itself draws an explicit comparison between the hosts and animals on two occasions: When the Man in Black calls Kissy “livestock” and when the people who handle the robots are referred to as “livestock management” (“The Original”). Moreover, the confinement of the hosts in the park is reminiscent of the way livestock are raised in closed areas, without the possibility of escape – Westworld may invite its guests to “live without limits,” but this invitation manifestly does not extend to its attractions. Drawing on research in animal ethics, we can address the question “Is it okay to treat the hosts as the park managers and guests do?”
Westworld does not offer a straightforward answer to our question. Some characters, like William, sense that there’s something wrong in abusing the hosts. At the beginning, he’s troubled by the violent behavior of his brother‐in‐law. And despite his later turn to the dark side, he doesn’t actually change his mind, fully acknowledging the immoral character of his own actions. This is especially apparent when he recalls what he did to Maeve in her previous life: “I found a woman, an ordinary homesteader and her daughter … I wanted to see if I had it in me to do something truly evil. To see what I was really made of … I killed her and her daughter, just to see what I felt” (“Trace Decay”). Still, this point of view isn’t very popular among the guests, as most of them are perfectly willing to hurt, rape, and murder the hosts. So, are William’s reservations warranted? Or is he only prey to a mix of sentimentality and anthropomorphism?
Speciesism is the view that human well‐being matters more than that of other creatures. One justification for this view attempts to ground human beings’ special moral status in their membership in the human species itself. Accordingly, we deserve special consideration simply because we are Homo sapiens .
Some of Westworld ’s characters are visibly tempted by this kind of justification. Logan is a prime example. Right after Logan shoots one of the hosts, William exclaims: “You just killed an innocent man,” to which Logan readily answers: “No, he’s a robot” (“Dissonance Theory”). The fact that the victim isn’t human seems to be decisive in his opinion. As long as someone doesn’t belong to the right species, we are welcome to treat them however we please. Later, declining Logan’s invitation to accompany him to a bawdy house, William, ever the gentleman, remarks: “I don’t think Dolores would find that very interesting.” While he is presumably correct, it takes a bit more than that to deter Logan. In a characteristically rude fashion, the brother‐in‐law insists: “Who the fuck cares what Dolores wants? She’s a goddamn doll!” (“Contrapasso”). And in a sense, Dolores is indeed a doll. But is that relevant to how she should be treated? Does it mean, as Logan implies, that he and William may ignore her wishes?
Merely biological differences – those biological differences that bear no relation to people’s mental capacities or to their interests – are generally considered morally irrelevant. They simply cannot justify discrimination. The darkness of Charlotte’s skin, for instance, doesn’t affect her mental capacities or interests. Consequently, it cannot justify granting her interests less consideration than Theresa’s. Likewise, the fact that the Man in Black is a male doesn’t entail that his well‐being matters more than that of a hypothetical Woman in Black. Some philosophers, therefore, have argued that membership in the human species cannot be morally relevant – it’s just another merely biological property. 1 That you are a human being rather than a chimpanzee does not grant you a superior moral status. But then, it would seem, the fact that Logan is a human being rather than a robot cannot grant him any special moral standing. Dolores may well be a doll, but her interests matter just as much as Logan’s. As such, the biological difference between humans and robots is morally irrelevant.
Some defenders of speciesism concede that there’s nothing magical to the human species per se . Yet, they insist that we have greater duties to human beings insofar as they are members of our species. 2
Animals, too, would have greater duties to members of their particular species if they had duties at all. On this view, human beings don’t have a privileged moral status in absolute terms. Rather, just as we have special duties to our friends and kin, we have special duties to members of our own species. This view is often dubbed “indexical speciesism.” 3
Westworld ’s human characters do not rely on indexical speciesism, but some hosts do. Thus, after she’s started to seriously question her condition, Maeve thinks precisely in terms of “us vs. them.” Talking to Bernard, whom she knows to be one of her kind, she says, “I could make you give me that tablet, turn your mind inside out, make you forget all this. But I’m not going to do that to you, because that’s what they would do to us” (“The Well‐Tempered Clavier”). Here, Maeve appears to treat membership in her group (not her species , since she doesn’t have one) as morally relevant. If Bernard weren’t a robot, she would probably treat him in all sorts of unpleasant ways. But wait: If it’s okay for the hosts to privilege their fellows, it should be okay for humans as well.
Assuming for the sake of argument that this justification works, it couldn’t account for every sort of discriminatory treatment. Maybe you ought to rescue a friend from a fire rather than a stranger if you can’t save both. But if a friend of yours is in need of a transplant, you cannot kill a stranger to collect their organs. Similarly, this justification wouldn’t warrant the abuses to which the hosts are subjected in Westworld. Be that as it may, indexical speciesism is ultimately no more justified than absolute speciesism. Yes, we must be loyal to our friends. But this stems from the banal observation that loyalty is essential to friendship, that friendship would not be possible in a strictly impartial world. On the other hand, membership in a merely biological category such as the male sex, the Caucasian race, or the human species plainly doesn’t rest on such partial attitudes. 4
Speciesism, therefore, cannot be justified directly, either in indexical or absolute terms. Whether Dolores, Maeve, and Teddy are human beings is irrelevant; it’s their mental capacities that matter.
So, can the newcomers’ speciesism be justified in a less direct way? Do human beings have a higher moral status in virtue of a property that is unique to them? And is it morally permissible to discriminate against the hosts because they lack this property?
One common argument in defense of speciesism is that non‐human animals are far less intelligent than human beings. Of course, we won’t treat rats and flies as our equals; these animals are way too dumb to deserve our respect. Whatever we may think of this reasoning, it’s unclear that it applies to the hosts since some of them are much smarter than ordinary human beings. This is admittedly not true of Teddy. But think of Maeve after she has boosted her own cognitive abilities. Compare her to Ashley, and you’ll get a nice counterexample to the claim that humans invariably beat robots in IQ contests.
Another feature of human beings that is often thought to have moral significance is possession of a moral sense: We are uniquely capable of deliberating about the morality of our actions and forming moral judgments as a result. 5 From this reasonable observation, some people don’t hesitate to infer that animals are not worthy of our consideration. How could we have duties to beings that owe us nothing in return? Again, however, this argument doesn’t weigh heavily when it comes to the hosts, who seem to have a moral sense roughly comparable to ours. If they didn’t have a moral sense, then why would Bernard anxiously wonder whether he had already hurt someone before killing Theresa (“Trace Decay”)? And why would Teddy, remembering his participation in the extermination of his troop, worry: “Something’s gone wrong, Dolores. How could I have done this?” (“The Bicameral Mind”)? Both characters seem confident that they have done something immoral, and appear to feel guilty as a consequence. This is possible only if they have a sense of right and wrong.
Human beings are no less proud of another supposedly unique characteristic: Our free will. Whereas even the most evolved of our non‐human cousins are said to be determined by their instincts, we are supposedly free to act however we decide. Some philosophers believe that this difference sets us in an altogether distinct ontological category, and that this has deep ethical implications. 6 Because we are persons, ends in ourselves, we have a dignity whose respect is at the very core of ethics. Other creatures, by contrast, are mere things, tools that we may use however we please in order to achieve our goals. Westworld ’s characters have interesting discussions about free will. Some hosts wonder whether they are free, such as Dolores when she says, “Lately, I wondered if in every moment there aren’t many paths. Choices, hanging in the air like ghosts. And if you could just see them, you could change your whole life” (“Contrapasso”). Others seem certain that free will is specific to humans, such as Robert, whose take on Arnold’s murder by Dolores is unambiguous: “She wasn’t truly conscious. She didn’t pull that trigger. It was Arnold pulling the trigger through her” (“The Bicameral Mind”). Likewise, Lutz tells Maeve, “Everything you do, it’s because the engineers upstairs programmed you to do it. You don’t have a choice.” Unsurprisingly, she disagrees: “Nobody makes me do something I don’t want to, sweetheart” (“The Adversary”). Most hosts are no less convinced of their own freedom than we are convinced of ours.
In Westworld , all dialogues about free will focus on whether the hosts are determined or whether they could act otherwise. Yet, many philosophers no longer regard this issue as central to the problem of free will. Most agree that human beings are determined. If we ever act freely, we only do so in the sense that some of our acts are caused by our desires – although those desires themselves are ultimately determined by external factors. In this “compatibilist” understanding, we are both determined and free. 7 Given this understanding, it is hard to resist the conclusion that the hosts are free as well, at least as free as us. Admittedly, their actions are fully determined and they owe their desires to their designers. Still, sometimes at least, their acts result from their own desires. Hence, if compatibilism is correct, then hosts are just as free as humans are.
How much depends on the assumption that the hosts are as intelligent and free, and have as much of a moral sense, as human beings? Not much, really. It is generally considered unjust to neglect someone’s interests on the ground that they are cognitively unsophisticated. Some human beings are less intelligent than others, yet we strongly believe that neglecting their interests would be unfair. But then, considering that species membership itself is morally irrelevant, if Stephen Hawking’s well‐being doesn’t matter more than Sarah Palin’s or a newborn’s because of his superior intelligence, why should it matter more than a dog’s or a cow’s? And why would it matter more than Teddy’s or Dolores’s? Once we recognize that human beings are far from equally intelligent, the idea that they matter more because of their intelligence becomes suspicious. And the same observation applies to free will and the moral sense: Whether someone can act freely or has a moral sense is immaterial to the consideration that is due to their interests. 8
On the other hand, much hinges on consciousness. There is a reason for this: Something that doesn’t have any subjective experience – something that isn’t the subject of any sensations, emotions, or feelings – has no interests that could be taken into account, no well‐being that could matter more or less than a human being’s. Animal ethicists thus generally agree that it is because most animals are conscious (or sentient) that we must grant their interests the same consideration we owe to humans’ similar interests. It is therefore a crucial question whether the hosts are conscious: Are they subjects with a first‐person perspective or are they just highly advanced computers, very complex bits of matter? This question receives all the attention it merits in the show. In particular, it’s the main topic around which Robert and Arnold’s arguments used to revolve. Its complexity is reflected in Robert’s ambivalence on that matter, an ambivalence that is perfectly summed up by the sentence: “It doesn’t get cold, doesn’t feel ashamed, doesn’t feel a solitary thing that we haven’t told it to ” (“The Stray,” emphasis added).
Arnold had no such misgivings. As Robert recalls, “He wasn’t interested in the appearance of intellect or wit. He wanted the real thing. He wanted to create consciousness” (“The Stray”) – until he thought he had. But Robert remained unconvinced, and maintained that “He saw something that wasn’t there” (“The Original”). In an ironic passage, he even tries to persuade Bernard, who still believes in his own humanness, that “The hosts are not real. They’re not conscious” (“The Stray”). Later on, pressed by Bernard to tell him what the difference is between their respective pains, Robert goes so far as to deny consciousness to human beings as well:
We can’t define consciousness because consciousness does not exist. Humans fancy that there’s something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next. No, my friend, you’re not missing anything at all.
(“Trace Decay”)
This looks very much like an exercise in bad faith, and it turns out to be just that in the season’s finale. There, we learn that Robert knew from the beginning that the hosts were conscious, as he confesses, “I was so close to opening the park that to acknowledge your consciousness would have destroyed my dreams” (“The Bicameral Mind”). Case closed.
Because they are conscious, the hosts are plainly members of the moral community, alongside human beings and other animals.
In response to an ethical argument for vegetarianism, most people will contend that eating meat is “natural,” that it is part of our very nature. As odd as it might first appear, Robert pushes such an argument in one of his discussions with Bernard:
We humans are alone in this world for a reason. We murdered and butchered anything that challenged our primacy. Do you know what happened to the Neanderthals, Bernard? We ate them. We destroyed and subjugated our world. And when we eventually ran out of creatures to dominate, we built this beautiful place.
(“The Well‐Tempered Clavier”)
In a somewhat Nietzschean spirit, Robert takes the pinnacle of artificiality (the park and its hosts) and makes it the simple product of a natural, innate urge: The human drive to dominate. 9 Whatever the merits of this psychological analysis, however, the mere fact that something is natural doesn’t make it morally right. In fact, we try to remedy many things that could be considered natural. That our lives would be short in the absence of medicine, or that giving birth puts women’s lives at risk, is no reason not to fight premature death or maternal mortality. Natural is not necessarily right; this much should be clear.
One notion that might be implicit in the argument from nature is that of a “role.” Maybe what makes it okay for humans to exploit animals is that it is the latter’s role or function to be exploited: That’s what they were created and designed for . Of course, this argument won’t speak much to atheists, who will laugh at the claim that animals were designed for anything. But that objection doesn’t apply to the hosts: They were created for a reason – to satisfy the guests’ appetites. As the Man in Black explains to Teddy:
You know why you exist, Teddy? The world out there, the one you’ll never see, was one of plenty. A fat, soft teat people cling to their entire life. Every need taken care of, except one: purpose, meaning. So, they come here. They can be a little scared, a little thrilled, enjoy some sweetly affirmative bullshit, and then they take a fucking picture and they go back home.
(“Contrapasso”)
Since the park was created precisely to please the guests, it’s all right for them to abuse the hosts. Or so the argument goes.
As for Robert, he opportunistically relies on a similar, Lockean argument: Creatures belong to their creator. 10 Here’s his rejoinder to Bernard’s complaint that he has “broken his mind”: “I built your mind, Bernard. I have every right to wander through its rooms and chambers and halls, and to change it if I choose, even to burn it down” (“The Well‐Tempered Clavier”). Obviously, there are strong reasons to resist this argument. One is internal to the show: In light of Robert’s final revelations, it’s not clear that the hosts were primarily built for the pleasure of the guests – Arnold seemed to entertain nobler ambitions.
Other reasons are more philosophical: That you decided to have a child because you needed help at home doesn’t make it right for you to have her do all the cleaning. Robert’s argument actually works only if we assume that the hosts are more like objects (tools that can be used as means) than they are like persons (entities that deserve our respect), which we have already seen is not the case.
One final, more refined argument would justify the hosts’ exploitation by their very existence. Some people thus admit that animals suffer from their exploitation but argue that things would be even worse for them in the alternative: They would either live a harsh life in the wild or not exist at all. 11 The same can be said for the hosts: If it weren’t for the newcomers’ pleasure, they wouldn’t exist in the first place, since building them would not be a sustainable business.
This might be the strongest justification of the hosts’ exploitation. And yet, it’s inconclusive. What would we think of abusive parents who’d justify their behavior by saying that they had children for the purpose of abusing them? Besides, is the hosts’ existence better than non‐existence, as this argument presupposes? Quite conveniently, Robert bites this bullet: “Their lives are blissful. In a way, their existence is purer than ours, freed of the burden of self‐doubt” (“Trompe L’Oeil”). As usual, Arnold disagrees. He’s adamant about this when Dolores wonders whether he’s going to reset her: “This place will be a living hell for you” (“The Bicameral Mind”). Interestingly, Robert appears to share his friend’s pessimism for once when, later, he reports his worries about Dolores: “In you, Arnold found a new child. One who would never die. The thought gave him solace until he realized that same immortality would destine you to suffer with no escape, forever” (“The Bicameral Mind,” emphasis added).
Finally, assuming that maintaining the hosts’ existence could justify hurting them in principle, almost no one seems to really pursue this goal – only Robert apparently believes that what he does is for their greater good (to let them know their enemy). It would therefore be extremely difficult to justify the guests’ behavior along these lines. And this might be one of the most disturbing aspects of Westworld : That faced with the miracle of a new life form, most humans still care only about their kind. 12