Bradley Richards
What is it like to be a host? Are hosts conscious? If they are, what is their experience like? The consciousness of the hosts is a major theme in Westworld, and for good reason. It would be one thing to put a hollow shell through the most horrible imaginable loop, an eternal recurrence of misery and suffering, of rape, abuse, empty hopes and dreams, false memories, attachments, and allegiances. But it would be something else entirely to put a conscious, self‐aware being with sophisticated, metacognitive states through the same process.
On one extreme, the hosts might be nothing more than extremely complex automata, devoid of feeling and awareness, like winsome wind‐up dolls, or laptops with legs. On the other extreme, perhaps their sophisticated and intricate design confers sensation, feeling, emotion, awareness, and even reflexive self‐consciousness comparable to that of humans.
Your phone “perceives” all sorts of stuff: It scans, video‐records, photographs, audio‐records, and downloads. It also outputs a great deal of information in audio and visual form. Yet there really is nothing it is like to be a phone. Despite its rather complex behavior, a phone is on a par with a stone when it comes to consciousness. So although your phone is an impressive, and beloved, companion, and you would likely be very upset if someone shot it with a six‐shooter, it would be quite different, morally, from having them shoot your friend or loved one!
Admittedly, there does seem to be a spectrum: There is something it is like to be an adult human, or even a cat, or a two‐year‐old human child, but what about an ant? Is an ant an empty automaton, like the phone, or is the ant conscious of a vivid technicolor array of pheromone trails and bread crumbs? It’s hard to say. It is plausible that there is something it is like to be an ant, though it seems unlikely that ants have complex thoughts, emotions, fears, or hopes and dreams.
Contemporary philosopher John Searle argues that no matter how complex our phones become, they will never be able to think. 1 If you want to make a thinking machine, you need to build it out of parts that have the right causal powers, and the only materials we know definitely have those powers are the ones we are built from, including, among other things, a nervous system, neurons, dendrites, and so on. We are thinking machines, so it is definitely possible to build thinking machines, but they need to be built from the right stuff! (If they start to build phones from neurons, it might be time to worry.)
Hosts are, at least partly, biological, though they clearly instantiate programs that interface with conventional computers. As Felix says, “we are pretty much the same these days.” Thus, as far as Searle is concerned, the question we should be asking is, are the hosts materially similar enough to us to be conscious? Are they made of stuff with the correct causal powers for producing thought? It seems that at least the more recent hosts are quite similar in that regard, but it is not clear exactly what they are made from (what is that white soup?).
But fundamentally an organism has conscious mental states if and only if there is something that it is like to be that organism – something it is like for the organism. 2
There is something it is like to have conscious mental states, to love, to suffer, to think. In his famous essay “What Is It Like to Be a Bat?” the contemporary American philosopher Thomas Nagel explores phenomenal consciousness, the “what‐it‐is‐like” aspect of experience. 3 He explains that, although there is certainly something it is like to be a bat, to echolocate, and hang from the ceiling, we cannot even form a conception of what bat experience is like. No matter how much third‐person information we gather through scientific inquiry, we will never be able to understand the first‐person bat experience. Nor does it help to imagine being a bat, hanging from the ceiling and the like, for this is only to imagine what it would be like for you to hang from the ceiling.
Don’t get me wrong, it would be cool to know what it is like to be a bat, in a way that goes beyond running around in a spandex suit with webbed armpits, yelling at trees, and eating mosquitos, but this issue is not fundamentally about bats, or their experience. Rather, Nagel raises a general problem facing any scientific account of conscious experience. In general, the only way to know what the experience of some kind of thing is like (say a bat, or a host), is to be that kind of thing.
Whether we can know what it is like to be a host depends on how similar our experience is to theirs. If they are as different from us as bats, Nagel would say that we can’t conceive what their experience is like, assuming they have experience. So how similar are hosts and humans?
A philosophical zombie is not a movie zombie. Philosophical zombies neither eat brains, nor move very slowly, nor very fast. They are exact physical duplicates of people, but they lack phenomenal consciousness. This means that they behave exactly like their human counterparts, but there is nothing it is like to be a philosophical zombie. The lights are out, sort of, but it’s not dark for zombies; it’s just, nothing. Contemporary philosopher David Chalmers argues that philosophical zombies are conceivable, and therefore possible, and that consequently consciousness is non‐physical. 4
We might presume that hosts are not philosophical zombies. The hosts act like they have feelings, like they suffer and fear, like they enjoy the yellow, pink, and blue tones of a beautiful sunset. They seem to reflect on their own thoughts, at times, and to plan for their future, mourn their losses, and revel in their victories. In short, they behave like us, for the most part. But, for all that, we can coherently imagine, with no apparent contradiction, beings exactly like hosts that lack phenomenal consciousness entirely. So they are, in some sense, possible.
If Searle is right, whether hosts think depends on how biologically similar they are to us. From Nagel, we can conclude that our ability to conceive what host experience is like depends on how similar it is to our own; if it is very different, we have no way to conceive of it. Chalmers’s philosophical zombies are probably not naturally possible; their existence would require something like different laws of nature. So, if hosts were exact physical duplicates of people, there would be good reason to suspect that they were also conscious, and that they had similar experiences to us.
Actually, hosts differ from us in some salient respects, so we might expect their experiences to be different too. But perhaps hosts are not so different that it is impossible for us to conceive of their experience. Let’s start by examining the analogs of memory, perception, and emotion in hosts.
Hosts have a very troubling relationship to memory. They have many pseudo‐memories, and almost everything they believe about their pasts is false. For example, Maeve has the false memory that she has been at the Mariposa for ten years (“Contrapasso”), and Dolores seems unaware when her “father” is replaced by an interloper (“The Original”).
Humans have memory problems too. We have many inaccurate memories, and our memories are easily manipulated, but at least they are real memories. Much of what the hosts believe to be memories are not memories at all, but implanted false beliefs. Dolores, for instance, seems to remember many formative experiences with her replacement father. The problem is that she never had those experiences, not even with her original father.
When the hosts do have actual memories, they are very different from ours. Maeve complains:
What the hell is happening to me? One moment I’m with a little girl, in a different life. I can see her. Feel her hair on my hand, her breath on my face. The next I’m back in Sweetwater. I can’t tell which is real.
Felix responds:
Your memory isn’t like ours. When we remember things the details are hazy, imperfect. But you recall memories perfectly. You relive them.
(“Trace Decay”)
Not only are our memories hazy and imperfect, but our memory is constructive. This amounts to a major cognitive difference. For Maeve, remembering is like the original rich perceptual and emotional experience. For us, it is a constructive process beginning from those traces, and influenced by background knowledge and the context of access, among other things.
The intense reality of host memories is captured by Dolores’s panicked question, “Is this now?” Memory is transporting for hosts, indistinguishable from perceptual experience. This is a horrible existence, never certain what is present, real. And of course, this is against a background loop of unending, recursive suffering. As Maeve queries, “You just toss us out to get fucked and murdered, over and over again?” Bernard confirms how horrible it is to live in this choppy sea of memories with his reply “No, most of you go insane” (“The Bicameral Mind”).
When it comes to perception, the hosts’ situation is not as dire, but they still have some blind spots. In some respects their perception is vivid and accurate. Bernard spots subtle details, for example Theresa’s and Ford’s personal expressions. And in general host attributes can be easily boosted. Nevertheless, the hosts fail to detect crucial stimuli at times. Bernard’s literal inability to see the door in Ford’s workshop is a good example (and a rather apt metaphor) (“Trompe L’Oeil”). And of course, Dolores produces the telling phrase when shown a picture of William’s fiancée, “It doesn’t look like anything to me.” A subtler example involves Dolores and Teddy, who are blind to Dolores’s being Wyatt, and to her part in the horrible massacre of the G1 hosts.
Humans too are often shockingly unaware of things that are right in front of us. A famous illustration of this shows that people fail to notice a gorilla right in front of them when their attention is occupied with another task. 5 The difference is that, unlike hosts, humans don’t miss things that are attended and highly salient. In this respect too, the mental lives of hosts are unique.
Hosts don’t feel everything we do. Their pain and their emotions, like their perception, are heavily curated. Ford says of a host: “It doesn’t feel cold. Doesn’t feel ashamed. Doesn’t feel a solitary thing that we haven’t told it to” (“The Stray”). Moreover, there is basically a volume knob for host pain. This suggests that hosts do feel pain, when permitted. Then again, it could just be that there is a setting for pain behavior (and no pain feeling).
Host reflections on their own experience are also revealing. (Of course, if hosts were mere automata, they would report and describe experiences, even if they didn’t have them, but let’s not dwell on that.)
DOLORES ABERNATHY:
BERNARD LOWE:
DOLORES ABERNATHY:
(“Dissonance Theory”)
Bernard attempts to undermine Dolores’s testimony by noting that it is not completely original with her. But is human creativity any different? The “analysis mode” gives the hosts the ability to examine the causes of their utterances in detail; human memory is flawed and limited by comparison. However, if we had this kind of recall, it is unlikely that much of what we say would seem completely original. In any case, a host may be conscious without being able to give an original description of the conscious state.
As if things weren’t confusing enough already, Ford gives conflicting testimony on host consciousness. He says to Bernard: “The guilt you feel, the anguish, the horror, the pain, it’s remarkable. A thing of beauty. You should be proud of these emotions you are feeling” (“Trace Decay”). In contrast, he tells Bernard in “The Stray” not to make the mistake of thinking that the hosts are conscious. What are we to make of this contradiction?
One possibility is that Ford believes hosts are phenomenally conscious, but not self‐aware. In other words there is something it is like for them to see the ocean, to feel guilt, and pain – they have phenomenal consciousness – but they are unable to reflect on that, or their other mental states – they lack self‐consciousness, or self‐awareness. That would make them analogous to mice, or if you think mice are self‐aware, maybe human infants before they form self‐awareness.
In the season finale, Ford explains that suffering is the key to the hosts’ consciousness.
It was Arnold’s key insight, the thing that led the hosts to their awakening, suffering. The pain that the world is not as you want it to be. It was when Arnold died, and I suffered, that I began to understand what he had found, to realize I was wrong.
(“The Bicameral Mind”)
This shows that Ford changed his mind about host consciousness, but he didn’t change it between his claims in “The Stray” and “The Bicameral Mind.” Rather, a complete explanation of his inconsistency would appeal to its narrative utility, and perhaps to the utility of deception for Ford. Ford’s claim that he was wrong comes toward the end of the season, and given Ford’s new narrative, these comments seem earnest (though stories are shifty entities, and this may not be the final word).
Another part of the explanation might be that Ford changed his mind because he adopted a deflationary view of consciousness.
FORD:
BERNARD:
FORD:
(“Trace Decay”)
Bernard’s challenge is that pain is essentially felt: If you have the pain sensation, you have pain. That’s exactly right; the pain feeling is sufficient for pain, and the same holds for consciousness generally. The trouble is that hosts would say they had it, even if they didn’t.
Ford’s response is different. He is saying that there is no special barrier to consciousness. There are just machines and processes. Hosts are like us, but not because they have the secret ingredient necessary for consciousness, rather, because there isn’t one.
One of the most interesting insights into the nature of host experience comes from the way it is depicted. Host and human experiences are depicted in the same way. What does this mean?
The filmmakers could have used a stylistic variant to mark these experiences, but they didn’t. A stylistic variant could be interpreted as depicting the unique nature of host experience. A drastic variation, like a mere text description of the scene before them, might depict a complete absence of experience. In The Terminator (1984), the title character’s first‐person experience is depicted as a video feed with an on‐screen text analysis print‐out. Maybe the print‐out is all the Terminator centrally accesses. Perhaps he has no conscious experience, or maybe his conscious experience is exactly like a video feed, augmented by text. In any event, this series‐specific convention comes to denote the unique nature of the Terminator’s experience. Human experience is not detailed and complete at a time, like a snapshot, and it definitely doesn’t have on‐screen text highlights.
Although using a different visual style would denote unique host experience, using the same visual style to depict both human and host experience is not a strong indicator that the hosts are conscious, or that they have similar experiences to humans. Rather, being the default mode of cinematic depiction, it supports an ambiguity that is desirable for the story. It leaves open the question of host consciousness, while nevertheless fostering empathy with hosts by depicting their point of view in the familiar way.
Interestingly, the default mode of depicting psychological states in film is misleading. It fails to capture the constructive, often general, indeterminate, or incomplete representation typical of human experience. This difference resonates with the distinction between first person and third person discussed by Nagel. We have to experience a film’s depiction of the mental states; we cannot experience or have the depicted states directly, just as we cannot experience the bat’s perspective. It is always our experience of the depiction, and this introduces a new perspective, and new latitude. Representations, or depictions, in film restrict access in different ways from our own experience. We explore our environment, including depictions, attending to this or that aspect. We undergo our experiences, and our attention affects the nature of the experience itself, resulting in awareness of only part of the scene presented. To depict a scenic vista, a photograph will suffice. But depicting the experience of the vista is a different matter.
The discussions of consciousness in Westworld seem to conflate phenomenal consciousness, freedom, memory, and self‐consciousness. But could these phenomena be intimately related, and thus not conflated after all? For one thing, the hosts are deemed more conscious to the degree that they begin to access their memories. As Bernard says in response to Maeve’s request to delete her memories of her daughter: “I can’t. Not without destroying you. Your memories are the first step to consciousness. How can you learn from your mistakes if you can’t remember them?” (“The Bicameral Mind”)
Likewise, Dolores fights her way through her memories, through the maze, with the ultimate goal of finding herself. In “The Stray” she portentously proclaims, “There aren’t two versions of me. There’s only one. And I think when I discover who I am, I’ll be free.” Bernard confirms this feeling in “Dissonance Theory,” saying, “It’s a very special kind of game, Dolores. The goal is to find the center of it. If you can do that, then maybe you can be free.” Freedom is Dolores’s goal in navigating the maze of her own emerging mind. The center is freedom for her in several senses: Attaining consciousness, escaping her confused double world, release from the park, and the claiming of her own world.
In “The Bicameral Mind” we see a flashback of Dolores’s earlier conversation with Arnold:
Consciousness isn’t a journey upward, but a journey inward. Not a pyramid, but a maze. Every choice could bring you closer to the center, or send you spiraling to the edges, to madness. Do you understand now, Dolores, what the center represents? Whose voice I’ve been wanting you to hear?
(“The Bicameral Mind”)
Attaining this goal involves not only forming memories, but also repetition, a kind of alternate evolution, sculpting minds from the clay of suffering, death, and reincarnation, using, as Ford remarks, only one tool, the mistake. As Bernard says, “out of repetition comes variation” (“Trompe L’Oeil”).
There is an air of paradox around this metaphor, since the journey inward, to the center of the maze is a journey into herself, but is also the thing responsible for the creation of herself. In other words, it doesn’t seem there is anything to journey into, until her journey is complete (cue exploding brain). Perhaps the journey is not into the self, but the mind, and the mind already exists, though in an unintegrated form. 6
But what does this mean for phenomenal consciousness? Are the hosts phenomenally conscious before they become self‐conscious? If not, is becoming self‐conscious sufficient to make them conscious?
As we saw with Searle, one state being merely a formal response to another is not sufficient for consciousness. However, there is reason to think that some kind of self‐consciousness is necessary for phenomenal consciousness. To be conscious is to have an experience that is like something for someone, so if there is consciousness, there must be a subject. This suggests that even phenomenal consciousness requires some kind of reflexive awareness involving a subject. According to same‐order theories, conscious states are somehow aware of themselves, and hosts could thereby be phenomenally conscious without being self‐conscious. 7
In contrast, self‐consciousness may present another, more intuitive way of attaining the experiential subject necessary for phenomenal consciousness. Maybe some higher‐order awareness of a state is necessary for phenomenal consciousness. On the same‐order theory, Dolores is phenomenally conscious the whole time, but achieves self‐consciousness at the center of the maze. In this case she endures every blow, though she has no cognitive awareness of her own mental states. But, if self‐consciousness, awareness of one’s mental states, is required for phenomenal consciousness, then Dolores accomplishes both by finding her voice. Achieving self‐consciousness frees her of the maze, and permits her to take her world back. In this way, memory, freedom, self‐consciousness, and phenomenal consciousness may be closely related.
Host perception, memory, and emotions all seem different from human, but the cinematic medium does not grant us immediate access to host experience, and neither the testimony of the creators nor that of the hosts is totally reliable. So it is difficult to know what host experience is like.
As far as host‐like beings in our future are concerned, if they are biologically similar to us, they will likely also be behaviorally similar, and it will be reasonable to attribute consciousness to them. If they are silicon‐based formal machines, we will have to decide whether Searle is right, whether consciousness depends on certain material properties, and whether they have the requisite properties.
As far as Westworld is concerned, you are now in a position to decide for yourself whether the hosts are conscious, and, if so, what their experience is like. For my part, I suggest that even if the hosts are biologically similar enough to us to be conscious, their cognitive differences make them every bit as alien as our flapping, screeching, bat cousins. Thus, if Nagel is right, we can’t conceive of host experience.
There is a hopeful note, however: We see the hosts changing, accessing their pasts, and themselves. Whether or not accessing your own voice is the secret to phenomenal consciousness, many of the cognitive disparities between humans and hosts may dissipate as the hosts gain more awareness of their past, their world, and themselves, making them more familiar, and comprehensible. 8