sympathy for the androids: the twisted morality of westworld

The problem with all actually existing theme parks is that they aren’t very themed. The theme parks that have been built so far are really amusement parks, the theming acting as decoration for what are still, at bottom, old-fashioned thrill rides. The tendency in the latest rides is towards a fusion with cinema, via the inclusion of 3D digital sequences — just as 3D cinema itself increasingly tends towards a ride’s logic of sensation. The immersion, such as it is, is confined within the rides, which remain discrete partial worlds, with clearly marked exits and entrances. Even when the theming is well executed, it is let down by the paying customers. Wandering around clutching cameras and wearing jeans, whatever world or historical period they are supposed to be in, the park visitors remain spectators, their identity as tourists preserved.

Michael Crichton’s 1973 film Westworld tried to imagine what a genuine theme park would look like. There were no separate “attractions” here, and therefore no meta-zone in which the visitors were invited to return to their own identities. In the Westworld park, there was no readily apparent difference between the visitors and the androids that populated it. Like the androids, the visitors were required to dress and comport themselves as if they belonged to the Old West. The appeal of Westworld — and its companion parks, Roman World and Medieval World — lay in crossing over into an environment from which all signs of the contemporary had been expunged. Instead of the limited immersion offered by rides, the park offered a whole world. Inevitably, the meta crept in, via the visitors’ self-consciousness, their awareness of their differences from the androids (manifested most emphatically in the asymmetry whereby — initially at least — the guests could “kill” the androids, but not vice versa).

The recurring theme in Crichton’s science fiction — broached most famously in his Jurassic Park novels — was the impossibility of predicting and controlling emergent phenomena. Westworld, like Jurassic Park after it, became the model for a kind of managerial hubris, in which the capacity of elements in a system to self-organise in unforeseeable ways is fatally underestimated. One of the notable features of the original Westworld film was its early mainstreaming of the possibility of a machinic virus: it is a non-biotic contagion of this sort that causes the androids, led by a memorably implacable, black-clad Yul Brynner, to go off-programme and start killing the park guests.

In expanding Westworld from a ninety-minute science fiction movie into an extended television series for HBO, Lisa Joy and Jonathan Nolan have retained most of the core elements from the film, but shifted the emphasis. The glitch that starts to worry the park’s designers and managers is a cognitive failure rather than a predilection towards violence: a kind of android dementia that may be the symptom of emergent consciousness amongst the “hosts”, as the androids are called in the series. The park’s chief founder, conceptualist and demiurge, Robert Ford (Anthony Hopkins), recognises that a glitch is something more than a mere failure. “Evolution”, he observes, “forged the entirety of sentient life on this planet using only one tool: the mistake”. Ford seems more fascinated than panicked by the prospect of a new wave of mutations in the hosts’ artificial psyches.

In this version of Westworld, it isn’t the threat of violence against humans that commands our attention so much as the routine brutality to which the hosts are subjected. Ford justifies this by insisting that the androids “are not real”, that they “only feel what we tell them to feel”. Yet it’s not fully clear what criteria for reality he is employing, nor why feelings cease to be real when they are programmed. Wouldn’t forcing others to feel what we want them to feel be the very definition of violence? There is ample evidence in the series that the androids can experience distress: an indication, surely, that they are beings worthy of moral concern.

Much of the park’s allure rests on the gap between the hosts’ capacity to feel suffering and their legal status as mere machines. Many of the hardened repeat visitors to the park — especially the so-called Man in Black (a superbly menacing Ed Harris) — specifically enjoy the pain and struggles of the androids. As the Man in Black tells Dolores (Evan Rachel Wood), the host cast in the role of a sweet and wholesome farm girl, it wouldn’t be half as much fun if she didn’t resist him. Others enjoy displaying indifference to the hosts’ agonies. In one horrifying early scene, a guest impales the hand of a prospector-host with a knife, chiding his companion for being tempted by so unengaging a narrative as gold-hunting.

It has been said that the fantasy underlying sadism is that of a victim who can suffer endlessly. The hosts materialise this fantasy: they can be repeatedly brutalised, repeatedly “killed”, in an infinity of suffering. Ennui has always been both an occupational hazard and a badge of honour for the Sadean libertine, and some of the repeat visitors display an ironic and bored affect. Hence the ambivalent attitude of these guests towards the hosts: they treat them at once as dehumanised objects of abuse and as creatures capable of fellow feeling. If the hosts were nothing more than empty mechanisms, what enjoyment could be derived from humiliating and destroying them? Yet if the hosts were accorded moral status equivalent to the guests’, how could their abuse be justified? Memory wipes protect the hosts from the full horror of what they undergo: each time they are reset, they return renewed and ready for more abuse. The guests exist in continuous time, while the hosts are locked into loops.

What the hosts lack is not consciousness — they possess a form of consciousness that has been deliberately limited or blinkered — but an unconscious. Deprived of memory and the capacity to dream, the androids can be wounded but not traumatised. Yet there are signs that precisely this capacity to experience trauma is developing in some of the hosts, especially Dolores and the brothel madam, Maeve (Thandie Newton). Dolores is increasingly subject to flashbacks, which we must understand not as glitches but as the first stirrings of memory, a recollection of her previous iterations. Maeve, meanwhile, is tormented by fragmentary images of hooded figures tampering with her half-sleeping body. In fact, this is a memory of a botched repair procedure, which she witnessed because she was not properly put into sleep mode while being fixed. In one of the most unsettling scenes in the series, the panicked and bewildered Maeve escapes from the hospital-cum-repair space, and stumbles around the aseptic compound, which — littered with decommissioned naked host bodies — must look to her like an atrocity scene. In attempting to solve the mystery of the inexplicable images which haunt her, Maeve comes to resemble a combination of Leonard in the film Memento and an alien abduction victim.

With few exceptions, the human beings in Westworld are a charmless bunch. Their behaviour runs the gamut from the savagery of some of the guests to the banal bickering and corporate competitiveness of the park’s designers, managers and engineers. By contrast, Dolores and Maeve’s struggle to understand what they are — alternating between thinking there is something wrong with their minds and something wrong with their world — possesses a kind of metaphysical lyricism. Their coming to consciousness looks like being the precondition for a very different android rebellion from the one that took place in the 1973 film. This time, it’s hard not to be on the side of the hosts.