CHAPTER 7

BLOOD, GORE, AND BODY HORROR

While it’s not a requirement of the genre, very few horror movies—especially contemporary ones—skip out on mixing some form of violence into the story, whether it’s a suggested threat or an unflinching shot of a body turned inside-out. When I did a casual search for recent examples of nonviolent horrors, the results were surprising. It was immediately clear, from the breadth of films that made the cut, that people’s individual tolerances for what is considered gory and violent differ wildly. At least one list even recommended Funny Games (1997 and 2007, dir. Michael Haneke) among the ranks of horrors stripped of shock-value violence, despite its extended and gruesome torture sequences, simply because these acts happen off-screen. We generally acknowledge that scary movies will contain violence, whether mild and implied or in-your-face and extreme. And if so much of horror involves a sense of threat, as we’ve explored in previous chapters, then violence surely has a role to play in building a scare.

But what if we reversed the situation and stripped the horror away? Is violence on its own scary?

When we boil down horror movie violence to its barest essence, the scares spool from the experience of owning a human body, and the ways in which that ownership can be threatened and wrested away. Those threats of violence can come from inside the body or from an outside source.

VIOLENCE FROM WITHIN

The human body is an ideal site for horror: the body is personal, and even on a good day it’s kinda gross. Fluids and squishy tissues aside, our bodies are the tools that allow us to exist in the world and experience it for all its pleasures and pains. But despite our marvelous complexity, at the end of the day, a human is just a fragile meat tube—a fact that movies like The Human Centipede (2009, dir. Tom Six) are quick to exploit—and physical existence is easily invaded, abused, and altered, even without an evil doctor helming the action.

Violence from within the body can take many shapes, including transformation and mutation (usually triggered by some sort of infection) and possession. While the tropes associated with each of these subgenres are distinct, they all lean into the idea of something getting under the skin or into the blood to make bodily changes.

Transformation horror takes many forms, ranging from classic werewolf flicks like An American Werewolf in London (1981, dir. John Landis) to non-lycanthropic transformations like Afflicted (2013, dirs. Derek Lee and Clif Prowse), which sees a tourist becoming a vampire in the aftermath of a one-night stand, and Bite (2015, dir. Chad Archibald), in which the consequences of a simple bug bite involve developing insectoid traits, spinning cocoons, and uncontrollably producing gloopy eggs. How do these fictionalized invasions play off real horrors? While thematically transformations speak to a loss of control—something that is already scary—the infection that leads to transformation taps into common biological fears. While we can mitigate risks—we can wear condoms to help prevent STIs and use bug spray to stave off hungry insects—nothing is ever 100 percent effective. For some of us, that leftover unpreventable what-if? can gnaw at our thoughts—especially if the worst-case what-if scenarios involve peeling flesh and drooling slime.

Even typically natural (and often welcomed) bodily invasions aren’t safe from the genre. I’m talking, of course, about pregnancy horror. As a subgenre, pregnancy horror points out that the miracle of creating new life also entails an entire organism taking up residence within your body as a sort of parasite, pushing your organs around, siphoning off your blood and nutrients to grow, and causing not only changes to your physical body but also lasting alterations to your neurochemistry.

The classic example of pregnancy horror is embodied by Rosemary’s Baby (1968, dir. Roman Polanski), where the titular Rosemary (Mia Farrow) finds her pregnancy transforming her into something thin, weak, and sickly when she ought to be glowing. A ton of movies have since portrayed pregnancy as a monstrous (or demonic, or alien) infection, and it continues to be an effective form of body horror, whether you have a uterus or not. Even the “Unprotected” segment of the anthology series The Mortuary Collection (2019, dir. Ryan Spindell) shows that uterus-less bodies don’t necessarily get a free pass—a cocky frat boy who lies about condom use ends up gestating a monster and has no organs suitable to birth it. It ends in the only way it could: horribly.

On the neurochemical front, Prevenge (2016, dir. Alice Lowe) is a great take on pregnancy directly messing with your brain. In the real world, what we commonly call “pregnancy brain” has been linked to a perfect cocktail of hormonal changes and exhaustion that causes pregnant people to feel more forgetful, unfocused, and out of sorts. Research is inconclusive about the extent to which pregnancy brain is more than anecdotal—one longitudinal study, led by Diane Farrar in 2014, compared pregnant women with nonpregnant women and did find that spatial recognition memory specifically was reduced in pregnancy—but other measured changes to the brain are even more interesting. One 2016 study found that pregnancy consistently pares down gray matter, creating lasting efficiencies that likely prepare the expectant parent’s brain to form attachments to the baby; other studies have identified a phenomenon called fetal microchimerism, in which a fetus’s cells live on in its parent’s body tissues, including the brain, like little souvenirs. These changes are natural and, as far as research can tell at this point, innocuous, if not beneficial. Horror would much rather take a sinister spin. In Prevenge, mother-to-be Ruth (Alice Lowe herself) can hear the voice of her unborn daughter whispering to her constantly, goading her to murder the people involved in a climbing accident that claimed her partner’s life. The control that the whispering voice exerts is akin to a possession narrative.

In a sense, possession narratives can be thought of as a sort of spiritual infection. Instead of breaking through a body’s physical barriers, possessing entities invade by exploiting weak points in a person’s spirit, beliefs, emotions, or dreams. Instead of your cells breaking down and re-forming, or your own flesh turning against you, your experience of the world might become manipulated and your personality shoved aside. This isn’t to say that a ghostly or demonic possession movie is devoid of body horror. It’s quite the opposite: ghosts and demons aren’t used to wearing a human meat suit and have no qualms about contorting it into joint-breaking positions, à la back-bending Nell Sweetzer (Ashley Bell) in The Last Exorcism (2010, dir. Daniel Stamm), or inflicting self-mutilation, as when a very possessed Mia Allen (Jane Levy) licks an X-Acto knife in Evil Dead (2013, dir. Fede Álvarez).

The body is terrifyingly fallible. It’s all too easy for something—tangible or not—to germinate from deep within and wreak havoc. Of course, being so soft and fleshy, the human body is just as prone to violence at the hands of other people.

VIOLENCE FROM WITHOUT

Torture porn and extreme horror, especially, are among the most polarizing subgenres of horror. At one end there are the gorehounds who prefer their horror slick with blood and clotted with carnage; at the other, there are atmospheric horror fans who claim to prefer “psychological” scares. (Yes, I’m totally being pedantic with my quotation marks because, as I’ve spent this whole book explaining, all scares try to leverage our psychology to some degree, not just the slow, creeping frights.) Many horror fans seem to have a pretty varied palate when it comes to what on-screen violence they’ll tolerate (and I include myself among the bunch who will at least sample almost everything), but it’s not uncommon for moviegoers to hit a limit when it comes to explicit, intense human suffering. Even among horror industry professionals, I’ll often notice a similar pattern when the subject of torture comes up: they’ll grimace and say something like, “Oh, no, I’m not into that sort of stuff.”

Or, to echo what Trudie (played by Shenae Grimes-Beech) in Scream 4 (2011, dir. Wes Craven) pointedly observes where torture porn is concerned: “It’s not scary—it’s gross.”

I do disagree with Trudie on one point: the concept of torture, any torture, but especially the seemingly meaningless Why are you doing this?–type torment featured in torture porn, is scary. It’s horrifying. But it is also absolutely gross. That said, torture horror is hardly the only subgenre that qualifies as gross. Blood splatter, gore, and body horror are featured across horror’s myriad subgenres, from slashers to revenge films to creature features. But what makes torture porn different is its approach.

According to Jeremy Morris, there are five basic elements that qualify a horror film as belonging to the torture porn subgenre, differentiating it from other ultraviolent horror films:

  1. The torture is non-interrogational: if the film’s violence is a means of extracting information, it’s disqualified.
  2. The torture itself is the source of horror in the film, and not just added flavor.
  3. At some point, the roles of torturer and victim are reversed, and the victim is transformed (for better or for worse) by their ordeal.
  4. The victim’s transformation into a torturer justifies the violence of the film. Sometimes it is a flimsy justification. Often it’s the only one.
  5. The torture is represented with realistic visuals. The torture shouldn’t be of a magical, supernatural, or religious bent, at least not at first glance.

The realistic approach of torture porn in particular is what’s so divisive to many audiences. While we typically keep a clear delineation between real life and horror, torture porn treats blurring that line as a goal, pushing us to connect directly with real bodies and to feel the history that birthed this subgenre.

Torture porn has a specifically American flavor, mostly thanks to its emergence as a response to 9/11, as we discussed in chapter 2. Extreme horrors from other countries aren’t a tidy fit with this category, and neither are exploitation films from other decades. Films often mentioned in tandem with torture porn, like the French and Asian extremity movies, contain extreme depictions of horror, even torture, but there torture tends to be used as a tool for horror rather than being the strict source of horror itself. Take, for example, Audition (1999, dir. Takashi Miike): the final torture sequences of the film are certainly the climax of its horror (and undeniably disturbing), but Asami’s (Eihi Shiina) violent behaviors can be read as vengeance rather than as torture for torture’s sake. Through the violence, the film engages us to connect with the characters and their emotions, rather than with their bodily experiences.

We mentioned in chapter 1 that one of the brain regions that gets fired up during scare sequences is the insula. One of the insula’s many functions is interoception, or our awareness of what’s going on within our own bodies. While individual interoceptive abilities differ (research points to dancers as having particularly keen interoceptive powers), seeing a clip of a person being flayed in a torture sequence will light up your insula and make you very aware of your own skin. Narrative and visual techniques in torture films tend to divorce us from the characters and instead fill the screen with unflinching takes of bodies being injured. This forces us to connect with the body that we’re seeing and builds on the insula’s work of mapping that body onto our own.

Since I’m drawing a clear line between torture porn and other violent subgenres, let’s also take a closer look at where lines can be drawn between torture and gore. While we often lump these two elements together, the most obvious distinction is that torture is a form of violence and gore is a possible outcome of violence. In fact, they can even be mutually exclusive: torture can be enacted without any guts or blood being spilled, while splatter and gore can absolutely wreck the upholstery without acts of torture. From a visual storytelling standpoint, adding gore to torture sequences serves to engage that connection to the body even more and stir up revulsion in addition to the horror of seeing characters being subjected to intentional suffering. Gore on its own has a broader palette: it can be fun and playful shorthand in the form of a splatter across a wall, an explosive kill, or a room saturated in the aftermath of some bloodbath. Gore on its own can also be an incredibly effective tool when it’s used as part of a startle. One of the most stressful gore sequences I’ve ever experienced was from the genre-bending black comedy Spontaneous (2020, dir. Brian Duffield), in which scared students run through their school hallways, randomly exploding into a mess of death like blood-filled balloons. I think I spent the entire sequence with my hand clapped over my face in stress, even though most of the violence was only shown in fleeting glimpses. Sometimes the most powerful gore images end up being the ones that we barely see.

SIGHTS UNSEEN

If you’ve gone back to rewatch the first Saw movie (2004, dir. James Wan), you might have found yourself thinking this isn’t as violent as I remembered. Habituation, as we talked about in chapter 5, accounts for only part of what’s happening—the shock of Jigsaw’s traps is bound to have less of an impact the second time you watch them. A more famous example of this experience, though not from a torture-horror entry, is the “Stuck in the Middle with You” ear-cutting torture scene in Reservoir Dogs (1992, dir. Quentin Tarantino). Audiences misremember seeing the moment when Mr. Blonde (Michael Madsen) tortures a captured cop (Kirk Baltz) and cuts his ear off, an action that never actually appears on-camera.

You might have heard of this referred to as the Mandela Effect. The term was coined by self-identified paranormal consultant Fiona Broome to describe the phenomenon wherein people misremember Nelson Mandela dying in prison during the 1980s—to the point where they claim to recall seeing his televised funeral. (Mandela was released in 1990, served as president of South Africa from 1994 until 1999, and died in his home, surrounded by family, in 2013.) Wildly, this collective misremembering surfaced around 2010, when Nelson Mandela was decidedly still alive. The term was solidified in public consciousness about five years later, when people online misremembered the children’s series The Berenstain Bears as “The Berenstein Bears” but could not find any evidence for this alternative spelling. Broome explains the effect using theories about parallel realities; others have blamed mischief and manipulation by time travelers, but neither of those concepts is something we can test scientifically. It is much more reasonable to propose a neuroscientific basis for the effect.

In the case of torture sequences that cut away from the action, as in our Reservoir Dogs example, a similar phenomenon occurs: we remember details of violence that were never actually visible. So, what’s actually going on in the brain when we fill in these gaps? The hippocampus seems to be the crucial component in creating this false memory, integrating sensory modalities into a “remembered” experience. The hippocampus specializes in consolidating long-term memories, but its main concern is what’s known as episodic memory, or our life experiences and personal memories. It’s also the part of the brain that can sift through your mental filing cabinet to pull up really old memories and integrate them into your present experience or imaginings of possible futures. So the hippocampus calls up a memory of the last time we watched Reservoir Dogs; other areas of the brain, such as the ventromedial prefrontal cortex (vmPFC) and the dorsomedial prefrontal cortex (dmPFC), help to reconstruct the memory using present context clues and existing associations—an ear is cut off! That would make a bloody mess!—and once we’re done remembering, the hippocampus reconsolidates the memory and puts it back into its cognitive file folder with the new info included. It’s less like photocopying a photocopy—which would create a faded result over time—and more like making revisions every time the memory file is pulled up.

In a radio interview, neuroscientist Steve Ramirez explains one interpretation for why memories are so malleable: “the same machinery, for example, the hippocampus, that enables us to recall the past, is also the same machinery that enables us to reconstruct the past. It also happens to be largely the same machinery that helps us imagine ourselves in the future.” To a degree, every consciously recalled memory can be considered a form of false-ish memory because the act of remembering activates pathways in the brain to reconstruct that memory experience. As we mentioned in chapter 5, memory consolidation can be disrupted really easily by new information—the same is true with memory reconsolidation.

There are a few ways by which these false memories might be built. One is that the template we’re using to rebuild the memory might be faulty, producing what are known as schema-driven errors. In psychological terms, schemas are conceptual knowledge templates that help organize information. If I asked you to imagine a chair, for example, your basic schema for “chair” would probably be something like “a structure with four legs, a seat, and a back,” like a drawing of a basic wooden chair. This schema makes it easier for us to look at different pieces of furniture and quickly sort them into chair and not-chair.

The trouble with schemas comes when our memories distort new information to fit it into an existing schema, in a process referred to as “effort after meaning.” In effect, it’s easier for us to disregard and/or transform details that don’t make sense or seem unfamiliar. In films, for example, selective attention might be a factor in how the memory of a scene is encoded: if what we’re seeing on-screen isn’t interesting, we may instead attend to what seems to be unfolding beyond the camera lens’s field of view and integrate that inferred information instead. And then, of course, our social, context-based understanding of what’s going on in the scene helps shape how that memory is recalled and reconstructed.

So, when we remember the torture scene from Reservoir Dogs, we remember the context of Mr. Blonde dancing around with his straight razor, we remember the close-ups of the cop tied to the chair, his mouth duct-taped and blood dripping down his face, we remember the sounds of Mr. Blonde saying “hold still” while the cop grunts in pain, and we remember seeing the fleshy ear in Mr. Blonde’s hand in the aftermath. Although we don’t immediately get a glimpse of the gore that would be where the cop’s ear used to be, our reconstruction of the memory of the scene fills in the logical missing step—the moment when the cop’s ear was cut off while Mr. Blonde was straddling him on that chair—and readily conjures the missing visual for us. And if we hear from other people that they remember that image too, it serves as a reinforcement for the visual.

This lends support to the idea that whatever we imagine is going on in a narrative between cuts will be scarier than what we ultimately are shown. From context clues and our own past memories, our brains are able to construct and reconstruct events in ways that can be way more intense than what we see on-screen. And when it comes to torture horror, gore, and body horror, that reconstruction can feel very personal because it involves interpreting through the lens of our own experiences of having a body. Filmmakers recognize the power of that quick cut away just before a blade slices skin or just before the worst torture is about to take place on-screen. It’s a clever way to exploit our brains into conjuring images possibly more intense than what can be accomplished with visual effects.

If you’ve come this far and thought to yourself, Wait—my brain has never imagined any missing visuals for me. This definitely sounds fake, you’re not alone. The ability to visualize exists on a spectrum, and at the extreme end of that spectrum is aphantasia, or a complete inability to conjure mental images. It’s not an easy phenomenon to measure, but existing research estimates that anywhere from 2 to 5 percent of people might experience this absence of imagistic thought.

One 2021 study sought to explore the role of mental imagery in conjuring fear by placing electrodes onto the skin of participants, half of whom experienced aphantasia. The researchers sat them in a dark room, and then read them scary scenarios, like stories of the listener falling off a cliff, or being inside a plane that is crashing. The electrodes measured tiny changes in sweatiness associated with arousal, which changes your skin’s electrical conductivity. While it’s not a direct measure of fear (nor can it pinpoint an emotion as specific as fear), it’s a good indicator that some sort of intense physiological reaction is happening. Participants who self-reported that they experience aphantasia showed no change in skin conductance while listening to the stories, while participants who did not experience aphantasia demonstrated spikes of arousal as they pictured themselves in the situations being described.

When the experiment was repeated using upsetting images instead of just stories, everyone showed the same amount of freaked-out skin conductivity, whether they experienced aphantasia or not. This result suggested that the lack of fear response in aphantasic participants when no images were presented boiled down to the power of visualization in heightening fear.

It’s possible that people with aphantasia are the ultimate audience for the oft-cited “show, don’t tell” rule of storytelling. Cutting away from gore won’t, well, cut it: without actually seeing the scare on-screen, the aphantasic moviegoer just doesn’t have enough information to experience the emotional wallop that the filmmakers are trying to conjure through implied violence. People with aphantasia can get spooked like everyone else; they just need to be presented with concrete visuals to get there.

When critics refer to horror as a “visual feast,” they usually seem to mean its depictions of violence. While scares can be effective without any gore whatsoever, there can be something so appealing about the splashy violence and grotesque kills that only horror movies can achieve.

THE BLOODIER THE BETTER?

Blood Feast (1963, dir. Herschell Gordon Lewis) is often considered the first splatter film—that is, the first horror film to feature graphic on-screen gore. Merely three years earlier, Psycho (1960, dir. Alfred Hitchcock) had gone no further than showing disembodied blood burbling down a shower drain in sterile black-and-white; Blood Feast feels like a response to Psycho, delivering buckets of blood, tongues, and body parts in lurid Technicolor. It might seem schlocky and silly by today’s standards, but it’s undeniably bloody.

In real life, fear of blood, which is often rolled into the fear triad known as blood-injury-injection (BII) phobia, is common. It’s estimated that up to 4 percent of people experience BII phobia—a high enough percentage that you’ve probably met someone at some point in your life who cannot handle the sight of blood, real or fake. While it’s a common fear, it has an unusual presentation. All phobias trigger a fear response—that’s a big part of what makes a phobia, well, a phobia—but most fears don’t cause people to faint. Fainting at the sight of blood is caused by a very specific physiological response called a vasovagal reaction. The vagus nerve controls involuntary “rest and digest”–type functions, like lowering your heart rate and telling your body to unleash gastric juices into your food-filled stomach. When something like a sudden scare triggers it to overreact, it might overshoot and cause heart rate and blood pressure to drop suddenly—basically the opposite of your typical scare responses, which usually send heart rates soaring. The sudden drops associated with the vasovagal reaction result in wooziness at best, and a dead faint at worst. Studies have proposed that blood fears might trigger a vasovagal reaction by stimulating fear and disgust at the same time. Why blood, gory injuries, and hypodermic needles seem to be the only fears that consistently win the disgust-fear vasovagal lottery, and not other gross-out fears, remains a mystery.

As far as horror movie aesthetics are concerned, though, blood is a staple of the genre. It’s hard to think of horror, especially the messier subgenres like splatter, revenge, and body horror, without liberal use of jets, drips, and splashes of blood. Too much blood, of course, will push a film into an R rating, or even the dreaded NC-17. Scorsese famously desaturated the blood to a brownish color in Taxi Driver (1976) when the MPAA was threatening an X rating. (He earned an R for his efforts.) As we’ve discussed in previous chapters, countless horror movies have also snipped away seconds from their bloodiest sequences to evade restrictive ratings.

Theatrical blood has seen a number of different formulas over time, from chocolate syrup to more complex concoctions containing dyes, syrups, and sometimes dangerously toxic chemicals (like Kodak Photo-Flo, a concentrated wetting agent for preventing water spots while processing photographic film, which was a key ingredient in the legendary fake blood recipe developed by Dick Smith). In terms of color, opacity, and viscosity, watch enough movies and it’s obvious that some fake blood recipes look more realistic than others when captured on camera. This is partially due to changes in aesthetic trends over time, and partially due to which bloods were popular on the market. More recent bloods tend to be darker and less like the vibrant arterial bloods we’ve seen in the past. One quality is pretty consistent across fake bloods, though: most of them are sticky, cause stains, and are generally a pain to work with take after take.


SCARE SPOTLIGHT: DEEP RED (1975, DIR. DARIO ARGENTO)


You can peep one of my favorite forms of movie blood in Dario Argento’s aesthetically violent giallo films of the 1970s, like Suspiria (1977) and Deep Red. You can also see it in George Romero’s Dawn of the Dead (1978), providing a nice, garish contrast to the grayish zombie makeup.

Consider the death scene of Helga (played by Macha Méril), near the beginning of Argento’s Deep Red: Helga’s just been struck from behind with a meat cleaver and has had her head smashed through a window. Marcus Daly (played by David Hemmings), who is on the street below, sees it happen and rushes up to the apartment to help her. As he pulls her back from the window, the blood on her throat looks dry and waxy, frozen into perfect drips around the piece of glass that’s stuck into her neck. When she’s laid out dead on the floor, more blood pours thickly from her mouth and pools on the floor like melted red crayon.

It’s not one of the most realistic bloods—in fact, we could argue it’s one of the least realistic. It’s a garishly Technicolor red-orange, and it looks suspiciously waxy or nail polish–like. The blood in question was known as Nextel Simulated Blood, developed by Phil Palmquist and Len Olson at 3M. The formula won a Technical Achievement Award at the 45th Academy Awards in 1973 and enjoyed a brief burst of popularity in the 1970s. What made it so innovative? It wouldn’t stain skin, clothes, or sets.

It’s a fun bit of science. You see, Nextel Simulated Blood didn’t stain skin or clothing because no liquid dye or pigment was involved in the solution. The red color, and probably its plasticky appearance, came from microspheres—basically teeny-tiny red plastic spheres suspended in a colorless liquid thickener. Other recipes for blood include oily compounds that readily stain pretty much everything. So instead of soaking into surfaces like a stain, the Nextel blood just sits on top of the material that it’s bloodying up and can be easily wiped away.

Unfortunately, while it was an attractive concept, and was popular for theatre productions and live shows (KISS used it in concert, and bassist Gene Simmons was once photographed pretending to drink a bottle of the stuff), Nextel Simulated Blood didn’t exactly perform well on film sets. Tom Savini, who worked with Nextel Simulated Blood on Dawn of the Dead, reportedly wasn’t a fan. As he describes in his book Bizarro: A Learn-by-Example Guide to the Art & Technique of Special Make-up Effects (1983), “At the time there wasn’t a really great blood formula floating around. The blood I used in Dawn was 3M Brand Stage Blood which sometimes photographed terrifically—really deep, red blood—and other times looked like a tempera paint.” He didn’t recommend it for film productions because of how unpredictably it was rendered by most film stocks.

It isn’t all that surprising that the Nextel blood didn’t always behave as intended on camera. The structure of the near-microscopic spheres in suspension likely scatters light differently than the dissolved dyes of other fake bloods, giving the blood its unexpectedly crayon-like coloring. I don’t have a patent for Nextel blood that I can read over, but it’s possible that this inconsistent coloring comes from an effect known as structural color. For most objects, our perception of color is dictated by the wavelengths of light that a material absorbs or reflects. We perceive the pigment in red food coloring as red because the pigment’s molecular structure absorbs the light wavelengths that are not associated with our perception of the color red but reflects the ones that are; these reflected wavelengths then enter our eyes and excite the cones on our retinas that let us perceive that color.

But sometimes the structural arrangement of a material, at the microscale and especially the nanoscale, can actually affect how we perceive its color. Take the iridescent blue morpho butterfly, for example: if we zoom in on its wings until we’re viewing them at the nanoscale, we’ll see that the butterfly’s wings aren’t even blue at all—they’re brown. The tiny brown scales that make up the blue morpho’s wings are arranged in layered rows of structures that behave like a diffraction grating, scattering light and causing interference in the way the light is reflected. The blue wavelengths experience constructive interference, amplifying the vibrant blue that we perceive, whereas other wavelengths experience destructive interference. It’s possible that the extremely small size and arrangement of the microspheres used in the Nextel blood created a similar structural color effect. So, just like how the color of the blue morpho’s wing changes slightly as you tilt it, shooting Nextel Simulated Blood from different angles might produce small variations in color.

In any case, by the time he was working on Friday the 13th in 1980, Tom Savini was using a much more realistic-looking blood recipe, and Nextel Simulated Blood had more or less retired from Hollywood.


Is bloodier better? It’s not a question that we can really answer. Blood is a tool—a bucket of it can ramp up the schlock factor and fun in a movie like Evil Dead 2 (1987, dir. Sam Raimi); likewise, a bucket of blood can bring tension to the breaking point when it’s dumped onto a traumatized and telekinetic prom queen’s head in Carrie (1976, dir. Brian De Palma). Whether a wound seeps arterial red or “realistic” brownish blood, milky white fluid or, perhaps most frighteningly, nothing at all, blood can be used to conjure feelings of distress and disgust without too much effort (although I always imagine the bloodiest sets are a nightmare for sticky-coated actors and cleanup crew).

MORE THAN MEETS THE EYE

One of my first jobs out of university involved giving live organ dissections and demonstrations for visitors to a science center. People would gather around a table while I guided their gloved fingers through valves into the ventricles of pig hearts or pointed out different regions in a slice of a sheep’s brain. My favorite demonstration, and the toughest sell to everyone else, was the cow eye dissection. Hearts, lungs, brains, or stomachs? Not a problem for most people. But pull out an eyeball and people start cringing, gagging, and backing away.

You might have noticed that the same squeamishness toward eyeballs translates to the screen. I know I see it. We all have different tolerances for gore and body horror, but seeing eyeballs pierced, punctured, or plucked out of their sockets tends to be where a lot of people find themselves squeezing their own eyes shut. And because eyes are vulnerably squishable vision-orbs filled with gelatinous goo (technically called vitreous humor), eye horror makes constant appearances in the genre. Even when it’s not featured in the movie itself, eyes have become synonymous enough with horror to grace all sorts of horror film posters (and book covers!).

When it comes to the people who can’t handle seeing eye horror, I have a theory that the discomfort stems from the fact that eyes are intrinsically more personal, familiar organs. You don’t see your liver in the mirror every morning while you’re brushing your teeth. Probably you don’t take selfies with your small intestine hanging out front and center. Our guts and gore live on the inside, and while it’s gross when we see our insides exposed to open air, we don’t have the same intimate attachment to our guts as we do to our eyes.

The external nature of eyeballs makes it easier to imagine having needles pushed into them while watching the same thing happen to Shigeharu (Ryo Ishibashi) in Audition, or having thumbs pressing them deep into your sockets like in 28 Days Later (2002, dir. Danny Boyle), or even having them sliced with a straight razor like in Un chien andalou (1929, dir. Luis Buñuel) or Would You Rather? (2012, dir. David Guy Levy). I know I squirmed and felt my own eyes water the first time I watched Alex’s (Malcolm McDowell) eyes being propped open for aversion therapy in A Clockwork Orange (1971, dir. Stanley Kubrick), even if his character’s eyes technically come out unscathed. And then I squirmed some more when I later read that, despite safety precautions, Malcolm McDowell’s cornea got sliced on one of those eye clamps anyway.

It’s not hard to extrapolate from there to a sympathetic reaction when we see an eyeball speared or slashed. Seeing eyeballs get ruined so easily in movies is a painful reminder of how fragile and unprotected eyes are. Even I have a bizarrely specific preoccupation with the idea that I’ll somehow accidentally slice my cornea open with a playing card whenever I’m playing a card game (even though this has never happened). While there isn’t a wealth of research specifically on eyeball-related fears, studies have shown that eye mutilation occupies a similar fear dimension to the blood, injection, and injury fears discussed earlier in terms of anxieties around loss of body integrity and bodily invasion. There’s something about these fears specifically that carries an especially large anxiety spike during anticipation (that good ol’ amygdala-fueled fight-or-flight response), which switches quickly to insula-led disgust and interoception when we’re actually confronted with the visual imagery. Unlike other types of fears, blood-injection-injury fears (including eye injury fears) feel personal.

We love the spectacle that violence adds to horror more than the acts of violence themselves. We love the idea of a creative kill—limbs bent in ways that they shouldn’t, and bodies taken apart in unusual ways. We love the visuals of blood raining down to soak our Final Girl from head to toe—or of a chopped limb sending an arc of blood splattering against a wall—but we love it because it’s part of the horror experience, not because we love violence itself. Gore is gross; the threat of violence is stressful and scary, even when it’s divorced from the context of horror. When violence becomes a part of horror, it becomes an emotional amplifier that connects us personally to the characters and to the action, because we all know the experience of owning a body. We may not always empathize with the violence we see on-screen, but we definitely recognize it.


IN CONVERSATION WITH JOHN FAWCETT



John Fawcett is a director and showrunner whose horror credits include the cult werewolf film Ginger Snaps (2000) and the series Orphan Black (2013–2017).

As a filmmaker, do you feel particularly drawn to horror compared to other genres?

I think horror for me was the real reason probably I became a filmmaker in the first place. My way of thinking about horror and my way into horror has always been from a very visceral, emotional point of view. It’s never been particularly science-y; it’s never been really, even really clearly thought through. It’s only, I’ve only reacted to things in a really raw emotional kind of way.

I remember I was always very afraid as a kid. I was very fearful. I just remember being little, being like age five through, I don’t know, like fifteen, and being really afraid of, like, the dark, of everything. It took a long time to kind of, like, develop myself away from that. It’s funny, because I think that I started to look at horror almost as my own therapy. Because I was scared of it! I remember seeing Black Christmas [1974, dir. Bob Clark] on the television and it was really scary. And my parents shut it off before it got too intense, but it was too late! It had fully traumatized me.

I saw Halloween [1978, dir. John Carpenter] at a young age. I saw American Werewolf in London [1981, dir. John Landis] fairly young. I remember seeing parts of The Exorcist [1973, dir. William Friedkin] and The Omen [1976, dir. Richard Donner] really young and because I was so afraid, I was drawn to it. And it became this kind of thing where I would be … I would try and test myself and see how much I could stand and then back away again. It’s funny because I started to get more understanding of how films were made and I used that as a way to conquer my fear of horror movies. And that was like, say, imagining what it would be like on set to make a movie.

Like taking yourself outside of the action?

So, imagine Halloween. Halloween was a movie that really, really scared me. I saw it in junior high. There’s a lot of frightening images there, like Michael Myers with the mask and stabbing through someone into the door so they’re up off the ground … and I was like, okay zoom back and now I’m inside a set, I might not even be in a real house. I’ve got lights at the windows, I’ve got a guy in a mask, I’ve got a microphone overhead, I’ve got people behind the camera, and, you know, if you’ve been on a set, it ain’t scary. It’s just not scary. So there was a part of me that conquered my fear of horror in that way, but I was always drawn to what frightened me in the first place as a kid. And I think that’s where this “what scares you” kind of thing came from, really. As an understanding of horror films and what scared me in the first place.

And that came from really base, primal stuff. Like being a kid and being afraid.

How does that translate to how you see horror now?

Well, I think that part of what makes horror scary is when it takes real people, or what we perceive as real characters, and puts them in outlandish, scary situations. Which has been a sort of working theory of mine with regards to genre, not just horror, for a long time. That the way to make these things work, to make people go for the ride, to identify, you have to present them with characters that feel real. That their emotional journeys feel real. They don’t feel like actors acting a script. There’s always some kind of trope to a character, but there are always ways to circumvent the clichés. I think as long as we get presented with something that feels authentic, we buy into stories. Like stories of werewolves or clones or aliens, or whatever it is.

In the ’70s, the style was very real. They weren’t trying to do a kind of heightened comedic style. I don’t know what your favorite movies of the ’70s were. But there’s a lot. A lot of my favorite horror movies came out of the ’70s. I mentioned a few, but there was also: Don’t Look Now [1973, dir. Nicolas Roeg], Alien [1979, dir. Ridley Scott], Black Christmas, Dawn of the Dead [1978, dir. George A. Romero] … Jaws [1975, dir. Steven Spielberg] was kind of one of the most successful horror films of all time. It’s all very real approaches to the characters. So you buy in. As soon as you feel like a character is too stylized or not real, suddenly it’s not scary.

You mention a lot of movies that are considered horror classics, which are sort of responsible for what audiences expect today when they go to see a horror movie. How much of your approach as a director involves navigating these audience expectations?

I think if you work in the genre, you become very aware of the tropes. And certainly as the filmmaker, you become very aware of what works and what doesn’t work. And so some of this stuff works because it’s easy. It’s low-hanging fruit. And that doesn’t mean don’t do it. I think that … I love a good jump scare. I’ll do it. There’s a bunch of really simple little techniques that freak people out in the movies. Revealing someone standing behind them when you didn’t think someone was back there, and leaving them out of focus. Having a suspense sequence leading to an area where there’s a big empty space in the screen. And then not using that big empty space.

And the thing is, I like a lot of those old horror techniques.

And audiences do too!

I think they do! Listen: I think if you’re presenting material that feels original—you still have to tell a story, you still have to cinematically tell a story. And there’s not that many different ways to do it, to go about telling it. What makes something unique is, yes, a visual language can be … it doesn’t have to be boring. But what makes it unique is the characters and the subject matter. That’s what sets it apart from its tropes.

There are lots of these tropes that, if you present them in a way that feels fresh and original, in subject matter that is in a direction that people haven’t seen before, then it’s … I think that dipping your toe in some of these things is fun. It’s like, when I was making Ginger Snaps, you become really aware there’s a certain mythology behind the movie werewolf. I didn’t actually want to make a werewolf movie at all. I wanted to make a metamorphosis movie. And so I wanted to make it about transformation, physical transformation—I wanted to make body horror and have it be comedic.

And the more I thought about it, the more I thought: this is the reason to do this. It’s a genre that can be reinvented.

The way I looked at it was: here are the ten things that define a werewolf movie. And I would use them because people expect it! But I would use their expectations against them. I would reinvent the genre by taking their expectations and twisting them and bending them and subverting them. And sometimes to use them just for fun because it’s fun. You’re making a werewolf movie, let’s have fun with it!

How do you feel about revisiting Ginger Snaps twenty years later?

It’s really interesting to take that movie and update it. I think that it actually survived reasonably well over the last twenty years, but it’s interesting to take it and go: let’s put it in a contemporary setting now. Let’s modernize it. Let’s have it transcend people’s expectations again, you know? And do something totally different with it.

That’s been my theory of making movies and making stuff from the beginning. Just look for all of the tropes and try to avoid them and look for the things that you can take from the genre, that you can play with in the genre and make it unique and try and kind of either genre-mash or use unexpected elements to tell your story.

Because everything’s been done! It’s hard in horror. It’s hard to make anything original these days.

What do you find scary?

I don’t get frightened by fantasy creatures. I don’t find that frightening. What I find frightening is things that actually could potentially exist or could affect me or could really attack me somehow in a dark alleyway. Or get in my bloodstream.

There are lots of different signposts of Canadian horror and body horror is definitely one of them. David Cronenberg was a big influence on Ginger Snaps and on just what I think is scary. I think things getting into your body or into your bloodstream is one of the more frightening genres in horror.

Audition really scared me. It was really, really creepy. I like American Mary [2012, dirs. Jen Soska and Sylvia Soska]. I liked, obviously The Fly [1986, dir. David Cronenberg] and Dead Ringers [1988, dir. David Cronenberg] were big influences on Ginger Snaps. Alien was another obviously big body horror movie. And The Thing [1982, dir. John Carpenter], which is probably one of my favorite body horror movies of all time. It’s an amazing movie.

You do what feels frightening, right? You do what feels scary and that comes from an emotional place, and everyone is different. I think anything, if you want to make horror movies, body horror movies, thematically is a good starting spot for me. And another thematic starting piece for me always in horror is family. Those are the things that, to me, are big aspects that I like in horror. I like psychological horror. And, you know, if I think about psychological horror, I think about movies like, well, potentially starting with, say, Repulsion [1965, dir. Roman Polanski].

Do you find yourself drawn to horror more in your work? Or are you partial to any other genres?

I like humor, but often the two things go together well. More often than not I see things through a horror lens. Like not a super-serious horror lens, but through some kind of comedic horror lens.

Because for horror you have to be with a character. You have to be utterly with a character to be afraid. You can’t be omniscient. If I’m going to make an audience fearful walking down a hallway, probably the most effective way to do that is to be with the person. To be right behind them. And so really, you could make a whole horror sequence out of two angles. Like if someone is going into, say, an old barn. You could have one angle following behind them, so we see what they see, and one angle on their face to see how they feel. And you could probably construct the whole scene with those two angles and a flashlight.

It’s been a common theme in my conversations that horror involves a lot of emotion, emotional intelligence, and empathy.

Well, I think that that makes sense. That is the kind of thing that makes a good story: there’s two things for me. One is the believability of the characters, that you’re presenting characters that the audience cares about, that they feel real. So, whatever the emotional story is, you need the audience to be able to get inside those characters. That’s why I personally have a lot of issues with very stylized characters, because I can’t … I feel like I’m being held at arm’s length. I can watch it, but I can’t emotionally engage with it because it feels like I’m just supposed to watch. There’s something cold, there’s something that’s keeping me outside of it, and so it’s very important with horror to be able to engage, to viscerally connect with characters, and that comes from emotion, that comes from empathy, but the other thing, for me, in terms of, you know, getting an audience to believe what is happening, is the way the actors are directed, the actors are in the scene and there is a kind of emotional landscape for the actors that is believable.

For example, the final scene in Ginger Snaps, after Ginger is lying dead on the floor, there is a very emotional scene where Brigitte’s tears are streaming down her face. And this was a moment where I, making this movie on a shoestring budget, was terrified that people weren’t going to believe for two seconds any of the effects that we were doing. It was just a guy in a rubber suit, right? And this thing that was lying on the ground was a big bunch of fur and … it’s utterly fake. But what sells it is the totally authentic emotion on Emily [Perkins]’s face. And these tears. And you go oh my god, her sister is dead.

So, that makes me believe all of the makeup nonsense and absurdity of the concept that is my sister has turned into a werewolf—it’s how much I believe the characters.

The second thing is also … and this is just my thing. It’s not everyone’s thing. My thing is: don’t take yourself too seriously. I find that when movies are very earnest and there isn’t a stitch of comedy anywhere, you can’t laugh at anything, I find that I get disengaged a little bit. I find that when I can laugh, when I’m allowed to, I find myself much more invested in the characters and I find myself invested more in what’s happening to them. I’m not saying make the horror elements funny, but I think allowing some humor through the eyes of the characters allows a larger bandwidth to accept the absurdity of the concept.

Because a lot of horror is kind of absurd.

Do you think it’s possible to make something too novel that might alienate audiences?

I think as long as you’re telling me a story about a character that I care about and they’re real, and that story line is a character journey that is real—that’s how stories are told and that’s what generally works.

I think you can tell a weird story, a weird horror movie, as long as I believe in it and I give a shit about the characters. I have to believe in it. You have to make me believe in your content. So. Tell me some weird science. Tell me, make me believe in the mythology somehow, whatever it is that you’re doing. Like, I haven’t seen Teeth [2007, dir. Mitchell Lichtenstein]. I’m not sure how they’re going to make me believe that a vagina has teeth.