Just when the studios get comfortable, it seems, new technologies upset the status quo. Hollywood is always in a state of transition. In the 1970s and 1980s home video took Hollywood by surprise, and the industry spent the better part of a decade adjusting to the changes wrought by the new medium. But in the end, just like many previous technological revolutions, home video opened up new aesthetic possibilities for filmmakers, gave viewers more options, and increased Hollywood’s profits. By 1986 video rentals surpassed theatrical box office returns, and by the end of the 1980s video rental stores in the United States outnumbered movie theaters four to one. Sony’s early Betamax advertisement, “Watch whatever whenever,” announced the new motto for the industry, although it would take Hollywood some time to adjust to the new reality.
Hollywood should not have been surprised by the VCR. Bringing moving images to the home and putting the power of recording in the hands of consumers was a long time coming. Edison (it always goes back to Edison) developed a projecting home Kinetoscope in the 1890s but only sold a handful. The French company Pathé began selling an amateur 9.5mm movie camera, the “baby,” in 1922. In the 1940s and 1950s, engineers developed electronic videotape processes, one of which was funded by television crooner Bing Crosby’s production company. Crosby hoped to develop videotape so that he wouldn’t have to do two identical live television performances, one for the East Coast and one for the West Coast.
By the 1950s and 1960s commercial videotape machines were being used by most television companies, and video cameras began to show up on film sets in 1960 when comic actor-director Jerry Lewis started using a video camera next to his film camera in order to have an immediate look at actors’ performances, a device which came to be called a “video assist.” In the early 1970s, MCA/Universal collaborated with a consortium of technology companies to develop a laser disc playback system called Discovision, which eventually hit the market in 1978, a few years after the first VCRs.
Although Hollywood experimented with videotape and laser discs, Sony released the first consumer VCR, the Betamax, in 1975. Technology manufacturer JVC released a competing format, VHS (Video Home System), shortly thereafter. Betamax had a higher-quality image, but VHS won the format war. VHS tapes reached the goal of holding an entire feature film before the shorter Betamax tapes, and JVC licensed the VHS format widely to technology and content companies, while Sony exerted tight control over the proprietary Betamax format.
The proprietary-versus-open-format wars have been repeated many times, recently with Apple’s (mostly) closed iOS mobile operating system and Google’s (more) open Android operating system. Open systems have tended to win out over closed systems in the past, but, despite Karl Marx’s adage, history does not always repeat itself. It remains to be seen which mobile platform will triumph, or if they can coexist.
VHS may have prevailed in the format war, but Sony, as the first manufacturer to release a home video machine, had the privilege of being sued by the Hollywood studios. More specifically, MCA’s Lew Wasserman, who controlled Universal studios and MCA’s large television production arm, filed a lawsuit against Sony. Wasserman asked every Hollywood studio to join the suit, but only Disney took him up on the offer. Not only did Disney have a large investment in television production, but the studio’s executives also worried about the impact VCRs might have on its unique business model. Disney owns a library of classic children’s films and television shows, which the studio releases anew for each generation. If consumers started taping Disney shows and owning Disney movies, how would the company survive?
The case went to the Supreme Court, where the justices took the unusual step of hearing oral arguments twice. After the arguments, we now know from internal correspondence, the majority sided with the studios, favoring the argument that consumers could not tape television shows without the permission of the producers. But during deliberations, Justice Sandra Day O’Connor changed her mind and tipped the scales in favor of Sony. She saved the VCR’s record button and fast-forward button (which allowed viewers to skip commercials) by a single vote.
This truly groundbreaking decision found “time shifting,” recording a television show to watch it at another time, to be fair use (i.e., it did not infringe the rights of the copyright holder). Even making a temporary copy of an entire show, the decision stated, was permitted under copyright law. All the evidence in the case suggested that people with VCRs watched more television. The court also listed examples of socially valuable uses of home video. VCR owners might record a presidential speech or simultaneous news programs for later viewing. Because VCRs had valuable legal uses, the court decided, Sony could not be held liable when users found illegal uses for the machines. The Sony Betamax decision changed the environment for home media innovation, and it made possible the creation of technologies from the iPhone to YouTube.
But Lew Wasserman did not give up easily. With support from MCA/Universal and other studios, the Motion Picture Association of America (MPAA) began lobbying Congress for an alternative solution to the problems VCRs created for Hollywood. The MPAA asked for movie and television producers to receive royalties from the sale of VCRs and blank tapes. Such a royalty would have acknowledged that unwanted copying took place, but at least the studios would benefit from it indirectly. Congress said no.
The MPAA also asked that the copyright first-sale doctrine not apply to videotapes. The first-sale doctrine permits consumers to rent, sell, or give away copies of media that they own. Used bookstores, for example, sell books without the permission of the publishers. The first-sale doctrine posed a problem for video sales, because once rental stores bought tapes from the studios, they were free to rent them repeatedly, collecting all of the rental fees. And the studios could not realize the full profits from films that found success in the video market. Congress again said no and refused to exempt videotapes from first-sale protections.
Hollywood responded first by trying to create what the industry calls a “sell through” market, cutting out the rental stores and selling tapes directly to consumers. Films with cult followings like the Star Trek franchise found fans who wanted to purchase movies to watch multiple times. And the occasional blockbuster, especially if it appealed to children, sold well when it was priced for families. Steven Spielberg’s E.T. the Extra-Terrestrial (1982) became one of the biggest sell-through success stories. With cassettes priced at $25, one in five US households owned a copy of E.T.
Most tapes, however, were priced at $65 or more, which discouraged personal ownership and required small video stores to be selective in the videos that they purchased. Even the biggest rental chain, Blockbuster Video, could not afford enough copies to satisfy the initial demand. In 1997, Sumner Redstone, CEO of Viacom, the media conglomerate that owned both Paramount and Blockbuster, brokered a deal for the studios to provide rental chains with licensed (as opposed to owned) copies. The stores would get as many copies as they needed, and the rental chain and movie distributor would share the profits from the rentals. Finally, Hollywood had found a system for profiting from successful video rentals.
To maximize sales from theatrical and home viewing, the studios further developed their system of “release windows.” They first released films to movie theaters, then for television broadcast, and finally for home video rental. Studios have continued to experiment with these windows in subsequent decades, with some movies even being released “day and date” (i.e., simultaneously in theaters and on home video). Readjusting or giving up release windows has been one of the most difficult transitions for Hollywood studios in the digital marketplace.
Home video was more than just a new distribution outlet; it changed film production too. Films out of place in the theatrical blockbuster culture found success on home video. Music films, for example, did well on video, because viewers watched them more than once. Flashdance (1983), Michael Jackson’s Thriller (1983), Purple Rain (1984), and Footloose (1984) were all part of a music-film cycle that took advantage of the video market. As consumers calculated whether to go to the theater or wait for the video, blockbusters got bigger, while more intimate films did well on video. In 1989 the blockbusters Batman, Indiana Jones and the Last Crusade, and Lethal Weapon 2 were the top-grossing theatrical releases. But the light caper film A Fish Called Wanda was the top rental that year.
It would be difficult to quantify all of the different ways that home video infused Hollywood with cash and creative outlets. Studios and independent producers began funding films through advance sales of domestic and international video rights, and in 1986 Platoon became the first Oscar-winning film funded with the presale of video rights.
Ironically, one of the companies that led the resistance to home video, Disney, may have benefited most. Disney released a few of its smaller films to home video early on, but it kept its classics like Snow White (1937), Fantasia (1940), and Dumbo (1941) locked in the studio’s vaults. The 1983 theatrical rerelease of Pinocchio (1940) set off a clash at the top levels of Disney’s management. Newly appointed CEO Michael Eisner wanted to release Pinocchio and other Disney films on video; the new head of Disney’s movie studio, Jeffrey Katzenberg, worried about the effect on future theatrical rereleases. Eisner won, and twenty-one classic Disney films appeared in video stores. The profits rolled in and funded the next wave of Disney classics, including The Little Mermaid (1989), Beauty and the Beast (1991), Aladdin (1992), and The Lion King (1994). And these new films, in turn, helped Disney acquire the television networks ABC and ESPN and grow into a major international media company.
While Hollywood studios learned to thrive in the home video economy, the fantasy of the playback-only disc remained alive. MCA released Discovision shortly after the introduction of the VCR, and laserdiscs remained a high-quality niche market for cinephiles and collectors. But the digital video disc (aka digital versatile disc, or DVD) offered a new opportunity. The president of Warner Bros. home video, Warren Lieberfarb, saw the potential of the DVD early on, and he convinced the other studios to adopt the new format.
With DVDs, Hollywood attempted to avoid all of the pitfalls it had encountered when VHS tapes were introduced. Introduced in the mid-1990s, DVDs were at first available for playback only, although eventually recordable DVDs became widely available as well. DVDs carried high-quality video and could hold large amounts of compressed video, allowing studios to add extra features that enticed viewers to buy and rent them. New markets arose for unrated versions of films, director’s cuts, and DVD extras about the making of feature films or additional scenes. DVDs also allowed multiple audio tracks for dubbing and commentary. DVDs were priced to be sold directly to consumers, encouraging the building of home libraries and consolidating distribution in the studios.
Finally, DVDs contained copy-protection software that prevented easy duplication. VHS tapes used a form of copy protection as well, but in 1998 it became illegal to bypass the copy protection on DVDs. Although ineffective as a tool for preventing piracy, copy protection allowed Hollywood studios greater control over which DVD players were compatible with studio movies on disc, and it allowed the studios to create versions of movies that could be played only in certain geographic regions, offering region-based price discrimination. DVDs could be sold at one price in the United States and at another in eastern Europe, for example.
DVDs enhanced viewers’ relationship to movies and television series. Home libraries encouraged repeated viewing, and DVD extras enticed viewers to learn more about production histories and the events that inspired the movies or shows. Fans, scholars, and educators enjoyed access to high quality copies for research, teaching, and the creation of fan artwork.
Fan communities had existed since the early days of the studio system, but DVDs and digital editing software gave fans new tools for creating works that parodied, criticized, and further explored Hollywood media. With the introduction of internet video sharing platforms in the mid-2000s, fans found outlets for displaying and commenting on each other’s work as well. DVDs may have seemed like an incremental change from VHS tapes, but the higher quality, cheaper prices, and digital formats allowed DVDs to change the culture of movie consumption.
Digital technology and the internet expanded the home video market even further. In the 1980s, most VHS tape players sat on top of televisions with “12:00” blinking on the device’s clock. Consumers found VCRs complicated to set up and program. But digital video recorders (DVRs) incorporated hard drives and computer interfaces, creating sophisticated and more intuitive video recording devices. DVRs could be programmed to record favorite shows, entire seasons of shows, or specific genres. They supported binge watching (viewing entire television seasons or series of movie sequels in a short period of time). DVRs became remotely programmable, so viewers could access their devices from a restaurant or while at work, and an add-on called a SlingBox allowed viewers to watch the content on their home video equipment from a laptop computer. The “whatever whenever” ideal that Sony had touted with its initial Betamax advertisements continued to triumph.
Hollywood has consistently been challenged by independent film movements, which have all been absorbed into the studio system, reinvigorating it. In the 1980s and 1990s, two very different independent film movements arose that pulled the studios in opposite directions. Both were helped by home video. Independent producers funded films with the advance sales of video rights, and home video provided another distribution outlet and source of revenue for smaller independent films.
The creation of what are called “mini-majors” led to the rise of an independent film movement in the 1980s. Mini-majors took advantage of Hollywood’s blockbuster culture, undertaking expensive productions on borrowed money and hoping they would turn into box office successes. Ultimately, the mini-majors wanted to become new studios. Companies like Carolco, Vestron, Cannon, and the De Laurentiis Entertainment Group raised money by selling home video and global distribution rights; they promised unprecedented star salaries; and they invested in the emerging technology of computer-generated imagery (CGI). For a while, some mini-majors made consistently successful bets. But eventually they all took too many expensive risks during uncertain economic times.
Carolco, the most successful of the mini-majors, specialized in male-star-driven conservative action films, which were perfect for the era of HUAC star witness–turned-politician President Ronald Reagan. Carolco hit on the first successful franchise of the movement with the movie First Blood (1982), starring Sylvester Stallone as Vietnam War veteran John Rambo. The movie initiated a genre cycle bent on forging Vietnam War heroes. Carolco marketed the film by sponsoring a documentary on prisoners of war and deploying its public relations staff to help ripen the cultural moment for the film’s reception. First Blood appeared to be so much in keeping with the zeitgeist that even Reagan praised it. The film took in $300 million at the worldwide box office, and it spawned equally successful sequels.
Carolco invested in more action films, having another great run with a series of CGI-heavy science fiction films starring Arnold Schwarzenegger, including Total Recall (1990) and The Terminator series (1984–present).
In a short period of time, Carolco sought to move beyond making blockbusters and expand into a new major studio. Like Warner Bros. during the transition to sound or MCA in the 1960s, Carolco attempted to become vertically integrated. The company bought a video distribution chain, while also diversifying its production slate with some smaller genre films, including the history film Mountains of the Moon (1990), horror film Jacob’s Ladder (1990), romantic comedy L.A. Story (1991), and thriller Basic Instinct (1992).
But Carolco ran out of money during a recession in the 1990s and bet the entire company on a single film: the flop Cutthroat Island (1995). The mini-majors turned the oil-well blockbuster system into a casino, and eventually they all lost. But before going under and selling their film libraries to the studios, the mini-majors permanently raised star salaries, and they successfully hastened Hollywood’s adoption of CGI, proving once again that most new innovations come from challenges to the major studios.
The mini-majors made the blockbuster culture of the New Hollywood even bigger. The other independent film movement of the late 1980s and 1990s extended the flip side of the New Hollywood era, the personal films that dwindled after the New Hollywood auteurs turned to blockbusters and franchises. What has come to be known as “indie” film grew directly out of the 1960s New Hollywood, extending the auteur-focused movement to incorporate more women and minorities who were (and remain) underrepresented in the studio system. Some of the directors who emerged from this period include Paul Thomas Anderson, Wes Anderson, Jane Campion, Sofia Coppola, Julie Dash, Tom DiCillo, David Fincher, Hal Hartley, Agnieszka Holland, Jim Jarmusch, Ang Lee, Spike Lee, Richard Linklater, Michael Moore, Alexander Payne, Robert Rodriguez, John Singleton, Kevin Smith, Todd Solondz, Steven Soderbergh, Quentin Tarantino, Robert Townsend, and Gus Van Sant.
Spike Lee (along with fellow New York–based filmmaker Jim Jarmusch) formed a bridge between the film school generation of the 1970s and the indie filmmakers of the 1990s. Trained at New York University’s graduate film school, Lee made films about topics that would never have found a home in Hollywood. Like those of the social-problem filmmakers of the Depression, Lee’s early films centered on controversial topics: racism, female sexuality, and drug addiction. He funded his first feature film, She’s Gotta Have It (1986), with a New York State arts grant and investments from family and friends. Shot for under $200,000, it earned over $7 million at the box office. Lee continued to struggle for creative autonomy, forming his own production company, 40 Acres and a Mule Filmworks, and supplementing his film work with commercials and music videos. Always at the forefront of new funding opportunities outside of the studio system, Lee went on to be one of the first major filmmakers to use the crowdfunding service Kickstarter to fund a film project.
Spike Lee’s films were recognized by festivals and awards ceremonies early on. His NYU thesis project was the first student film to be featured at Lincoln Center’s New Directors/New Films festival in New York, and She’s Gotta Have It won a young filmmakers’ award at the Cannes film festival. In 1989 Lee was nominated for an Academy Award for best original screenplay for Do the Right Thing. He didn’t win, and the best picture Oscar that year went to Driving Miss Daisy, a film that could not have been further from Lee’s provocative ensemble piece about the racial tensions ready to ignite on a street in Brooklyn.
But the generation that followed Spike Lee had a larger infrastructure of support. A range of institutions helped launch and sustain indie film, most prominently actor Robert Redford’s multifaceted Sundance organization and brand. Based in Utah, what began as an independent film festival in the late 1970s grew into an educational institute, a funding entity, and eventually a lifestyle brand, selling clothing and furniture. Sundance really took off as a full-service film development center under the direction of Geoffrey Gilmore in the 1990s, and the organization came to dominate the independent film world so thoroughly that by 1995 another festival, Slamdance, was started to oppose Sundance’s hegemony.
On the East Coast, the Independent Filmmaker Project grew out of the New York Film Festival in the late 1970s and created a screening venue, film market, and award ceremony for independent films. Dozens of magazines from Film Threat to Filmmaker to the online IndieWire began to cover independent films, and a cottage industry of books on how to make independent films sprang up.
Distributors began to focus on the emerging independent film movement. Law school student Robert Shaye proved to be one of the more successful entrants into the field when he started New Line Cinema in the late 1960s to distribute low-budget art house, exploitation, and public domain films. After successfully reviving the camp 1930s antidrug film Reefer Madness (1936) and new films by Baltimore-based exploitation auteur John Waters, New Line moved into production in the late 1970s. Throughout the 1980s and 1990s, New Line produced and distributed commercial franchises like A Nightmare on Elm Street (1984–present) and the Teenage Mutant Ninja Turtles (1990–present) films, while its Fine Line division made edgy independent films like Gus Van Sant’s My Own Private Idaho (1991), David O. Russell’s Spanking the Monkey (1994), and Harmony Korine’s Gummo (1997).
At the same time, brothers Bob and Harvey Weinstein produced and distributed a mind-boggling number of independent film hits through their company Miramax. One of their earliest coups was buying the rights to Steven Soderbergh’s Cannes and Sundance winner Sex, Lies, and Videotape (1989). The film, about technology in the age of safe sex, was made for $1.1 million and grossed over $100 million.
The Weinsteins not only chose great material; they also created a popular audience for independent films through clever marketing. They sold theatrical documentary films without using the word documentary. Instead, they highlighted the films’ political or real-world relevance and the auteurs who made them, especially Errol Morris and Michael Moore. Miramax, like Carolco, took unorthodox steps to create an environment in which films seemed timely and important. They managed to have actor Daniel Day-Lewis testify before Congress on behalf of the Americans with Disabilities Act during the promotional campaign for his film My Left Foot (1989), about a man with cerebral palsy who learns to paint.
Miramax marketed the film The Crying Game (1992) by persuading critics not to give away the film’s surprise, while of course letting readers know that there was a not-to-be-missed surprise. The Weinsteins also realized the financial value of the Oscars. They put hundreds of thousands of dollars into campaigning for Oscars, and they were extremely successful at winning them. Oscars not only bring prestige; they also bring in more viewers for extended theatrical runs and for home video releases. Oscars translate into real dollar value.
By the mid-1990s, the independent film movement had developed into a mirror of the studio system, with its own auteurs, storytelling formulas, and marketing techniques. The independent film world had its own star system featuring Parker Posey, Eric Stoltz, John Turturro, Samuel L. Jackson, and Steve Buscemi. And a familiar narrative of low-budget triumph and Sundance discovery came to dominate marketing: film student Robert Rodriguez made the fast-paced action film El Mariachi (1992) for $7,000 while on break from film school; first-time filmmaker Jonathan Caouette made Tarnation (2003), a personal film about coming out and life with his unstable mother, for just over $200 on a borrowed laptop; and the handheld-camera horror film The Blair Witch Project (1999) was made for $25,000 and grossed over $200 million.
It is difficult to define indie film. Is it a genre, mode, series of institutions, group of people, or attitude? Was it really an independent movement when many actors, directors, and production companies seemed to move fluidly between Hollywood and the indie world? There may not be an all-encompassing definition, but by the mid-1990s all of the parts were in place for indie film to be a widely recognized movement.
Some filmmakers recognized the indie film scene as the heir to the New Hollywood auteurs of the early 1970s. Directors Steven Soderbergh, Spike Jonze, David Fincher, and Alexander Payne paid homage to their predecessors by naming their short-lived production company after Paramount’s failed auteur subsidiary of the 1970s, the Directors Company. The company name was more apt than its founders knew at the time, and the 1990s indie film movement had a fate similar to that of its predecessor.
Like BBS and the other New Hollywood studio subsidiaries of the 1960s, all of the 1990s indie film producers and distributors were absorbed by the Hollywood studios. In 1993 Disney acquired Miramax, and Turner Entertainment (which became part of Time Warner) bought New Line. In 2001, Universal bought the last distributor of the period to remain independent, Good Machine. The studios that did not snap up an indie distributor started their own art house divisions, including Sony Picture Classics (1991), Fox Searchlight (1994), and Paramount Classics (1998).
The lines between Hollywood and the indie world began to blur, creating what some have called Indiewood. The 1996 Oscars were hailed as the year of the independents, because so many films nominated for best picture came from indie producers. But by 1996, every one of those producers had become part of the Hollywood studio system, and the 1996 Oscar nominees signified not the year of the independents but the triumph of Indiewood: The English Patient (Miramax/Disney), Breaking the Waves (October/Universal), Fargo (Gramercy/Polygram), and Shine (Fine Line/Time Warner). As it had so many times before, Hollywood successfully absorbed its competition.