Early in 1962, a young student at MIT was on his way to his home in the nearby town of Lowell, Massachusetts. It was a cold night, with a cloudless sky, and as Peter Samson stepped off the train and gazed up at the starfield, a meteor streaked across the heavens. But instead of gasping at the beauty of creation, Samson reflexively grabbed for a game controller that wasn’t there and scanned the skies, wondering where his spaceship had gone. Samson’s brain had grown out of the habit of looking at the real stars. He was spending way too much time playing Spacewar.1
Samson’s near hallucination was the precursor of countless digital fever dreams to come—that experience of drifting off to sleep dreaming of Pac-Man, or rotating Tetris blocks, or bagging a rare Pokémon Jigglypuff. Or, for that matter, the reflexive checking of your phone for the latest Facebook update. The ability of a computer to yank at our Pavlovian reflexes and haunt our sleep—in 1962, that would have been unimaginable to anyone but Peter Samson and a few of his hacker friends. They were avid players of Spacewar, the first video game that mattered, the one that opened the door to a social craze and a massive industry and shaped our economy in more profound ways than we realize.
Before Spacewar, computers were intimidating: large gray cabinets in purpose-built rooms, closed off to all except the highly trained.2 Vast and expensive, forbidding and corporate. Computing was what banks did, and corporations, and the military; computers worked for the suits.3
But at the beginning of the 1960s, at MIT, new computers were being installed in a more relaxed environment. They didn’t have their own rooms—they were part of the laboratory furniture. Students were allowed to mess around with them. The term “hacker” was born, and it meant not the modern mass-media sense of a malevolent cracker of security systems but someone who would experiment, cut corners, produce strange effects. And just as hacker culture was being born, MIT ordered a new kind of computer: the PDP-1. It was compact—the size of a large fridge—and relatively easy to use. It was powerful. And—oh, joy!—it communicated not through a printer but through a high-precision cathode-ray tube. A video display.
When a young researcher named Slug Russell heard about the PDP-1, he and his friends began plotting the best way to show off its capabilities. They had been reading a lot of science fiction. They had been dreaming of a proper Hollywood space opera—this was a decade and a half before Star Wars. But since no such movie was in the offing, they plumped for the best possible alternative: Spacewar, a two-player video game that pitted starship captains against each other in a photon-torpedo-powered duel to the death.
There were two ships—just a few pixels outlining the starcraft—and the players could spin, thrust, or fire torpedoes. Other enthusiasts soon joined in, making the game smoother and faster, adding a star with a realistic gravitational pull and cobbling together special controllers from plywood, electrical toggles, and Bakelite. They were hackers, after all.
One of them decided that Spacewar deserved a breathtaking backdrop and programmed what he called the “Expensive Planetarium” subroutine. It featured a realistic starscape, with stars displayed at five different brightnesses, as viewed from Earth’s equator. The author of this glorious addition: Peter Samson, the young student whose imagination was so captured by Spacewar that he misperceived the night sky above Lowell, Massachusetts.4
In one way, the economic legacy of Spacewar is obvious. As computers became cheap enough to install in arcades, and then in the home, the games industry blossomed. One of the early hits, Asteroids, owed a clear debt to Spacewar—with the realistic-seeming physics of a spaceship that rotated and thrust in a zero-gravity environment. Computer games now rival the film industry for revenue.5 They’re becoming culturally important, too: LEGO’s Minecraft tie-in jostles for popularity with its Star Wars and Marvel sets.
But beyond the money that we spend on them, games affect the economy in a couple of ways. First, virtual worlds can create real jobs. One of the first people to make this case was the economist Edward Castronova.6 In 2001 he calculated the gross national product per capita of an online world called Norrath—the setting for an online role-playing game, EverQuest. Norrath wasn’t particularly populous—about 60,000 people would be logged in at a time, performing mundane tasks to accumulate treasure that they could use to buy enjoyable capabilities for their characters. Except some players were impatient. They bought virtual treasure from other players, on sites like eBay, for real money. Which meant other players could earn real money for doing mundane work in Norrath.
The wage, reckoned Castronova, was about $3.50 an hour—not much for a Californian but an excellent rate if you happened to live in Nairobi. Before long, “virtual sweatshops” sprang up from China to India, where teenagers ground away on the tedious parts of certain games, acquiring digital shortcuts to sell to more prosperous players who wanted to get straight to the good stuff. And it still happens: some people are making tens of thousands of dollars a month on auction sites in Japan just selling virtual game characters.7
For most people, though, virtual worlds aren’t a place to earn money, but to enjoy spending time: cooperating in guilds; mastering complex skills; having a party inside their own imaginations. Even as Castronova was writing about tiny Norrath, 1.5 million South Koreans were playing in the virtual world of the game Lineage.8 Then came FarmVille on Facebook, blurring the line between a game and a social network; mobile games, such as Angry Birds or Candy Crush Saga; and augmented reality games, like Pokémon Go. By 2011, the game scholar Jane McGonigal estimated that more than half a billion people worldwide were spending serious amounts of time—almost two hours a day, on average—playing computer games. A billion or two is within easy reach.9
And that brings us to the final economic impact. How many of those people are choosing virtual fun over boring work for real money?
A decade ago, I saw Edward Castronova speak in front of a learned audience of scientists and policy wonks in Washington, D.C. You guys are already winning in the game of real life, he told us. But not everyone can. And if your choice is to be a Starbucks server or a starship captain—what, really, is so crazy about deciding to take command in an imaginary world?
Castronova may have been onto something. In 2016, four economists presented research into a puzzling fact about the U.S. labor market: the economy was growing strongly, unemployment rates were low, and yet a surprisingly large number of able-bodied young men were either working part-time or not working at all. More puzzling still, while most studies of unemployment find that it makes people thoroughly miserable, the happiness of these young men was rising. The researchers concluded that the explanation was . . . well, they were living at home, sponging off their parents, and playing video games. These young men were deciding they didn’t want to be Starbucks servers. Being a starship captain was far more appealing.10