29.

The Compiler

One, zero, zero, zero, one, zero, one, one. Zero, one, one . . .

That’s the language of computers. Every clever thing your computer does—make a call, search a database, play a game—comes down to ones and zeroes. Actually, that’s not quite true. It comes down to the presence or absence of a current in tiny transistors on a semiconductor chip. The zero or one merely denotes whether the current is off or on.

Fortunately, we don’t have to program computers in zeroes and ones. Imagine how difficult that would be. Microsoft Windows, for example, takes up twenty gigabytes of space on my hard drive. That’s 170 billion ones and zeroes. Print them out and the stack of paper would be almost two and a half miles high. Now imagine you had to work through those pages, setting every transistor manually. We’ll ignore how fiddly this would be: transistors measure just billionths of a meter. If it took a second to flip each switch, installing Windows would take five thousand years.
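(For the curious, here’s that arithmetic as a few lines of Python. The characters-per-page and paper-thickness figures are my own rough assumptions, chosen to be plausible, not measurements.)

# Back-of-envelope check of the figures above. The characters-per-page
# and sheet-thickness values are assumptions, not measurements.
bits = 20 * 2**30 * 8                  # twenty gigabytes, as individual bits
print(f"{bits:,}")                     # 171,798,691,840 -- roughly 170 billion

chars_per_page = 4_500                 # assumed: one printed digit per bit, densely packed
sheet_m = 0.0001                       # assumed: 0.1 mm per sheet of paper
stack_miles = (bits / chars_per_page) * sheet_m / 1609.34
print(f"{stack_miles:.1f} miles")      # about 2.4 miles of paper

seconds_per_year = 365.25 * 24 * 3600
print(f"{bits / seconds_per_year:,.0f} years")  # about 5,400 years at one flip per second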

Early computers really did have to be programmed rather like this. Consider the Automatic Sequence Controlled Calculator, later known as the Harvard Mark I. It was a concatenation of wheels and shafts and gears and switches, measuring fifty-one feet long by eight feet high by two feet deep. It contained 530 miles of wires. It whirred away under instruction from a roll of perforated paper tape, like a player piano. If you wanted it to solve a new equation, you had to work out which switches should be on or off, which wires should be plugged in where. Then you had to flip all the switches, plug all the wires, and punch all the holes in the paper tape. Programming it was a challenge that would stretch the mind of a mathematical genius; it was also tedious, repetitive, error-prone manual labor.1

Four decades after the Harvard Mark I, more compact and user-friendly machines such as the Commodore 64 were finding their way into schools. If you’re about my age, you may remember the childhood thrill of typing this:

10 print "hello world";
20 goto 10

And, lo—“hello world” would fill the screen, in chunky, low-resolution text. You had instructed the computer in words that were recognizably, intuitively human—and the computer had understood. It seemed like a minor miracle.

If you ask why computers have progressed so much since the Mark I, one reason is certainly the ever-tinier components. But it’s also unthinkable that computers could do what they do if programmers couldn’t write software such as Windows in humanlike language and have it translated into the ones and zeroes, the currents or not-currents, that ultimately do the work.

What began to make that possible was an early kind of computer program called a “compiler.” And the story of the compiler starts with an American woman named Grace Hopper.

Nowadays there’s much discussion about how to get more women into careers in tech. In 1906, when Grace was born, not many people cared about gender equality in the job market. Fortunately for Grace, among those who did care was her father, a life insurance executive: he didn’t see why his daughters should get less of an education than his son. Grace went to a good school and turned out to be brilliant at mathematics. Her grandfather was a rear admiral, and her childhood dream was to join the U.S. Navy, but girls weren’t allowed. She settled for becoming a professor instead.2

Then, in 1941, the attack on Pearl Harbor dragged America into World War II. Male talent was called away. The Navy started taking women. Grace signed up at once.

If you’re wondering what use the Navy had for mathematicians, consider aiming a missile. At what angle and direction should you fire? The answer depends on many things: how far away the target is; the air temperature, the humidity, the speed and direction of the wind. The calculations involved weren’t complex, but they were time-consuming for a human “computer”—someone with a pen and paper.3 Perhaps there was a faster way. At the time Lieutenant (junior grade) Hopper graduated from the Naval Reserve Midshipmen’s School in 1944, the Navy was intrigued by the potential of an unwieldy contraption recently devised by Howard Aiken, a professor at Harvard. It was the Mark I. The Navy sent Hopper to help Aiken work out what it could do.

Aiken wasn’t thrilled to have a woman join the team, but soon Hopper impressed him enough that he asked her to write the operating manual. Figuring out what it should say involved plenty of trial and error. More often than not, the Mark I would grind to a halt soon after starting—and there was no user-friendly error message explaining what the problem was. Once it was because a moth had flown into the machine; that gave us the modern term “debugging.” More likely, the bug was metaphorical—a switch flipped wrongly, a mispunched hole in the paper tape. The detective work was laborious and dull.

Hopper and her colleagues started filling notebooks with bits of tried-and-tested, reusable code. By 1951, computers had advanced enough to store these chunks—called “subroutines”—in their own memory systems. Hopper was then working for a company called Remington Rand. She tried to sell her employer on letting programmers call up these subroutines in familiar words—to say things like “subtract income tax from pay” instead of, as Hopper put it, “trying to write that in octal code or using all kinds of symbols.”4
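To make the idea concrete, here is a toy sketch in Python of what a “compiler” meant in Hopper’s early sense: a program that looks up tried-and-tested subroutines by name and stitches them into a runnable sequence. The subroutine names and the payroll numbers are invented for illustration; this is a minimal sketch, not a reconstruction of Hopper’s actual system.

# A toy illustration, not Hopper's actual system: an early "compiler"
# looked up named subroutines from a library and stitched them together.

# Hypothetical subroutine library; the names and operations are invented.
SUBROUTINES = {
    "subtract income tax from pay": lambda env: env.update(pay=env["pay"] - env["tax"]),
    "print pay": lambda env: print(env["pay"]),
}

def compile_program(source_lines):
    # Translate human-readable statements into a sequence of callable steps.
    return [SUBROUTINES[line.strip()] for line in source_lines]

# "Source code" written in familiar words, then compiled and run.
program = compile_program(["subtract income tax from pay", "print pay"])
env = {"pay": 1000.0, "tax": 150.0}
for step in program:
    step(env)   # updates pay to 850.0, then prints it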

Hopper later claimed that “no one thought of that earlier because they weren’t as lazy as I was.”5 That’s tongue-in-cheek self-deprecation—Grace was famed for hard work. But it does have a kernel of truth: the idea Hopper called a “compiler” involved a trade-off. It made programming quicker, but the resulting programs ran more slowly. And that’s why Remington Rand wasn’t interested. Every customer had its own, bespoke requirements for its fine new computing machines. It made sense, they thought, for the company’s experts to program them as efficiently as they could.

Hopper wasn’t discouraged: she simply wrote the first compiler in her spare time. And others loved how it helped them think more clearly. One impressed customer was an engineer, Carl Hammer, who used it to attack an equation his colleagues had struggled with for months; Hammer wrote twenty lines of code and solved it in a day.6 Like-minded programmers all over the United States started sending Hopper new chunks of code; she added them to the library for the next release. In effect, she was single-handedly pioneering open-source software.

Grace Hopper’s compiler evolved into one of the first programming languages, COBOL; more fundamentally, it paved the way for the now familiar distinction between hardware and software. With one-of-a-kinds like the Harvard Mark I, software was hardware: no pattern of switches on the Mark I would also work on some other machine, which would be wired completely differently. But once a computer can run a compiler, it can run any program written for that compiler: software is no longer tied to any one machine’s wiring.

More and more layers of abstraction have since come to separate human programmers from the nitty-gritty of physical chips. And each one has taken a further step in the direction Grace Hopper realized made sense: freeing up programmer brainpower to think about concepts and algorithms, not switches and wires.

Hopper had her own view of why colleagues resisted the compiler at first—and it wasn’t because they cared about making programs run more quickly. No, they enjoyed the prestige of being the only ones who could communicate with the godlike computer on behalf of the mere mortal who’d just purchased it. The “high priests,” Hopper called them.7

Hopper thought anyone should be able to program. Now anyone can. And computers are far more useful because of it.