8
Tick, Talk
Memory as Recording
In the century between 1870 and 1970, the world of memory was dominated by a new kind of figure: the entrepreneur/inventor.
Memory work had heretofore been the province of scholars, ecclesiastics, and scientists. But as the focus of memory shifted from the human brain to mechanical systems, and as command economies transformed into free-market capitalism, the nature of the players changed as well. The goal now was not just to develop new memory technologies but to turn them into successful and profitable enterprises. If lost in this transformation was research done for its own sake, what was gained was rapid innovation, continuous improvement forced by competition, and, through interaction with users, practical results.
The century after the American Civil War offered a brief window of opportunity for these singular individuals. Entire markets lay open for prospecting, inventions could still be made by single individuals, and manufacturing was cheap. Moreover, the public, increasingly trained to become enthusiastic early adopters of new inventions, not only provided a perpetually hungry market but was happy to lionize these inventors as heroes of the age.
And they were heroes. But they were also geniuses, eccentrics, megalomaniacs, and, more often than not, doomed by the same traits that had made them successful in the first place (stubbornness, single-mindedness, and contempt for the work of others) to ultimately destroy their creations. But before they did, they changed the nature of memory, human and artificial, forever.
GO-GETTER
John Shaw Billings was a classic “Can do!” character of post–Civil War America, one of those men who turned the United States from a war-battered frontier nation to a global empire in the span of a single generation.
With his walrus mustache and baggy eyes, Billings looks slightly embarrassed by all the attention in his portraits and photographs, perhaps a little in over his head. But he was anything but: Billings was, in fact, brilliant, pragmatic, a veteran leader and administrator, and utterly unafraid to take on the biggest and most thankless tasks.
Like many of his successful contemporaries, Billings had come from humble beginnings in small-town antebellum America and had made his success through brains, pluck, and hard work. In Billings’s case, he was born in 1838 in Allensville, Switzerland County, Ohio. He graduated from Miami (Ohio) University in 1857 and completed his medical degree three years later at what is now the University of Cincinnati College of Medicine. Thus, he was just a few months into his practice when Fort Sumter was attacked and the nation, North and South, mobilized for war.
Following enlistment, Billings found himself in Washington, D.C. The Union Army was facing huge problems keeping up with sanitation (disease would kill more soldiers than battle would in the Civil War) and with equipping and staffing both field surgeries and military hospitals. It was one of those moments when merit is actually recognized and rewarded: At age twenty-four, Billings was named medical inspector of the 200,000-man Army of the Potomac. By all accounts, he served brilliantly.
Following the war, Billings was appointed librarian of the Surgeon General’s Office—a job with more title than substance. Billings changed that: The collection he compiled became the heart of the National Library of Medicine.
Though he held the title of librarian for thirty years, it didn’t keep Billings from taking on other “impossible” tasks whenever they were offered. For example, he was also professor of hygiene at the University of Pennsylvania … and while there he played a key role in designing the original buildings of the Johns Hopkins Hospital (its well-known dome is named for him). Later in his career, Billings was invited to New York City to pull together the local borough libraries into the New York Public Library system, still one of the world’s largest.
It says something about Billings’s reputation and his comfort with power that it was he who first approached, and then convinced, tycoon Andrew Carnegie to build sixty-four new branch libraries in New York City, and then—in probably the greatest individual act of philanthropy in modern times—to build three thousand public libraries in cities and towns across North America and the British Empire. Most of these “Carnegie” library buildings still stand, like secular temples, in small-town America, Canada, the United Kingdom, Australia, New Zealand, Fiji, and the West Indies—and they played a major role in raising literacy rates in many of those nations.
But it was in the mid-1870s when Billings received, and accepted, what he believed would be the biggest challenge of his career: directing the 1880 U.S. Census.
THAT ALL BE COUNTED
The census was hardly a new idea. Counting populations had been going on for five thousand years, beginning in Egypt, where the first census is believed to have taken place in 3340 B.C., during the early Pharaonic period. Within a millennium after that, the Chinese were conducting a regular census, and some of their records still survive to this day. The Jews in Israel also conducted a regular census for tax purposes (as commanded by God in Exodus), a process so embedded in the culture that it even gave the Old Testament book of Numbers its name.
Rome reportedly began taking a regular census as early as its pre-Republican kings, in particular Servius Tullius in the sixth century B.C. By the time of the empire, census-taking had become a massive undertaking stretching across Europe and the Middle East—an event memorialized in the opening of the Gospel according to Luke, where Joseph of Nazareth and his pregnant wife, Mary, have to travel to Bethlehem to be recorded and taxed. Despite the immense challenge, Rome considered its census so important that the empire ran it as a continuous program for centuries, updating the results every five years. “Census” is, in fact, a Latin word for “assessment,” the measure of all male citizens.
Census-taking was pretty much abandoned in Europe during the chaos of the Dark Ages, but it continued in the Middle East under the caliphate, beginning under the caliph Umar in the early seventh century, soon after his conversion to Islam. But almost as soon as order was restored to Western Europe, population counting began again in earnest.
The most famous census of the Middle Ages was the Domesday Book, ordered by William the Conqueror in 1086 to survey his newly conquered England and to determine what taxes and tributes had been owed to his predecessor, Edward the Confessor, and now belonged to him.
According to the Anglo-Saxon Chronicle, the idea came to William the previous year:
While spending the Christmas of 1085 in Gloucester, William had deep speech with his counsellors and sent men all over England to each shire to find out what or how much each landholder had in land and livestock, and what it was worth.1
It is often assumed that the Domesday Book is simply the record of that census. But it is not that simple. In a world before printing, a world of vellum and scarce scribes, the census was the book—and the book was the law. Hence the name. To the Anglo-Saxon citizens of England, under the heel of their new Norman overlords, this thick volume really was the “Doomsday” book—the book of Last Judgment. Against its records, there was no appeal.
By the sixteenth century, as part of the zeal to catalog and enumerate that characterized the Renaissance, censuses were taking place in almost every kingdom and principality in Europe. They were also being held in the Middle East, in China, and even in South America in the Incan Empire. As early as 1183, the Crusaders in Jerusalem had undertaken a census to determine if they could raise a sufficient army to defend against attack by Saladin. In 1577, King Philip II of Spain ordered a census of the entire Spanish empire in the Western Hemisphere to determine Spain’s total wealth and population of captured peoples (Spain itself had held censuses since the thirteenth century). In 1666, France did the same with its colony in Canada; and the British Raj did so at last in India in 1872.
As these examples suggest, the appeal of census-taking to rulers and legislatures took many forms. For the pharaohs, it was a way to measure their own power in people, structures, and slaves. For the Caesars and the emperors of China, it was all of that, plus a means to determine the potential number of men-at-arms to fill armies. And, of course, by the Middle Ages, it was the primary tool to squeeze taxes out of the citizenry.
But the censuses were also something else that was largely lost on the people of their time but is now their primary value: They were the ultimate (and often only) memory records of everyday people.
Of the ancient world, and even as late as the Renaissance, we know almost nothing about individual shopkeepers, laborers, and soldiers—much less women and slaves—unless they did something extraordinary, which by definition made them more than ordinary people. There was no photography, and in the lower classes there was general illiteracy, so it was almost impossible for any person outside of the ruling classes to leave even the slightest record of their existence for future generations.
As a result, we have almost no records of what it was like to be a laborer on the pyramids or a low-ranking Roman legionnaire, a Chinese farmer during the Tang dynasty, or a Greek hoplite. These memories will never be recovered because they were never recorded. The memories of a billion people over the course of a thousand generations are lost to us … forever.
Censuses changed that. Not right away, of course. Rulers cared little about the biographical details of their plebian subjects, and not at all about the proletarians and slaves. But that all changed when the middle classes began to emerge across the Continent and governments suddenly became interested in individuals, their extended families, and most of all, the ownership of their assets.
NEW WORLD, NEW MEASURE
For a newly emerged democratic republic such as the United States—and a nation that wouldn’t have a federal income tax for its first century—a census served a wholly new set of purposes: to establish voter registrations and apportion districts; to create rolls of adult males for potential military drafts; to locate the ever-growing numbers of new immigrants; and to count the number of slaves and indentured servants (though the former would count as only three-fifths of their number for apportionment). But perhaps the most important task in a nation that was rapidly expanding its borders westward was to locate and count the hundreds of thousands of citizens spread across the wide frontier—some of them so isolated that they continued to vote for Andrew Jackson for president long after he had served his terms and died.
The different American colonies had held their own censuses since the seventeenth century, the first taking place in Virginia at the beginning of that century. The first official U.S. Census began on August 2, 1790, when the nation was officially just one year old. To underscore how seriously the federal government took the task—and to give a classic example of President George Washington’s decisiveness in establishing executive precedents—federal marshals, under the command of Secretary of State Thomas Jefferson, were sent out to knock on every door, take down the name of every head of household, and then count the number of other individuals in the residence.
Interestingly, while slaves were counted in this first census (and, as noted, only as fractional citizens for apportionment), Native Americans were not—both because they weren’t considered “real” citizens and because of the potentially mortal risk to the census takers.
The final tally of this first U.S. Census was 3.9 million Americans—a reminder of just how tiny the nation was that had just fought its way to independence from history’s largest empire.
The second U.S. Census took place in 1800, thus setting a pattern of ten years between counts—a very brief interval, given the growing effort required, but almost too long for a nation growing as rapidly as the United States. Meanwhile, each census added a few more questions as the young government struggled to understand how its citizens lived, where they worked, and the forces that underpinned the national economy.
Thus, as early as the third census in 1810, questions were already being asked about manufactured goods. The 1840 census added questions about the nation’s fisheries, perhaps reflecting maritime disputes of the era.
But it was with the 1850 census—perhaps reflecting the growing schisms of a nation sliding into Civil War—that the U.S. government went all the way in. Now it asked questions about taxes paid, criminal records, religious affiliation, and even whether the subject was destitute. Moreover, for the first time (and to the eternal gratitude of future genealogists) the census takers also asked for the complete names and ages of all household members. And in another unprecedented turn, even family slaves were fully named.
Now, for the first time in history, the average person had a name, a family, a way of life … and a memory that would live long beyond them, to be read by their descendants centuries into the future. Even more remarkably, slaves, the ultimate forgotten figures of history, at last entered into human memory—a harbinger of the Emancipation Proclamation still more than a dozen years away.
ADDING UP A NATION DIVIDED
If the 1850 census had predicted a brighter future, its 1860 successor took that hope away. It was, as Adam Goodheart dubbed it 150 years later in The New York Times, “The Census of Doom.”2
By this point, the act of taking a U.S. Census, compiling the data, and then publishing the information took almost five years. Needless to say, with the nation coming apart at the seams—and just months before the outbreak of conflict—there was neither the time nor the funds for a fully elaborated report like that of a decade before. By the time the results were finally tabulated, fighting had begun, and so census superintendent Joseph Kennedy, working with a staff of only 184 clerks to count the millions of records, ultimately delivered only an abbreviated report that lacked the expected cartographical representations of where America’s populations were located.
Still, enough data was processed and made public between the June 1860 start date of the census and the November 1861 presidential election to have a profound effect on the lead-up to war—and thus the future of the United States. Wrote Goodheart:
Preliminary figures that began appearing in the press as early as September 1860 confirmed what many Americans already suspected: immigration and westward expansion were shifting the country’s balance of population and power. Since the last count, in 1850, the North’s population had increased an astonishing 41 percent, while the South’s had grown only 27 percent. (Between 2000 and 2010, by comparison, the entire nation’s population grew just 9.7 percent.) Tellingly, the statistical center of national population had shifted for the first time not only west of the original 13 states, but also from slave territory into free: from Virginia to Ohio.3
These numbers shocked the Southern slave-holding states. The survival of slavery had long depended on a voting balance between North and South, by which the states below the Mason-Dixon line could hold off the abolitionists above it. Now it was obvious that this fragile balance had been shattered—and the South would never again have control over its own destiny. Not only would these booming populations all but decide future presidents, but reapportionment would soon be adding new congressional districts to the North, while taking them away from the South.
And, at least for Southern whites, it only got worse. Not only were the populations of the Northern states booming but so were those in the frontier regions of the old Northwest Territory, the Great Plains, the Southwest, and the Far West—territories that, thanks to their “free-soil” pioneers, were all likely to vote anti-slave with statehood.
Of all of these territories, Wisconsin and Minnesota represented the worst Southern nightmare:
In 1836, [Wisconsin] had claimed fewer than 12,000 inhabitants. Now, in 1860, it boasted 778,000—an increase of almost 6,400 percent in less than a quarter of a century … Nor was it even the most remarkable case. Neighboring Minnesota’s population had risen from 6,000 to 172,000 in the past decade alone.4
Making matters worse, in Southern eyes, the inhabitants of these two states were largely immigrants, Northern Europeans who had no history of slavery and who had a particular hatred of the Peculiar Institution. Even the South’s long-standing argument that slavery was too important an economic institution to dismantle was now beginning to lose its credibility: The census data also showed (and The Philadelphia Inquirer was happy to point out) that slaves now, at just 12.5 percent, represented the lowest percentage of the U.S. population since measurement began.5
None of this was lost on the South. The writing was on the wall: Slavery, the largest single industry in the United States, was doomed. And since slavery was the underpinning of the entire Southern agrarian culture, so, too, was it the heart of the Southern economy. At the same time, white Southerners, who had led the nation through the Revolution and the early years of the Republic, were now destined to become a minority, slowly stripped of their government representatives, their role in American life supplanted by immigrants in the North.
If the 1860 Census didn’t cause the Civil War—those roots reached back to the treatment of slaves in the Constitution—its data most certainly hastened its arrival. Southerners, seeing these results, now looked on with greater dread than ever at the coming presidential election and its Republican, Northern, and antislavery candidate, Abraham Lincoln. It is a measure of just how frightened the South was of these shifting demographic winds that Lincoln won the presidency but failed to win a single electoral vote from the Southern states. The stage was set; there was no going back to the status quo. Using the same census data, Southerners convinced themselves that because they were now a larger population than the American colonies at the time of the Revolution, they, too, could shrug off a mighty continental empire.6
Once the Civil War did break out, the Census Department quickly became an arm of the War Department—in particular, Kennedy and his team, now armed with the best population data in the country, began creating a new set of population maps for field commanders and military governors. These maps included information not just on white populations but slaves as well (and even Indians, or at least those 40,000 Native Americans who had “renounced tribal rules”), along with local agricultural products and, crucially, train routes and schedules.
The total population of the United States according to the 1860 Census was 31,443,321. Within five years, more than 620,000 of those citizens would be killed.
The next U.S. Census, in 1870, landed smack in the middle of Reconstruction in the South, the Gilded Age in the North, and massive pioneer emigration to the Great Plains and beyond. America’s official population was just short of 40 million.
THE LONG COUNT
In 1880 it was John Shaw Billings’s turn to count all Americans. By all accounts, he did the job brilliantly, despite dealing with a much larger (50 million, having jumped by 30 percent in a single decade) and more geographically diverse population, and a set of census questions of unprecedented size. Indeed, the 1880 census had five separate schedules (sections), with only the first resembling the traditional headcount that could be filled out by citizens. The other four sections—mortality, agriculture, manufacturing, and social statistics—asked dozens of questions about marital status, birthplace and cause of death of parents, crops and their rotation, fertilizer purchases, number of seasonal hired hands, average daily wages, capital equipment, debt, education, and a host of other matters … all of which had to be asked by census field “enumerators.” Special customs agents also went into the field to gather statistical data on all of the nation’s major manufacturing industries.
It was a gigantic effort—and in the end, an extraordinary achievement. For the very first time, a great nation fully understood its own nature … and prepared the most complete memory of itself yet compiled for future generations. But it hadn’t come easily, despite the fact that Billings had broken with precedent and for the first time hired women to join his army of enumerators. In the end, it took seven years to complete the 1880 U.S. Census.
The 1890 census loomed just ahead, and John Shaw Billings knew he had a serious problem on his hands. By then there might be more than 70 million Americans, filling even more of the hard-to-reach corners of the nation, and the planned census schedules were even more complicated in their questions. The 1880 census had taken seven years; Billings had only to extrapolate to realize, to his horror, that the 1890 census as currently configured would take an estimated thirteen years to complete. Until then, the entire U.S. government—and likely the economy itself—would be working blind on wildly obsolete data. The results could be catastrophic.
There was no way that one census could be allowed to overlap the next. After such an illustrious career built on an unmatched reputation for competence, Billings realized he was facing career suicide and public humiliation.
But John Shaw Billings wasn’t the type of man to shrink from challenges. As he’d shown when he hired women, he saw only the task to be done, and was prepared to crash through any technical or cultural barriers in his path to getting there.
He had been watching one of his employees—an eccentric young former mining engineer named Herman Hollerith—who seemed to have a gift for solving mechanical problems. And it was over a chicken-salad dinner in 1879, not long after Hollerith joined the Census Bureau, that Billings first raised the challenge of speeding up the 1880 census—and the looming specter of the 1890 census beyond it. Hollerith left that dinner driven to find a solution.
He didn’t make it in time for the 1880 census, but Herman Hollerith was not a man to give up on a challenge.7 Born in Buffalo, New York, in 1860, he had moved at an early age with his German immigrant parents to New York City. There, young Herman proved to be a difficult student—brilliant in the subjects he cared about but indifferent, even truculent, about those he did not. Legend has it that he once even jumped out of a schoolroom window to escape a spelling test.
Eventually, his despairing parents pulled Herman out of school and put the young man in the hands of a private tutor. Hollerith bloomed under the customized curriculum and was able to enter Columbia College at just sixteen. Three years later he graduated as a certified mining engineer. Ironically, given his future, his two worst subjects were machinery and bookkeeping. Taking a job at the Census Bureau, he moved to Washington, D.C., and within a few weeks he had his fateful dinner with John Shaw Billings.
Like that of many geniuses, Herman Hollerith’s talent lay not in creating something wholly new, but rather in visualizing something that already existed in a wholly different context. Riding the train back and forth to New York City, Hollerith couldn’t help noticing the railroad conductor passing from seat to seat, using a handheld punch to put holes in passengers’ train tickets in specific locations to designate gender, approximate age, and other characteristics to prevent fraud. These punched tickets were crude descendants of the punched cards used in Jacquard looms, and thus owed their existence to de Vaucanson’s Digesting Duck.
In the intervening 150 years, the notion of using pieces of memory for the purposes of controlling systems had spread from industry to daily life—and from automatons and looms to train tickets, employee “punch” time clocks, player pianos, and music boxes. In operation, most of these were very simple compared to the sophisticated and “programmable” looms now in use around the world. For example, the “machine” in the punch-ticket application was the conductor himself, whose job it was to compare the traits marked on the ticket with the passenger seated in front of him. But nevertheless, together these various applications had begun to create a growing regime of machinery control through data entry.
Millions of people now experienced one or more of these control devices almost on a daily basis. But only Herman Hollerith had the combination of genius and the knowledge of a pressing need to see how these systems might be reversed—that is, instead of encoding commands, they could compile results.
Throughout the 1880s, Hollerith, with Billings’s support, began to bring together various existing technologies in a new design of his own, which he called a tabulator. Billings, it should be said, while a staunch supporter of his subordinate’s work, was also far too pragmatic to pin all of his hopes on the odd twenty-five-year-old. So, even as Hollerith labored away, Billings also put out a public call for entries in a competition for a new census-compiling machine, with a deadline of 1887.
In the end, there were three entries. In addition to the design of Hollerith, who had finished building his tabulator just in time, there were also working models from two other inventors, Charles F. Pidgin and William C. Hunt. Each machine in turn was run through a representative set of data from the 1880 census, a task that mostly involved converting handwritten census data into machine coding, then slicing and dicing the entered data for various categories of results.
When the results were scrutinized, it was discovered that the Hollerith tabulator had not just defeated its two competitors but blown them away. Hollerith’s machine proved to be twice as fast as the Pidgin machine, and three times as fast as the Hunt. This also meant that the tabulator was probably at least ten times more efficient than traditional handwork.
The tabulator design had triumphed because Herman Hollerith had not just focused on one big breakthrough—though he had one in the design and use of inexpensive punched paper “cards”—but also on bringing together diverse technologies to take full advantage of that core innovation. Thus, the tabulator had a means of mechanically loading new cards into the machine, where they could be punched using the new typewriter technology. The new holes in the card’s 12-row-by-24-column face were then electrically “read” by spring-loaded copper wires dragged across its surface to make contact with a metal conductor placed below. The cards could then be mechanically sorted by their common data into stacks and dropped into a series of boxes while an overhead bell rang to signal success. The same readings could also be run through a counter to add up the results, which appeared on dials, one for each decimal place, on the display. And even though many of these steps were still performed manually, the process was still faster than any system of data storage—that is, artificial memory—ever before attempted.
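To make the underlying idea concrete, here is a minimal, modern sketch—in Python, purely for illustration—of the tabulating concept: each record is encoded as a set of “punch” positions on a card, and a single pass over the cards both advances counting dials and sorts each card into a box. The field names and card layout below are invented for the example; they are not Hollerith’s actual card format.

```python
from collections import Counter, defaultdict

# Hypothetical field layout (invented for this sketch, not Hollerith's card format):
# each field gets one row position per possible value.
FIELDS = {
    "sex":    {"male": 0, "female": 1},
    "status": {"single": 0, "married": 1, "widowed": 2},
}

def punch_card(record):
    """Encode one census record as the set of (field, row) holes punched in a card."""
    return {(field, rows[record[field]]) for field, rows in FIELDS.items()}

def tabulate(cards):
    """'Read' each card: advance a counter dial for every hole found, then drop
    the card into a sorting box keyed on its marital-status hole."""
    dials = Counter()          # running totals, one per possible hole position
    boxes = defaultdict(list)  # sorted stacks of cards, like the tabulator's boxes
    for card in cards:
        for hole in card:
            dials[hole] += 1
        status_row = next(row for field, row in card if field == "status")
        boxes[status_row].append(card)
    return dials, boxes

records = [
    {"sex": "male",   "status": "married"},
    {"sex": "female", "status": "married"},
    {"sex": "female", "status": "single"},
]
dials, boxes = tabulate([punch_card(r) for r in records])
print(dials[("status", 1)])   # 2 -- two "married" holes counted
print(len(boxes[0]))          # 1 -- one card sorted into the "single" box
```

The real machine did all of this electromechanically, of course, but the division of labor is the same: one pass over the cards both increments the counting dials and routes each card to its sorting box.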
COUNTING COUPS
John Shaw Billings, fearless as ever, immediately committed the entire U.S. Census Bureau to the Hollerith tabulator and, beginning in June 1890, these unproven machines held the collective memory of the United States in their gears, keys, and cables.
After just six weeks of processing—instead of the previous two years—the Census Department announced its preliminary results: the United States now had a population of 62,947,714.
The news shocked many Americans, especially those in positions of power, for several reasons. First, the results, thanks to the Hollerith tabulator, had been announced so quickly—it seemed impossibly quickly—that there were bound to be questions about the quality of the work. Had the Census Bureau taken shortcuts? Grown sloppy? Or even made the data up? A headline in the New York Herald read:
SLIPSHOD WORK HAS SPOILED THE CENSUS
MISMANAGEMENT THE RULE
Speed Everything, Accuracy Nothing!8
Only adding to these suspicions was that the population figure seemed unexpectedly low. The United States had been growing, the past censuses said, at about 25 percent per decade for most of the nineteenth century. This suggested a likely 1890 population figure of 65 million and, given the huge influx of immigrants in recent years, some analysts had predicted the number might even reach 75 million. This apparent shortfall, combined with the unbelievable speed with which the figure had been reached, led the skeptical, especially those with a political stake in the results, to suggest more nefarious motives.
But a confident John Shaw Billings ignored the naysayers and plunged on. The 1890 census asked questions in twenty-five categories, including nearly all of those from the 1880 census, adding queries regarding citizenship and naturalization, chronic illnesses, permanent physical defects, months unemployed in the last year, farm ownership, Civil War veteran status, and finely detailed questions about race (specifying categories such as mulatto, quadroon, and octoroon).
The federal government fully expected that the processing of all of this data would take, if not as long as Billings’s worst nightmare of thirteen years, then at least almost the entire decade. So one can imagine the shock when the Census Bureau released all of these results, along with maps and other supporting documents, in just one year. This time, the complaints were muted: The details of the census report were now just too complete to deny. Billings and his team seemed to have performed a miracle.
One effect of this remarkable achievement was to make both the Hollerith tabulator and Herman Hollerith himself the subjects of considerable demand around the world. Hollerith, like many corporate and government “intrapreneurs” to follow in the next century, quit the Census Bureau and started his own company to build new, upgraded tabulators. He received his first international order from Russia for its 1897 census. Orders from other European nations quickly followed. The U.S. Census Bureau also ordered Hollerith’s new models—the integrating tabulator, which could not just count but also add numbers represented by punched holes on the cards; and, in time for the 1900 U.S. Census, an automatic-feed tabulator to further automate the process.
Now forty years old, Herman Hollerith was becoming a very wealthy man. But as he began to indulge a growing taste for cigars and fine wine, he lost none of his eccentricity. When asked, in the early days of his company, why he had not yet applied his tabulators to the lucrative field of railroad accounting, he replied (as he later told others): “One good reason, and that was that I did not know the first damned thing about railroad accounts.”9
Nevertheless, within months, Hollerith was delivering his machines to the company’s first private customer: the New York Central Railroad. And in the years that followed, he added new customers at other railroads, public utilities, and even one department store. But without a John Shaw Billings to manage and direct him, Herman Hollerith proved to be a much better inventor than businessman. Billings had left the Census Bureau and taken on a new role as the federal government’s investigator/liaison to the women’s temperance movement. Dying in 1913, he didn’t live to see the bittersweet results of his last successful project: Prohibition.
Hollerith had one last triumph in 1906 with a new Type 1 Tabulator. It featured an added wiring panel that enabled the tabulator’s operation to be modified for different tasks, making it one of the key milestones along the path to the invention of the computer. But by then it was too late. The year before, a frustrated U.S. Census Bureau had given Hollerith an ultimatum to upgrade the tabulator and reduce his fees. Being Herman Hollerith, he refused—then went ahead and built the Type 1. But the Bureau had had enough; it dropped the Hollerith contract and started building tabulators of its own. The subsequent lawsuit lasted seven years and ended with a victory for the government.
By then it didn’t really matter: Hollerith had sold his company to a newly formed conglomerate, Computing Tabulating Recording Corporation (CTR), in which he retained a major stock position and the title of chief consulting engineer. It was a role that suited Hollerith, who was weary of business and increasingly drawn to a simpler life of farming and cattle breeding.
In 1914, a man entered Herman Hollerith’s life who was as quintessentially an American businessman of the twentieth century as John Shaw Billings had been a government bureaucrat of the nineteenth. Thomas Watson Sr. was a gifted salesman on his way to becoming an industrial titan, and had Hollerith followed his advice the way he had that of his old boss, his profile in history would have been far higher.
Watson knew what his customers wanted: permanent memory output. In this case, in the form of a paper tape printer. It would have been easy for Hollerith to modify his existing Type 1, but instead, he refused to have anything to do with the work—not least because he despised Watson and all that he represented in the growing role of corporate sales and marketing. Instead, Hollerith retired to his farm and the simple, good life.
Watson, meanwhile, got his paper tape machine—and soon added printers and removable plug boards. And in 1924, he changed the name of CTR to International Business Machines … IBM.
THE WIZARD
While Herman Hollerith was struggling to build his first tabulator, the world’s greatest inventor was struggling to consolidate his success of the previous two decades.
Thomas Alva Edison, whose inventions had transformed the world, had hit a rough patch in his life. Mary Edison, whom he’d married in 1871 when he was twenty-four and she a sixteen-year-old employee in his shop, had died of a brain tumor in 1884, leaving him with three children. And Edison himself had spent much of the 1880s battling one competitor after another in courtrooms and in the marketplace.
He had been sued by Emile Berliner over the patent to the carbon microphone—the heart of the telephone for the next century—and eventually won. He had battled George Westinghouse over control of all electrical power distribution in the United States, Edison stumping for his direct-current (DC) design against Westinghouse’s promotion of Nikola Tesla’s alternating-current (AC) design. (Edison eventually hired Tesla and then managed to turn the mysterious scientist into a lifelong enemy.) That fight, with so much at stake, had quickly turned ugly, with Edison employees publicly electrocuting animals—most notoriously a circus elephant—and filming the results to prove the dangers of AC.
Meanwhile, in his rush to develop the fluoroscope as an efficient diagnostic tool for the use of the newly discovered X-rays, Edison had accidentally given one of his favorite employees (who had been a ready volunteer) a mortal dose of radiation.
And most depressing of all, Edison had been sued over his most famous invention, the incandescent lightbulb. After his patents in telegraphy, the lightbulb had been the major source of Edison’s wealth—and even those revenues had been under assault since 1882, when George Westinghouse had bought the patent to the rival “induction” light, slashed licensing fees, and in the process forced Edison to cut his own patent fees. Then, a year later, the U.S. Patent Office stunned the world by awarding the patent on incandescent light to William Sawyer, rendering Edison’s patent invalid. It took six years of litigation before a judge officially recognized the superiority of Edison’s claim.
Edison had come a long way from the single-minded young inventor of the 1870s who had forever transformed the world with a dazzling run of new creations, including the automatic ticker tape machine, the mechanical vote counter, the phonograph, and, of course, the first commercial electric lightbulb. In the course of doing all of this, Edison had also created the first modern industrial research laboratory, using mass-production techniques that wouldn’t be duplicated by most manufacturing companies for another decade.
The world had seen nothing like young Tom Edison, and it really hasn’t since. There is no need here to recapitulate at length the story of Edison’s early years—pulled out of school for lack of attention, getting a job as a telegraph operator after saving the life of the station-master’s son, losing his hearing from either scarlet fever or getting his ears boxed by an angry conductor after Edison’s secret lab set fire to a railroad car, learning about electricity from telegraph wires, getting fired, and living and experimenting in the basement of a fellow inventor—as most schoolchildren even today encounter the tale at some point.
If Edison’s reputation today isn’t as brilliant as it was a century ago, it’s no doubt because its glow has been at least partially eclipsed by the growing exposure of Edison’s sharp and sometimes questionable business practices. Today it is hard to get past the stories of ripping off Tesla, taking credit for the ideas of others, driving pioneering filmmaker Georges Méliès into bankruptcy, choosing Birth of a Nation as his favorite movie, and so on, in order to give Edison anything more than conditional credit for his accomplishments.
But dismissing him, even partially, would be a mistake. Because if it is memory that makes us human, and if memory is indeed the guardian of all things, the preserver of culture and wisdom, then no person in history made a greater contribution to the power, the range, and the democracy of memory than Thomas Alva Edison.
It began in 1869, with Edison’s development of the modern stock ticker. Stock ticker tape machines (their name came from the sound they made and the paper tape on which they printed) had been around for more than a decade at this point, and their ability to send breaking financial data over long distances was instantly hailed as a vast improvement over human runners. But the early machines were big, clumsy, and unreliable. Worse, they printed out their streaming data in the form of Morse-like code that had to be translated. Edison’s Universal Stock Ticker, by comparison, was a revelation. Almost small enough to fit in one’s hand (it was typically, and memorably, covered by a glass dome), it was able to print on its paper tape actual letters and numbers. It was, in many respects, the first real-time capture of the memory of distant physical events at the instant they occurred.
But that was only the start. Shouting “Mary had a little lamb!” into the bell horn of his first phonograph in 1877, Edison captured not just the first human voice but the first recorded sound of anything on Earth. The phonograph’s subsequent effect on human memory is almost incalculable, and we are haunted by the loss of the chance to hear the legendary voices and sounds we narrowly missed: Paganini’s violin, a Dickens lecture, Edmund Kean’s Shakespeare, Lincoln’s Gettysburg Address. By the same token, we cherish those brief audio records of Enrico Caruso, Queen Victoria, Alfred, Lord Tennyson, and Florence Nightingale especially because we know that we will never hear the voices of nearly all of the other great figures of history.
Edison’s original phonograph was a fragile device: The recording medium—a sheet of tinfoil wrapped around a cardboard tube—was only good for a few repeated plays, but that was enough to set the stage. In a way that we—after more than a century of continuous technological innovation—can barely imagine, the appearance of the Edison phonograph was like some kind of magic trick or something from another planet. Most people had never considered that sound might be a physical phenomenon, a wave of energy that could not only be transmitted—the telephone had been invented only two years before—but that might even, somehow, be captured and stored forever.
Edison’s genius—echoed a decade later by Hollerith—was in understanding that this could be a reversible process of both recording and playback, and bringing together different technologies (horn speakers, reduction gearing, Bell’s telephone microphone) to realize his goal. Artificial memory, which had been a solely visual process since the invention of writing, now, after five thousand years, added a second sense.
Within a few years, other inventors, including Bell himself, had improved upon Edison’s design, notably by improving the recording technology and promoting a more durable wax cylinder licensed from Edison himself. By the end of the nineteenth century, German American inventor Emile Berliner further advanced the technology by creating the “gramophone”—the tone-arm needle/spinning-flat-disk design that would dominate sound recording for most of the twentieth century.
Edison stayed involved with sound recording into the early years of the new century, perfecting new, and more durable, recording media (including Amberol, an early plastic), but as was typical he grew increasingly cranky and resistant to change. Edison Records, which had once ruled the audio-recording world, finally made the transition to disk records in 1912—with a superior but proprietary technology that was not only more expensive but only played on Edison gramophones. Add to that Edison’s unwillingness to pay top-flight talent to record for him (as did competitors like Columbia Records) and, on the eve of the Great Depression, Edison Records went out of business.
TUBE STAKE
By the early years of the twentieth century, recording sound and playing sound had divided into two increasingly distinct industries. On one side was the brand-new world of consumer entertainment—millions of gramophones produced by scores of manufacturers supported by growing catalogs of records sold by the tens of millions through an infrastructure of distributors and retailers, from record shops to department stores. On the other was the new entertainment industry providing the talent and content for these records; it included record companies, recording studios, and record manufacturers.
It was on this latter side, content, where the important innovation continued to take place. Early recording sessions literally required performers to shout or sing into giant versions of gramophone horns in order to produce enough sound vibration to carve into the surface of a master metal platter that could then be mass-copied onto consumer records (in the early cylinder days, when only a dozen could be produced at a time, performers had to repeat their performances many times over the course of hours). Because the process—at both ends of the creator-consumer relationship—was mechanical, volume and sometimes fidelity were also limited.
That changed with the work of a whole new kind of inventor, one whose flamboyance and risk-taking style presaged wholly new generations of entrepreneurs to come. His name was Lee De Forest, and he was yet another Midwesterner (Iowa) who had gone east to make his name—in De Forest’s case, via Alabama, where his father had been the controversial president of a black college. After earning his Ph.D. from Yale, De Forest took a faculty position at the future Illinois Institute of Technology teaching radio technology.
But De Forest didn’t have the personality for a quiet career lived in a faculty lounge. And by 1905 he embarked on his own private research quest. He began with a device derived from Edison’s lightbulb, but instead of installing a pair of electrodes with a carbon filament stretched between them, De Forest changed the location of those electrodes, making one end a filament and the other a receiving plate, which allowed the electrons to shoot across the gap through the vacuum. He filed the patent for this “diode,” which detected electromagnetic waves, in 1906.
A year later, in the greatest innovation of his nearly two hundred patents, De Forest added a third electrode to this design—a grid, placed between the other two. The resulting “triode” is considered the first true vacuum tube … and even De Forest at first didn’t understand what it was good for.
In the meantime, De Forest built a company to exploit what he believed would be a huge market for his two-electrode “Audion,” as he called it, in the telegraph industry—where it proved useful in detecting wireless telegraph signals. But De Forest was too mercurial to be a good entrepreneur, and between his poor business skills and some crooked partners, by 1911 the company was out of business. De Forest moved to Palo Alto, California, and took a job with Federal Telegraph, one of the first technology companies in what would one day be Silicon Valley. But De Forest couldn’t put his past behind him, and in March 1912, he was arrested by federal officers for stock fraud. It took a hurried collection of $10,000 bail by Federal Telegraph’s directors to keep De Forest out of jail.
As he awaited his trial, De Forest embarked on one of the most feverish periods of creation in his life. And it was during one of these sessions, working with two assistants to try to perfect a newer Audion design, that De Forest tried reconfiguring the locations of the three electrodes. To see how well it worked, De Forest plugged the tube into a telephone transmitter, put on a pair of headphones, dangled his “trusty Ingersoll” pocket watch in front of the transmitter … and nearly blew out his eardrums.10
Lee De Forest had invented the amplifier. And in the years to come, he and his successors would continue to experiment with the location of the components, the shape of the bulb, the transmitter “guns” shooting the electrons, and the coatings on the end of the bulb that would glow when hit by these electrons. The resulting inventions built from these new designs over the next forty years included radio, the klystron tube, radar, microwave transmitters, and television.
But one of the first places the tube amplifier found a home was in the recording studio—in particular, in microphones. The Audion brought electricity into the recording studio. Now sound could be boosted even as it was recorded, and that—combined with new master-recording technologies—meant that a single superior recording could be reproduced millions of times, and the master copy permanently stored. The performers could also hear themselves in playback over the studio’s speakers. And once tube prices fell, that same combination of electricity and amplification could come to the gramophone—now called the “record player”—as well. Audio memory, in the form of what would eventually be billions of records, LPs, cassettes, CDs, and MP3 files, had permanently entered into the human legacy.
ENDURING IMAGE
Edison’s final contribution to human memory took a little longer to create.
As already noted, he spent much of the 1880s dealing with both business and personal trials. But by 1888, he had a good idea of what he wanted to do next. He had remarried (to Mina Miller, the twenty-year-old daughter of another inventor) and moved from Menlo Park to homes in West Orange, New Jersey, and Fort Myers, Florida. He did little hands-on inventing anymore, but rather served more as an impresario of new ideas.
Edison’s most famous quote had been that “genius is one percent inspiration and ninety-nine percent perspiration,” but now most of that sweating was being done by an army of assistants, many of whom were better scientists and engineers than Edison had ever been. They were also more efficient: Whereas in the development of the lightbulb Edison had famously tried thousands of different materials before tripping over the carbon filament, these new researchers had the skills to approach the increasingly difficult technologies theoretically, rather than through the plodding trial and error that had been Edison’s forte. As the bitter Nikola Tesla would notoriously say in Edison’s only negative obituary “tribute”:
His method was inefficient in the extreme, for an immense ground had to be covered to get anything at all unless blind chance intervened and, at first, I was almost a sorry witness of his doings, knowing that just a little theory and calculation would have saved him 90% of the labour. But he had a veritable contempt for book learning and mathematical knowledge, trusting himself entirely to his inventor’s instinct and practical American sense.11
Of course, that didn’t prevent Tesla from proudly accepting the Edison Medal a few years later.
But what this undoubtedly accurate observation missed was that Edison didn’t have to be both the visionary and the builder to be a great inventor. That was a myth created by mid-Victorian inventors like Bell, Morse, and the young Edison himself, in an era when a visionary could also be a talented prototype builder. It wasn’t true for very long, but that particular myth has had a very long tail: Even in the early twenty-first century, most of the general public credits Steve Jobs as the inventor of Apple’s extraordinary run of consumer products—which wasn’t literally true, but was in the larger sense quite accurate.
It was also the case, in 1890 and in the years beyond, that the Edison Company became more effective the less its founder was involved in the detailed activities—from R&D to marketing and sales—of the company. A classic example of this was Edison’s third great memory invention: motion pictures.
At this point, still photography—another great landmark in the history of memory—had been around for a half-century, arriving in time to give us indelible images of the Crimean War, Indian maharajahs, Chinese farmers, African tribesmen, and Mathew Brady’s pictures of Antietam and Abraham Lincoln. Photography had been invented in the 1820s, but its core technologies (light-sensitive chemicals, lenses and cameras obscura, mechanical shutters) were already well established, with some, such as the pinhole camera, dating as far back as the ancient Greeks and the Chinese.
In the end, though, it was Joseph Nicéphore Niépce, a French inventor, who in 1826 coated a pewter plate with bitumen, exposed it for a number of hours to light to harden the coating, then washed away the unhardened portions with a solvent … and created View from the Window at Le Gras—a jumble of rooftops and towers that is history’s first photograph. With it, he recorded the first image of the natural world that wasn’t created by the esthetic intervention of man. Time would show that even photographs weren’t bias-free, but they remain in daily life as close to an objective image of reality as we are likely to find. Artificial memory had now added its most important visual dimension.
Niépce, a remarkable inventor (he also built the first internal combustion engine), died in 1833, but before he did, he gave all of his notes on photography to his younger business partner, Louis-Jacques-Mandé Daguerre, a skilled theater designer who had already invented the diorama. Daguerre was a rare case of a skilled inventor who was also a good businessman, and in 1839, after years of improving the process, he filed for a patent and sold the technology to the French government. The resulting “Daguerreotype” was the first widely used photographic technology; and the name has been casually attached to every type of early photograph ever since.
However, great inventions almost never occur alone, and neither do they stay a monopoly for long. Across the Channel, William Fox Talbot was doing similar experiments on his “calotype” process. His greatest contribution to the story of photography was the development of the negative, which allowed multiple prints to be made from the original photograph—impossible with other contemporary photographic technologies. Meanwhile, over the next two decades a number of new technologies, competing on price, quality, and durability, were developed, including “collodion”—the one used by Lewis Carroll for his Alice photographs—and tintypes, made famous by thousands of Civil War portraits, which were of inferior quality but offered the middle class the advantage of being very cheap.
The limitations to these photographic technologies are quite obvious today as we look at surviving examples: They are dark, very fragile (the glass plates break, the images smear) and, because they required their subjects to stay very still for long periods of time, almost always stiff. That’s the main reason why there are few “action” photographs from the Crimean and Civil Wars but a whole lot of images of the battlefield dead.
The biggest breakthrough in nineteenth-century photography took place at almost exactly the moment that Herman Hollerith was designing his tabulator and Edison was working on his second-generation phonograph cylinders. The inventor-entrepreneur this time was George Eastman, a self-educated New Yorker who was one of the sanest of his breed.
Throughout the 1880s, Eastman systematically transformed almost every part of the photography industry—giving it the form it still exhibits today. He began in 1884 by applying for a series of patents for paper-based photographic film. Eastman had discovered a means to coat paper with a dry, light-sensitive gel … then had the brilliance to break from the existing plate paradigm and instead cut these sheets into long strips that could then be rolled. Next, he developed a camera that would accept these rolls, which he introduced to the world four years later. He then followed that, in 1892, with the founding of the Eastman Kodak Company (“Kodak” was one of the first made-up brand names) and set out to mass produce the world’s first consumer camera.
This Kodak “box” camera was a huge hit, putting photography into the hands of the average person for the first time. But it was merely a prelude to Eastman’s greatest success, the “Brownie” camera, introduced in 1900.
The Brownie, little more than a cardboard box with a lens in front, a button on top, and a roll of film inside, is a good candidate for the single most influential consumer product model in history. Driven by its ad message, “You Push the Button, We Do the Rest,” it sold in the tens of millions, and its descendants—built of Bakelite and armed with a flash and color film—were still the world’s most popular cameras well into the 1960s.
FLICKERING FUTURE
As with the phonograph, the influence of paper photography and simple, low-cost cameras on the story of human memory is almost incalculable. Everyday Americans, and soon populations around the world, made the act of taking “snapshots” a regular part of their daily lives. And because they didn’t have to deal with the development of the images but merely sent the Brownie to Eastman Kodak by mail and got back the printed photographs and a reloaded camera, owners took photographs by the hundreds: family portraits, vacations, work, novelty shots, public events, distant lands, famous figures. Together, all of these photographs constituted the greatest visual memory record of a culture ever accumulated to that date. Thanks to those giant archives of Brownie photographs we know more about the daily life of the average laborer at the turn of the twentieth century than we do about some of the more private monarchs living at the same time.
Needless to say, all of this made George Eastman a very wealthy man, which he immediately converted to equally impressive feats of philanthropy. He was the very model of the enlightened tycoon. In 1932, at age seventy-seven, after two years in agony from a degenerative spine disease, Eastman committed suicide. His last note perfectly captured the restless spirit of Eastman and his fellow entrepreneurs: “My work is done. Why wait?”12
The importance—and the implications—of Eastman’s new film technology were recognized almost from the moment of its introduction, and even more so in 1889, when he perfected the process on transparent celluloid film. One person who fully appreciated it was Thomas Edison. Just a year earlier, Edison had attended a lecture by Eadweard Muybridge, the master of stop-motion photography famed for the motion studies he had made for Leland Stanford, and had been entranced by Muybridge’s “zoopraxiscope,” a disk with consecutive stop-motion images around its perimeter that, when spun, created the sensation of movement. Two days later, Edison met with Muybridge to discuss the technology, and the latter proposed a joint project that would combine the zoopraxiscope with the Edison phonograph. Edison declined.
But six months later, Edison, back to his old competitive self, notified the Patent Office of his plans to build a device to do “for the Eye what the phonograph does for the Ear.” In his next caveat to the Patent Office, Edison gave this device a name: the kinetoscope. He then turned to one of his most talented assistants (and official company photographer), William Dickson, and gave him the assignment to start building a kinetoscope design that used tiny (1/32-inch wide) photographs developed directly on the surface of a rotating drum … and then to synchronize the rotation of this drum with the cranking of an on-board phonograph. Meanwhile, Edison took off for France, ostensibly to attend the 1889 Exposition Universelle in Paris, where the company had an exhibit, but also to investigate the current state-of-the-art technology in Europe.
During his two-month stay, Edison visited the physiologist-inventor Étienne-Jules Marey, who had invented the first motion-picture camera—a device that resembled a machine gun with a drum magazine—with which Marey exposed multiple images of the same creature on a single strip of flexible film. Edison also saw several other evocative new technologies, including Ottomar Anschütz’s “electrical tachyscope,” which exploited the eye’s persistence of vision with flickering light to create the effect of motion, and Charles-Émile Reynaud’s “Théâtre Optique,” which used perforations and a toothed wheel to pull hand-painted film cels through a projector.
Meanwhile, back at Edison Labs, Dickson and his assistants had abandoned the photographs developed directly on the drum’s surface as too crude in their imagery and instead wrapped strips of flexible film bearing consecutive images around the drum. In the process, they created the first-known moving picture made on film in the United States: Monkeyshines No. 1 (1889). It lasted only a few seconds and featured a silly demonstration of physical dexterity by a fellow employee … but the door was now open.
When Edison returned from Europe, he quickly filed a new caveat with the Patent Office and then gave Dickson a new set of marching orders. Now the kinetoscope would abandon the drum entirely in favor of a closed loop of film, strung back and forth on rollers to tuck the maximum length inside the cabinet and pulled past a viewing lens at a steady rate—fast enough, frame after frame, to create in the eye and brain the sensation of continuous, smooth movement.
The design was sound, but Dickson and his crew were still having trouble with the film medium itself. They were reduced to hand-cutting sheets of brittle and stiff celluloid and gluing them together end to end. It was a wholly unsatisfactory solution. Then, in August 1889, while Edison was still in Europe, Dickson happened to attend a presentation by George Eastman of his new flexible, rolled photographic film—and knew he had his answer.
Then, remarkably, the whole project was put on hold as Edison pursued an iron-ore mining venture. It was 1891 before Edison and the team, now led by William Heise, set out to finish the design. The result, built within a large cabinet, was a loop of film strung between multiple rollers and passed at the top under a viewing lens. The film was backlit by a lamp, with a slotted wheel acting as a shutter so that light flashed through each frame for only a fraction of a second—enough to trick the eye. The crucial breakthrough proved to be an escapement mechanism, derived from watchmaking, which clicked the film forward in a distinctive stop-and-go motion every 1/46 of a second (in other words, forty-six frames per second) and would dominate motion-picture camera design for generations.
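To make the escapement’s numbers concrete, here is a back-of-the-envelope sketch relating loop length, frame spacing, and frame rate to viewing time. The roughly fifty-foot loop and three-quarter-inch frame pitch are assumed figures for illustration only, not ones given here, but they show how run time follows from the mechanism—and why changing the film speed changes how long a loop plays.

```python
# Illustrative arithmetic only: the ~50-foot loop and 0.75-inch frame pitch
# are assumptions for the sake of the example, not figures from the text.

def run_time_seconds(loop_feet: float, frame_pitch_inches: float,
                     frames_per_second: float) -> float:
    """Seconds of viewing time for a film loop advanced one frame at a time."""
    frames = (loop_feet * 12) / frame_pitch_inches   # total frames on the loop
    return frames / frames_per_second                # time to show them all

# At the escapement's 46 frames per second...
print(run_time_seconds(50, 0.75, 46))   # ~17 seconds
# ...and at a slower film speed, the same loop runs longer:
print(run_time_seconds(50, 0.75, 30))   # ~27 seconds
```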
The first public demonstration of the kinetoscope took place on May 20, 1891, at Edison’s laboratory for a group of 150 members of the National Federation of Women’s Clubs. Wrote the New York Sun:
In the top of the box was a hole perhaps an inch in diameter. As they looked through the hole they saw the picture of a man. It was a most marvelous picture. It bowed and smiled and waved its hands and took off its hat with the most perfect naturalness and grace. Every motion was perfect.…13
Edison spent the next five years rolling out the kinetoscope in galleries across the United States, refining the design by adjusting the film speed to stretch the total run time to about thirty seconds, setting up the Black Maria film studio to produce more content (most notably Fred Ott’s Sneeze, the first copyrighted motion picture, and The Great Train Robbery, cinema’s first classic), and even experimenting with sound. The biggest unveiling took place at the Chicago World’s Fair, where the unexpected new miracle from the Wizard of Menlo Park swept away the crowd (as did the filmed belly dance of Little Egypt). Soon there were kinetoscope parlors from coast to coast—and a number of pirate companies creating and selling their own films for the machines.
During all of this, Edison went back to his original idea of adding sound to his kinetoscopes, calling them kinetophones. But his solution, to put a phonograph in the bottom of the case playing nonsynchronized sound, met with limited interest and was abandoned.
LAST GASP
In 1894 the kinetoscope business exploded in the United States, thanks to the number of installed units reaching critical mass; the creation of a new genre of films, prizefights; and the scandal (and arrests) surrounding another film—this one of a sexy dance by the music-hall performer Carmencita.
That same year, Edison opened kinetoscope parlors in Paris and London. In a move that is still debated, he chose not to file European patents beforehand.14 One theory is that the notoriously cheap Edison refused to pay the $150 filing fee. Another is that he had violated so many European patents in designing the kinetoscope that he had no hope of ever obtaining such a patent. Either way, it was a dangerous move, and in time Edison would pay heavily for it. Not only did various entrepreneurs across Europe quickly steal his design (and revenues) but, perhaps worse, smart inventors “reverse-engineered” the kinetoscope, saw its weaknesses, and rushed to make competitive improvements.
Of the latter, the most important were the Lumière brothers, Auguste and Louis. Before making their place in history by filming (and thereby permanently preserving) everyday life in turn-of-the-century France, they transformed the motion-picture experience by building projectors—their Cinématographe—that showed films on screens in front of audiences.
By 1910 the Edison Company was already an also-ran in the movie business. It continued to produce innovations, such as a truly synchronized kinetophone in which the (now projecting) kinetoscope was linked to the phonograph via a belt. But it was hard to maintain in country theaters and didn’t sell well. Another interesting attempt—a home kinetoscope—failed as well, largely because it was so far ahead of its time. Finally, a fire at Edison in December 1914 put the company out of the movie business forever.
But by then, Thomas Edison had made his final great contribution to human memory. His early films represented the first time that the world as it was actually lived, in full motion, was captured and preserved forever. A century later, as we look at these early films, as well as those of the Lumières and others, we often find ourselves staring past the foreground performance to get a glimpse of a larger world now lost forever and to see ourselves in people long since departed.
Thomas Edison died on October 18, 1931. The world mourned. Henry Ford asked Edison’s son Charles to capture and seal a test tube containing Edison’s last breaths. Two years earlier, on the fiftieth anniversary of the incandescent lightbulb, the world shut off its lights to remind itself of what life had once been like. But when the lights went back on and the populations turned to their record players or went to see a movie, they equally honored Edison, for he had given them—and us—the memories of our present into the endless future.
SOUND AND FURY
Thomas Edison may have failed to effectively synchronize sound with film, but there was another genius-inventor waiting in the wings: the ever-mercurial Lee De Forest.
Having transformed music recording, sound reproduction, and eventually radio and telegraphy with his Audion tubes, De Forest now set about bringing sound to silent films. He was convinced that there must be a way to put the audio directly on the film, rather than relying on the primitive technique of dropping a needle on a record timed to the start of a motion picture.
This was 1918, and audiences were still getting accustomed to the thrill of watching multireel movie extravaganzas such as Birth of a Nation and Intolerance, in which a whole vocabulary of acting and filming had adapted to the lack of a soundtrack. These audiences hadn’t evinced much interest in sound movies. But De Forest was undeterred. And in 1919, working from basic research by the Finnish inventor Eric Tigerstedt, De Forest patented the first sound-on-film technology, which he called “Phonofilm.”
Phonofilm in action was simplicity itself, and a testament to De Forest’s brilliance. Rather than try to meld the phonograph and film, De Forest instead decided to “film” sound—that is, he used a narrow strip along the edge of the roll of film to capture an image of the sound waves picked up by microphones during filming. The sound waves were recorded as narrow lines of varying shades of gray. Then, as the film ran through the projector, a light shining through that strip onto a photoelectric cell converted those millions of parallel lines back into an electrical signal, which was amplified and played over loudspeakers.
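The principle is easy to mimic. The sketch below is a toy model only: audio amplitude is stored as shades of gray along a strip and then read back, as a variable-density soundtrack does. The 8-bit gray scale and sample rate are arbitrary choices for the demonstration, not De Forest’s specifications.

```python
import numpy as np

# Toy model of a variable-density optical soundtrack: amplitude becomes
# shades of gray along the edge of the "film," then a photocell reading
# recovers the waveform. Gray scale and sample rate are arbitrary here.

rate = 8000
t = np.linspace(0, 0.05, int(rate * 0.05), endpoint=False)
audio = 0.6 * np.sin(2 * np.pi * 440 * t)            # a 440 Hz test tone

# "Filming" the sound: map amplitude [-1, 1] to gray levels 0..255.
soundtrack = np.round((audio + 1.0) / 2.0 * 255).astype(np.uint8)

# Playback: reading the gray levels back recovers the waveform.
recovered = soundtrack.astype(float) / 255.0 * 2.0 - 1.0

print(np.max(np.abs(audio - recovered)))   # only quantization error, < 0.004
```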
De Forest premiered Phonofilm in 1922 with a collection of short movies of speeches, stage performances, and musical acts … and waited for the movie industry to shower him with money. Instead, the movie studios ignored him. The already-paranoid De Forest assumed—this time perhaps correctly—that there was a conspiracy against him. A more likely explanation was that the studios had enough on their plates just getting films out with the existing silent technology and understood better than De Forest the industrywide chaos that sound would (and did) create. Besides, they probably didn’t want to pay De Forest’s licensing fees and planned to hold him at arm’s length until they could come up with a similar technology of their own.
De Forest refused to give up, and began producing short sound films of his own. He even convinced the Fleischer brothers to create a series of “follow the bouncing ball” sing-along cartoons that were still being shown on television into the 1960s. But the irascible De Forest, who tried to take credit for everything, eventually drove away his business partners, who turned around and sold his patents to Fox Films.
The year 1927 saw the birth of the “talkies”—many of them using Phonofilm audio technology. But De Forest wasn’t there to celebrate. Having been married and divorced three times already (once to a famous suffragette), De Forest married and settled down with a movie ingénue from the Hal Roach Studios, Marie Mosquini, and proceeded to publicly decry the debased uses of his invention—most famously in an open letter that announced: “What have you done with my child, the radio broadcast? You have debased this child, dressed him in rags of ragtime, tatters of jive and boogie-woogie.”
He lived long enough to earn an Academy Award for his work, and a star on the Hollywood Walk of Fame.
TALE OF THE TAPE
Alexander Poniatoff always thought big, even when he was thinking small.
Born in tsarist Russia at about the same moment that Thomas Edison was showing off the kinetoscope in the United States, Poniatoff dreamed of growing up to design and build great locomotives. His father was a wealthy lumberman in Kazan, and so when young Alexi came of age, he was sent off to a technical academy in Karlsruhe, Germany, to study engineering.15 By this stage his dream was to return to Mother Russia and build a great turbine factory.
But it wasn’t meant to be. Alexi returned to a Russia mobilizing for war. Then came the revolution … and then the civil war. A White, Poniatoff joined the army and trained to become a pilot. But as defeats mounted and the Bolshevik Reds consolidated control, Alexi realized that his cause was doomed. He deserted and made his way to China, where he found work at the Shanghai Power Company.
Needless to say, China wasn’t the best place to be in the 1920s as it slid into its own revolution. So, in 1927, Poniatoff emigrated again, this time to the United States. By now one of the few experienced electrical engineers in the country, Poniatoff found himself recruited by a number of companies, including Pacific Gas & Electric, Dalmo-Victor, and Edison’s old company, the now-giant General Electric.
Poniatoff’s skills were in even more demand when World War II erupted, and he spent most of the war years designing motors and generators for airborne radar systems. This classified work, combined with Poniatoff’s highly respected expertise, enabled him to be one of the first scientists to get a glimpse of the secret technological spoils emerging out of the fallen Third Reich. What he saw made Alexander Poniatoff quickly abandon all of his own dreams in pursuit of a new one.
Magnetic recording, though it seemed revolutionary to the postwar public, had in fact been around for a very long time. In 1888, Oberlin Smith, an American machinist, devised a system for attaching steel dust to a fine thread. He then pulled the thread past a magnet that was in turn attached to a microphone. This process magnetized spots on the thread in relation to the transmitted sound signal. But Smith left it at that, published his results, and went back to making machine tools.
A decade later, the Danish engineer Valdemar Poulsen picked up on Smith’s theories and ran with them. He used wire this time (which Smith had thought couldn’t be done), wrapped it around a drum, and improved the recording/playback magnet (the “head”). He called the resulting design a “telegraphone.” Because there was no amplification, the recordings were weak, but still audible through headphones. Poulsen showed his telegraphone at the 1900 World’s Fair in Paris, and the recording he made there of Emperor Franz Joseph of Austria survives as the world’s oldest magnetic sound recording.16
Amplifying the signal of a wire recorder was a lot more difficult than it seemed. Merely boosting the power of the signal resulted in considerable loss of the lower frequencies and a lot of distortion at higher frequencies. Moreover, adding this power, especially with direct current, also overly magnetized the head, creating further problems. Finally, the wire medium, reduced to the thickness of a human hair to get the mile-plus length on the spool to create an hour-long recording, had a tendency to twist and tangle during rewind and editing. Still, it was better than the alternative of having to work with carved master disks; to make a splice with wire the editor merely held his cigarette to the two ends and welded them together.
The problem of amplification was eventually solved through a process called “bias”: a controlled high-frequency AC signal, mixed with the audio at the record head, shifted the recording out of the tape’s distortion-prone region and into its more linear range (a related AC signal, applied ahead of the record head, also wiped away any existing magnetism), so that the playback signal could finally be boosted cleanly.17
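A toy simulation shows why bias matters. The magnetization curve below is a crude stand-in for real tape behavior, and every constant in it is invented for illustration; the point is only that a weak signal recorded directly falls into the “dead zone” and is lost, while the same signal riding on a high-frequency bias tone comes back cleanly after a low-pass playback filter.

```python
import numpy as np

# Toy demonstration of AC bias. The "tape" has a dead zone around zero (a
# crude stand-in for hysteresis): a weak signal recorded directly is almost
# entirely lost, but the same signal riding on a high-frequency bias tone
# lands in the responsive part of the curve and survives playback filtering.
# All constants are invented for illustration, not period engineering values.

fs = 200_000                                   # samples per second
t = np.arange(0, 0.02, 1 / fs)
audio = 0.1 * np.sin(2 * np.pi * 400 * t)      # a weak 400 Hz signal
bias = 1.0 * np.sin(2 * np.pi * 20_000 * t)    # inaudible high-frequency bias

def tape(h):
    """Toy magnetization curve: dead zone near zero, saturation far from it."""
    return np.tanh(8 * (h - 0.6)) + np.tanh(8 * (h + 0.6))

def playback(m, period_samples):
    """Moving-average low-pass filter, one bias period long, strips the bias."""
    kernel = np.ones(period_samples) / period_samples
    return np.convolve(m, kernel, mode="same")

period = fs // 20_000                          # samples in one bias cycle
without_bias = playback(tape(audio), period)
with_bias = playback(tape(audio + bias), period)

# The unbiased recording barely registers; the biased one tracks the audio.
print("peak without bias:", round(np.max(np.abs(without_bias)), 3))
print("peak with bias:   ", round(np.max(np.abs(with_bias)), 3))
print("correlation with original:",
      round(np.corrcoef(audio, with_bias)[0, 1], 3))
```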
By the late 1920s, magnetic wire recording had been improved to the point where it could be used for office dictation and telephone recording, and a number of companies sprang up to pursue those opportunities. But wire recording remained generally unknown to the public, which might have been surprised to learn that several popular radio shows, among them Edward R. Murrow’s Hear It Now on CBS, were recorded and edited on wire.18
In World War II, both sides made heavy use of wire-recording technology. One of the most interesting applications was that of the “Ghost Army,” in which the U.S. Army Signal Corps took wire recorders to the front lines to play military sounds in order to confuse the enemy. Immediately after the war, psychology professor David Boder rushed to Europe with a wire recorder and conducted numerous interviews with Holocaust survivors—among the most historically important recorded memories of the twentieth century.
As the war neared its end, rumors began to circulate that the Germans had developed a major improvement on wire recording. And indeed they had. In the late 1920s, researchers had begun to look at ways of improving on the wire medium in order to capture more of the signal coming from the source. The obvious solution was to make the wire “wider”—that is, to turn it into a metal strip. Just such a steel tape recorder was first used by the BBC in 1932. But the technology proved not only unwieldy—a half-hour program required nearly 3 kilometers of tape racing past the read-write head at 1.5 meters per second—but hugely dangerous: If the spring-steel, razor-sharp tape were to snap, it would thrash around the studio, slashing everything in its path.
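The arithmetic explains why steel tape was a dead end. A quick check, using the tape speed cited above (the two program lengths are added purely for comparison):

```python
# Back-of-the-envelope check on the steel-tape figures.
speed_m_per_s = 1.5
for minutes in (30, 60):
    metres = speed_m_per_s * minutes * 60
    print(f"{minutes}-minute program: {metres / 1000:.1f} km of steel tape")
# 30 minutes already consumes 2.7 km; a full hour would need 5.4 km.
```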
But as war clouds gathered, a group of scientists at the IG Farben chemical company subsidiary BASF, working with the Third Reich’s official propaganda radio network, revisited magnetic-tape recording and came up with the first practical solution: plastic tape, coated with iron oxide and run through a ring-shaped head (less destructive than the traditional needle head) and amplified using AC bias. The results were stunning—and not lost on the Allies, who raced to figure out how the Nazis were able to repeat broadcasts to different time zones.
When the Allied scientists, including Poniatoff, finally got a chance to see the German tape recorders, they knew they were seeing the future … and it wasn’t long before companies in the United States and Western Europe initiated their own magnetic tape-recorder development programs.
One of these competitors was a brand-new company founded by Alexander Poniatoff. He called it Ampex, after his own three initials, followed by “EX-cellence.”
What Poniatoff, with his peerless experience in electrical engineering, saw about these recorders that was missed by many of his competitors was that the fundamental challenge was not in making the technology work better, but in making the practical recording time longer. That seemed impossible: The tape was already flying past the recording head, and it couldn’t be made much thinner, so the only solution seemed to be bigger and bigger reels.
But Poniatoff had a better idea: Instead of making the tape go faster, why not slow it down to add recording time … and make up the difference by spinning the recording head instead?
Poniatoff wasn’t the only entrepreneur thinking outside of the box when it came to recording technology. Major Jack Mullin was in the U.S. Army Signal Corps, assigned to find out everything he could about German electronics and radio technology. By chance, in 1945, just as he was heading home to California, he stopped to inspect a newly captured radio station in Bad Nauheim, near Frankfurt. There, he found two suitcase-sized “Magnetophon” tape recorders and fifty reels of Farben tape. He shipped them home to spend some time working with them.
In 1946, after a demonstration for engineers in San Francisco of his improved Magnetophons that met with an enthusiastic response, Mullin decided it was time to pitch his recording technology to Hollywood. His timing was impeccable.
The biggest recording star in the world in those days was Bing Crosby. Crosby, who preferred the casual intimacy of the recording studio to the stopwatch world of live radio, had been fighting his network, NBC, for the right to record his radio show. Citing poor recorded sound quality, the network had refused, and a small war of nerves had erupted—to the point where Crosby had even briefly quit radio in 1946. So, when Mullin demonstrated his Magnetophon at MGM one afternoon, Crosby’s technical director Murdo MacKenzie knew he had heard the answer. He quickly arranged for a meeting between Mullin and his boss.
Crosby, too, was impressed with Mullin and his machine—and quickly wrote out a $50,000 check to cover the purchase of the machine and to make an investment in its manufacturer. Mullin, however, didn’t have a company—but he knew who did: Alexander Poniatoff and his six-man Ampex, where Mullin was a consultant. Poniatoff, who had just completed his own professional tape recorder design, the Model 200, filled the order. Crosby and his team went on to use the editing features of the Ampex 200, and 3M Company’s new acetate magnetic tape, to revolutionize radio broadcasting (including the notorious laugh track), Mullin got very rich, and Ampex became the fastest-growing company in business history—a pace that wouldn’t be equaled until the dot-com boom of the 1990s.
In little more than a decade, Ampex grew to thirteen thousand employees and utter dominance of the audio-recording industry. Guitarist Les Paul used an early Ampex recorder to edit together multiple recordings into one—the beginning of multitrack recording. Elvis Presley would make his first recordings at Sun Studios on an Ampex reel-to-reel. Elizabeth Taylor’s husband, Mike Todd, worked with Ampex to place a magnetic strip on film to carry a much higher level of audio quality in movies. And having captured almost the entire professional recording world, Ampex began building consumer-grade tape recorders in the late 1950s, capturing that market as well.
By the mid-1950s, Bing Crosby was experimenting with ways to record video signals on tape as well. Once again, Poniatoff took this idea and ran with it, assigning a team that included nineteen-year-old future sound wizard Ray Dolby to build it. The team came up with a design that ran two-inch-wide tape at fifteen inches per second across four heads spinning at almost 15,000 rpm. The first videotaped network television broadcast—CBS’s evening newscast, Douglas Edwards with the News—aired on November 30, 1956. Within thirty years, 100 million home videocassette players would be in use around the world, showing billions of professional films and homemade videos created on a new generation of handheld video cameras. The Kennedy assassination would be captured on videotape. So would man’s first step on the moon. Video memory, the defining medium of artificial memory in our time, was born.
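The numbers make plain why Poniatoff’s rotating-head gamble paid off. In the sketch below, the tape speed and rotation rate come from the figures above, while the roughly two-inch head-wheel diameter is an assumed value for illustration; even so, the effective head-to-tape “writing” speed dwarfs the speed of the tape itself.

```python
import math

# Why spinning the heads works: the tape can crawl along while the head sweeps
# across it at enormous speed, so the effective writing speed is set by the
# drum, not the reels. The ~2-inch head-wheel diameter is an assumed figure.

linear_speed_ips = 15            # inches per second of tape travel
drum_diameter_in = 2.0           # assumed head-wheel diameter
drum_rpm = 15_000

writing_speed_ips = math.pi * drum_diameter_in * (drum_rpm / 60)
print(f"linear tape speed : {linear_speed_ips} in/s")
print(f"writing speed     : {writing_speed_ips:,.0f} in/s "
      f"(about {writing_speed_ips / linear_speed_ips:.0f}x faster)")
```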
ERASURE
There remained one last great market for magnetic tape. Computers, the descendants of Herman Hollerith’s tabulator, had evolved slowly during the early years of the twentieth century, serving as little more than sophisticated calculators. But in the 1930s, once again driven by the oncoming war and the need for powerful tools for everything from encryption/decryption of codes to the computation of artillery trajectories, computer technology developed at a rapid pace throughout the world.
In the UK, the great mathematician Alan Turing helped create a series of increasingly powerful (and eventually tube-driven) code-breaking machines. In Germany, Konrad Zuse used electromechanical relays in his Z series—the first program-controlled digital computers—to race through engineering calculations. And it was an American, Claude Shannon, who showed how the two technologies—Boolean logic and electrical relays—could be combined, laying the groundwork for the modern computer. It would be realized in Harvard’s Mark I and, just after the war, the ENIAC.
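Shannon’s insight is simple enough to sketch. Treat each relay as a switch that is either open (0) or closed (1): series wiring behaves as AND, parallel wiring as OR, and from those primitives arithmetic circuits follow, as in this illustrative one-bit half adder (a teaching example, not any historical machine’s actual circuit).

```python
# Relays as Boolean switches: series = AND, parallel = OR, a normally-closed
# contact = NOT. From these, a one-bit half adder follows.

def AND(a: int, b: int) -> int:      # two relays in series
    return a & b

def OR(a: int, b: int) -> int:       # two relays in parallel
    return a | b

def NOT(a: int) -> int:              # a normally-closed relay contact
    return 1 - a

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Adds two one-bit numbers, returning (sum, carry)."""
    s = OR(AND(a, NOT(b)), AND(NOT(a), b))   # XOR built from AND/OR/NOT
    carry = AND(a, b)
    return s, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
```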
The fast-moving, fast-growing, information-driven postwar era was just made for the computer, which in turn made that world possible. Many companies fought for the military and then commercial computer market, but one company emerged on top: IBM, the company Thomas Watson Sr. had built from Hollerith’s struggling start-up.
These early “mainframe” computers were the size of a small building and glacially slow by modern standards, but they were still fast enough to quickly outstrip the devices designed to feed data into them and retrieve the processed information they produced. Indeed, the first great postwar mainframes depended almost entirely on two nineteenth-century technologies: Hollerith’s punched cards (and tapes) and Edison’s alphanumeric printer.
It was the beginning of the still-ongoing race by artificial memory to keep up with the ever-faster demands of the digital world. This time the answer came from the typewriter company Remington Rand with its UNIVAC computer line. In 1951 it licensed existing magnetic tape recording technology, adapted it for digital signals, and introduced UNISERVO—the first computer magnetic tape memory system. A year later, IBM introduced a seven-track tape memory and quickly ran away with the market by coming out with a series of milestone tape drives over the next decade (and leveraging the power of its leadership in mainframe computers). By the 1960s, the room full of tape memory machines, each with its spinning spools, had become synonymous with the computer itself.
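The layout of such a tape is easy to illustrate. In the sketch below, each character travels down the tape as a frame of six data bits plus a parity bit for error detection; the particular codes and the choice of odd parity are illustrative assumptions, not IBM’s actual format.

```python
# A toy "seven-track" tape frame: six data bits plus a parity bit so that a
# dropped or flipped bit can be detected. Codes and odd parity are
# illustrative choices, not IBM's actual character set.

def to_frame(value: int) -> list[int]:
    """Pack a 6-bit value into a 7-bit frame: parity bit, then data bits."""
    assert 0 <= value < 64, "only six data bits per frame"
    bits = [(value >> i) & 1 for i in range(5, -1, -1)]
    parity = 1 - (sum(bits) % 2)          # make total bit count odd
    return [parity] + bits

def check_frame(frame: list[int]) -> bool:
    """A frame is valid when its seven bits have odd parity."""
    return sum(frame) % 2 == 1

frame = to_frame(0b101101)
print(frame, check_frame(frame))          # valid as written
frame[3] ^= 1                             # simulate a dropped/flipped bit
print(frame, check_frame(frame))          # parity check now fails
```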
In time, magnetic tape memory was supplanted by other, more powerful digital memory technologies. Nevertheless, though rarely noticed, magnetic tape technology remains the world’s most commonly used memory medium—in the form of the magnetic stripes on the back of credit cards. This format was invented in 1960 by IBM engineer Forrest Parry and proved to be an immediate success. It is estimated that in 2010 80 percent of the world’s population used magnetic stripe technology in some form—and that these cards were swiped through readers 50 billion times per year.19
As for Ampex, the company seemed to grow old quickly, as if its health were tied to the aging Alexander Poniatoff. Poniatoff had been a middle-aged man when he founded Ampex, and in 1955, when he became chairman, he was sixty-three. Almost as if he were bored with his success, he began to take the company into ever riskier new ventures—notably a complete content-production operation on which the company lost a fortune. And he was cavalier about the company’s assets: Contemptuous of Japan’s cheap manufacturing, he casually licensed Ampex’s recording technology to Japanese electronics companies … and then watched as those firms captured the entire consumer electronics industry and made their country rich.
When Ampex tried to fight back, it found that its long-standing philosophy, in the words of a former executive, that it “only knew how to do things well and costly,” no longer worked. Its new home-video recorder was the best in the world … but also oversized and overpriced—and doomed.
So was Ampex. But Alexander Poniatoff barely seemed to notice. He was older than the century, the last tsarist—“A real eighteenth-century man,” said one employee—and showed it with increasing eccentricity:
In the 1970s he became a health nut. He ate only unprocessed foods, drank carrot juice and had one of the first air ionization systems installed in his office. He began backing medical groups studying longevity and the effect of color on behavior. He drove only white cars and took to wearing a baseball cap. Mrs. Poniatoff held her Horticultural Society meetings in the Ampex cafeteria.20
There were even rumors of séances. If so, they must have predicted a bleak future. After Poniatoff died in 1980, at age eighty-eight, Ampex prudently stripped itself of all of these extraneous ventures and went back to its core business of selling professional recording equipment. But it was too late.
Today, other than a skeleton team managing the company’s once-great intellectual property assets, all that remains of Ampex, the company that did more than any other to capture the memory of the sound and look of this world (and others), is the huge old company sign, now a historic landmark, that towers over the Bayshore Freeway in Silicon Valley. It stands in memory not only of one of the most remarkable companies ever but of the golden age of entrepreneur-inventors.