Chapter 8
Franklin’s Baby: Electricity, Automobiles, and the Second Industrial Revolution

The greatest invention of the nineteenth century was the invention of the method of invention.

—Alfred North Whitehead1

On a humid afternoon in Philadelphia in September 1752, Benjamin Franklin and his adult son William flew a kite made of silk, with a foot-long piece of wire sticking out of the frame. At the base was a string of hemp, a conductor, and a nonconducting ribbon of silk, with a metal key at their intersection. As rain fell, the Franklins took shelter in a barn. What happened next was described by British correspondent and colleague Joseph Priestley in 1767: “When, at length, just as he was beginning to despair of his contrivance, he observed some loose threads of the hempen string to stand erect, and to avoid one another . . . he immediately presented his knuckle to the key, and (let the reader judge of the exquisite pleasure he must have felt at that moment) the discovery was complete. He perceived a very evident electric spark.”2

If lightning had actually struck the kite, Franklin might have been killed. As it was, he had been knocked out during an earlier attempt to electrocute a turkey.3 In 1753, a Swedish scientist in Saint Petersburg, Russia, was killed by lightning when he repeated Franklin’s experiment.

Franklin built on the work of many predecessors. Otto von Guericke in 1670 devised a mechanism capable of producing an electric charge, and a generation later Charles-François de Cisternay du Fay demonstrated the difference between negative and positive charges. In 1745, the Leyden jar, an early version of the condenser, was invented.

More breakthroughs followed Franklin’s successful experiment, thanks to Priestley, who in 1766 inferred that the force between charges varies inversely with the square of the distance between them, and Charles-Augustin de Coulomb, who invented a torsion balance to measure electric charges and confirmed the inverse-square law in 1785. After Hans Christian Oersted proved in 1820 that electrical currents generate magnetic fields, Michael Faraday demonstrated that an electric current in one wire could induce a current in another. James Clerk Maxwell, building on Faraday’s work, set out the formal mathematics of electromagnetism in 1873 in his Treatise on Electricity and Magnetism.
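In modern notation (which postdates the events described here), the inverse-square relation that Priestley inferred and Coulomb confirmed can be written as:

```latex
% Coulomb's law: the force F between two point charges q_1 and q_2
% separated by a distance r; k is Coulomb's constant.
F = k\,\frac{q_1 q_2}{r^{2}}, \qquad k \approx 8.99 \times 10^{9}\ \mathrm{N\,m^{2}/C^{2}}
```

A positive F (like charges) means repulsion; a negative F (unlike charges) means attraction.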

Technology built on science. Already in 1800, Alessandro Volta had invented the first battery. Then in the 1840s, James Prescott Joule showed that a magneto could convert mechanical energy into electrical energy. In 1867, dynamos that used magnets to turn rotary motion into electrical power were invented in parallel by Werner Siemens in Germany and Charles Wheatstone in Britain.4 As early as the 1870s, Siemens adapted electromagnetic dynamos to create electric arc furnaces for smelting metals.

The American historian Henry Adams, writing about himself in the third person, described his response to the dynamo he encountered at the Paris Exposition of 1900: “As he grew accustomed to the great gallery of machines, he began to feel the forty-foot dynamos as a moral force, much as the early Christians felt the Cross. The planet itself seemed less impressive, in its old-fashioned, deliberate, annual or daily revolution, than this huge wheel, revolving within arm’s-length at some vertiginous speed, and barely murmuring,—scarcely humming an audible warning to stand a hair’s-breadth further for respect of power,—while it would not wake the baby lying close against its frame. Before the end, one began to pray to it; inherited instinct taught the natural expression of man before silent and infinite force. Among the thousand symbols of ultimate energy the dynamo was not so human as some, but it was the most expressive.”5

Adams thought that the dynamo “would not wake the baby lying close against its frame.” Earlier a baby had been used as a metaphor for electrical technology itself.

Franklin may or may not have asked, in connection with electricity, “What is the use of a newborn baby?” (Some accounts have him asking the same question after witnessing a balloon ascension in France.)6 And the great British scientist Michael Faraday may or may not have quoted Franklin in the context of electricity, although he certainly did so in the context of the discovery of new elements in an 1816 lecture: “Before leaving this substance, chlorine, I will point out its history as an answer to those who are in the habit of saying to every new fact, ‘What is its use?’ Dr. Franklin says to such, ‘What is the use of an infant?’ ”7

FROM LONE INVENTOR TO CORPORATE LABORATORY

With each succeeding wave of technological innovation, the individual inventor has become less important than teams of scientists and engineers, informed by the latest scientific advances and working in laboratories supported by financiers, corporations, and, increasingly, by governments, the only entities capable of funding basic research on a massive scale. The first industrial revolution had largely been the result of individual inventors like James Watt and Robert Fulton, relying on trial and error. In the words of the historian Joel Mokyr, “It created a chemical industry with no chemistry, an iron industry without metallurgy, power machinery without thermodynamics.”8 Increasingly, invention depended on science. European inventors like Gottlieb Daimler and Karl Benz, who created the automobile with its gasoline-fueled internal combustion engine, enjoyed the benefits of state-sponsored science in Germany, the country which, along with France, initially led automobile development before Americans like Henry Ford naturalized the technology and exploited the economies of scale provided by America’s enormous market.

The industrialization of research and development (R&D) and invention became a defining feature of twentieth-century industrial capitalism. In 1901, General Electric established the first corporate research laboratory in the United States. Other corporations that established their own research facilities included DuPont (1902) and Eastman Kodak (1913). The most famous was Bell Labs, established by the American Telephone and Telegraph Company (AT&T) in 1925. Between 1920 and the 1980s, the number of employees of corporate research labs rose from six thousand to nearly a million.9

Even more important in the long run was the contribution of the federal government to innovation. Beginning with the Morrill land-grant college acts in the late nineteenth century, the federal government created a sophisticated technological innovation system in agriculture, in which federally funded regional and state laboratories worked on problems and disseminated solutions by way of county extension services to American farmers. Attempts to construct an equivalent system for American manufacturing repeatedly failed. But a large-scale federal role in research and invention became permanent during the third industrial revolution during and after World War II, with the establishment of the National Science Foundation and the National Institutes of Health and military-related contracts to industrial firms and academic researchers who developed the first few generations of computers and created the Internet.

THOMAS EDISON AND THE RISE OF R&D

Thomas Edison provides an example. Dirty, disheveled, and dressed in workman’s clothes in many of the photographs he posed for, Edison played the archetype of the folksy American genius as successfully as Franklin had played the role of backwoods sage in a coonskin cap. When Nikola Tesla, a brilliant Serbian engineer who had worked in Paris, successfully applied for a job with Edison in the United States, the dapper European recalled: “I was thrilled to the marrow by meeting Edison, who began my American education right then and there. I wanted to have my shoes shined, something I considered beneath my dignity. Edison said, ‘You will shine the shoes yourself and like it.’ He impressed me tremendously. I shined the shoes myself and liked it.”10

In reality, Edison was less an individual than the director and public face of an institution. Born in Ohio, Edison was raised in Michigan, where he struggled to make money in his teen years by selling newspapers and candy on trains. When he saved a three-year-old boy from being hit by a train, the boy’s grateful father, a station agent, taught Edison to be a telegraph operator. Telegraph technology inspired Edison’s early inventions, including a stock ticker and a vote counter.

While Edison was a brilliant inventor himself, most of the products for which he is given credit, from the incandescent lightbulb to the phonograph and motion picture technology, were the work of the engineers he organized into teams in a succession of laboratories. No lone genius, he was the director of his own research lab and was funded throughout his career by corporations like Western Union and by financiers like Jay Cooke and J. P. Morgan.

He established a small workshop in Newark and then, in 1876, created a large laboratory in Menlo Park, New Jersey. Menlo Park consisted of a six-building complex on thirty-four acres that Edison had purchased in 1875 because of its combination of rural quiet and proximity to New York investors. His team at Menlo Park patented around four hundred inventions, including the lightbulb and the phonograph, before Edison moved in 1887 to a larger laboratory in West Orange, New Jersey.

Called the “Wizard of Menlo Park” by a journalist, Edison drew on the talents of a research team with members from many countries, including his English chief mechanic, Charles Batchelor, his German glassblower, Ludwig Boehm, and his Swiss clockmaker, John Kruesi. While the mathematicians, machinists, and carpenters on his team of two dozen worked on new devices, the business side of Edison’s “invention factory” was handled by bookkeepers, secretaries, draftsmen who created drawings for the patent office, and his lawyer and agent, Grosvenor Lowrey. His research staff included a number of individuals who went on to illustrious careers, among them Sigmund Schuckert, who would found the German electrical firm that later became part of Siemens-Schuckert, and Kruesi, who became chief engineer at General Electric.11

SUBDIVIDING ELECTRICITY

Everyone knows that Thomas Edison invented the lightbulb, and he did—almost.

Others had experimented with incandescent lighting. The first practical form of electric illumination was the arc lamp, pioneered by a Russian, Paul N. Jablochkoff, and improved by an American, Charles F. Brush. In the 1870s and 1880s, arc lighting began to replace gas lights in public places like streets and railway stations, but it was too harsh and glaring for residences and offices.

As early as 1859, Moses Farmer lit his home in Salem, Massachusetts, with glowing platinum wires. Joseph Swan, a British scientist, patented the first incandescent lightbulb in 1878. After making his home the first to be lit by electric bulbs, Swan turned a lecture hall in Newcastle into the first public building to be lit by bulbs in 1880 and then, in 1881, made the Savoy Theatre in London the first theater to be lit completely by electricity.

When he founded his Menlo Park laboratory in 1876, Edison focused on electricity. In 1879, after testing six thousand plant fibers, Edison’s team, like Swan, chose carbonized cotton for the filament. The bulb owed its design to the earlier work of two Canadian inventors, Henry Woodward and Matthew Evans.

Swan sued Edison for patent infringement, and the British courts forced Edison to make Swan a partner in his British company, which became Ediswan. In 1882, Swan sold his US patent rights to the Brush Electric Company. In 1883, the US Patent Office declared that Edison’s patent was invalid because it was based on the earlier designs of William Sawyer. But Edison was a master of self-promotion and to this day receives credit for the invention of the incandescent bulb.

Edison’s most important contribution was to create an entire system of electric lighting, from the generator through the wires to the incandescent bulb in the electric lamp. Edison announced that his goal was to make electricity “subdivided so that it could be brought into private homes.”

A DYNAMO NAMED JUMBO

At 3 p.m. on September 4, 1882, Thomas Edison activated a switch and the age of electricity began.

Power in the form of direct current (DC) surged from the Pearl Street Station at 255–257 Pearl Street, several blocks from the Wall Street office where Edison had thrown the switch. Complete with an early version of an electric meter to monitor usage by customers, the Pearl Street Station was the first modern electrical-power network in the United States; an earlier station designed by Edison had begun operating at Holborn Viaduct in London on January 12, 1882. A steam dynamo called “Jumbo” sent electricity through wires in tubes that at Edison’s insistence had been buried deep underground. At its inception the Pearl Street Station provided electricity to 110 customers in Lower Manhattan. One of the early customers was the New York Times, which described the light from the fifty-two incandescent lightbulbs that lit up that afternoon as “soft, mellow and graceful to the eye . . . without a particle of flicker to make the head ache.”

Just as significant as the technology that Edison introduced to the world was the place that he chose for the introduction. Edison flipped a switch and turned on hundreds of lightbulbs in the offices of Drexel, Morgan, in the presence of J. P. Morgan, the preeminent financier in the United States. Beginning in 1878, Morgan and his partners, along with the Vanderbilt family, had financed Edison, helping him incorporate the Edison Electric Illuminating Company in December 1880. In 1892, Morgan would arrange the merger of Edison Electric, based in Schenectady, New York, and Thomson-Houston Company, based in Lynn, Massachusetts, to create General Electric, one of the first twelve companies to be included in the Dow Jones Industrial Average. As his reward, Morgan got not only one of the first offices but also the first home in New York to be lit by Edison’s invention.

The alliance of Edison and Morgan symbolized a new era for the American economy. Between 1880 and 1900, electric lightbulbs became twice as efficient and only one-fifth as expensive. Fluorescent lighting, introduced commercially in the late 1930s, later became standard for offices and large buildings because of its energy efficiency.12 Along with the automobile, electricity was the most important technology of the second industrial revolution that rendered the steam-based technology of the first industrial revolution obsolete. And the entrepreneurial capitalism represented by the individualistic railroad barons of the Steam Age was rapidly giving way to the new finance capitalism symbolized by Morgan, described in the next chapter.

THE ELECTRIC GRID AND THE BATTLE OF THE SYSTEMS

When first cities and then regions began to build electric grids, a decision had to be made whether to use direct current or alternating current. On the side of DC were Thomas Edison and Edison Electric; the champions of AC were Nikola Tesla and George Westinghouse.

The “battle of the systems” was fought in multiple arenas: legislatures, courthouses, and the court of public opinion. Harold Brown, a New York engineer with a knack for publicity, was hired by Edison to lead the crusade against AC.

At Columbia University in 1888, Brown claimed to demonstrate the danger of AC by using it to electrocute a dog that he first tortured with DC, to the disgust of his audience. He had practiced for the demonstration by using dogs for which he had paid New Jersey children. Undeterred, Brown carried out other public electrocutions of dogs as well as calves and a horse. When George Westinghouse scoffed at these spectacles, Brown publicly challenged him to an electrical duel: “I challenge Mr. Westinghouse to meet me in the presence of competent electrical experts and take through his body the alternating current while I take through mine a continuous current.”13

When the state of New York, with Edison’s advice, carried out the first execution in an electric chair in 1890, the condemned man, William Kemmler, did not die at first, but squirmed in agony; successive doses of electricity killed him only by cooking him, to the horror of the witnesses. Edison tried to promote “Westinghousing” as a synonym for “electrocution.”

The war was won by AC. After J. P. Morgan eased Edison out, Edison General Electric merged with Thomson-Houston to form General Electric, which cooperated with Westinghouse to build a Niagara Falls power station that supplied electricity to a local industrial complex and to the city of Buffalo, New York. The electric utility industry adopted a synthesis of DC and AC that remains standard today.

The battle of the currents had a gruesome coda. In 1903, her owners decided to put down Topsy, an elephant at a Coney Island amusement park who had killed three people, including a trainer who tried to force a burning cigarette into her mouth. After it was decided that hanging an elephant was impractical, the owners settled on electrocution, and on January 4, 1903, Topsy, festooned with electrodes, collapsed in front of a crowd as more than six thousand volts of electricity coursed through her massive body. Edison distributed a motion picture entitled Electrocuting an Elephant.

THE ELECTRIC MOTOR

Electricity transformed industrial production by permitting the factory to be located far from the ultimate source of power. Equally important was the adoption by industry of the electric motor.

In 1821, the British scientist and inventor Michael Faraday demonstrated that an electric current could be converted into continuous rotary motion, in effect inventing the electric motor; in 1831, his discovery of electromagnetic induction provided the principle of the dynamo. Edison made small DC motors in the 1880s, but the industry was soon dominated by the three-phase (polyphase) AC motors that Westinghouse began to sell in 1892.

Electric motors transformed industrial production. In steam-powered factories, steam engines transmitted their motion to the machinery by way of shafts that turned belts and pulleys. Within the factory, small electric motors could now power separate machines. This allowed factories to spread out horizontally on a single story and opened up space overhead for electric lighting or natural light admitted through sawtooth windows. By the 1940s, four-fifths of the power in American factories was supplied by electric motors.14 Electric motors transformed the household as well as the factory. Small electric motors powered refrigerators, washing machines, driers, gramophones, radios, televisions, videocassette recorders, and personal computers.

THE STEAM TURBINE

Most of the world’s electricity in the early twenty-first century comes from steam turbines that use heat generated by coal, oil, natural gas, or nuclear energy to turn water into steam. The steam spins the blades of a turbine fan; the mechanical energy is then converted to electricity in a turbogenerator.

While Thomas Edison is a household name, few have heard of Charles Parsons. And yet he was one of the founders of the modern electrical industry. The son of a famous astronomer, William Parsons, third earl of Rosse in Ireland, Parsons graduated from Cambridge with a first-class honors degree in mathematics. He made an unusual choice for someone of his background and became an engineer. In 1884, he was the head of the electrical equipment division of Clarke, Chapman and Co., which manufactured ship engines near Newcastle upon Tyne. Parsons developed a steam turbine that made electricity both abundant and cheap. In 1888, he installed steam turbines at a Newcastle power station. In 1889, he founded his own company to equip military and civilian ships with turbines. In 1897, he demonstrated the Turbinia, an experimental ship that was then the fastest in the world. Parsons licensed his turbine technology to Westinghouse in the United States. Parsons’s steam turbine cost only a third as much as early steam engines, had an eighth of the weight, and occupied only a tenth of the space.15 Water turbines, a related technology, provide roughly a fifth of global electricity today.16

The age of electricity might just as well have been called the age of coal. In 1876, Americans obtained twice as much energy from wood as from coal. But in 1900, coal provided 71 percent of America’s energy supply, wood 21 percent, and oil, natural gas, and hydropower less than 3 percent each.17 At the beginning of the twenty-first century, coal still provided the majority of the energy used to generate electricity in American power plants.

THE INTERNAL COMBUSTION ENGINE

After electricity, the most important transformative technology of the second industrial revolution was the internal combustion engine.

Internal combustion engines come in two varieties, both invented in Germany in the nineteenth century: the Otto engine and the Diesel engine. In the 1860s, numerous inventors experimented with engines driven by explosive mixtures of gas or oil and air. After a Belgian inventor, Jean-Étienne Lenoir, developed a commercial gas engine around 1860, the German inventor Nikolaus August Otto perfected the four-stroke engine in 1876. In 1878, Otto patented an engine that used coal gas as a fuel. In 1885, Gottlieb Daimler and Wilhelm Maybach adapted Otto’s engine to use gasoline, and in the same year Karl Benz built the first automobile, powered by a gasoline engine of his own design. Early electric cars and steam-powered cars could not compete with the performance of cars with gasoline engines.

Rudolf Diesel’s engine, patented in 1892, was based on a different approach: high pressure caused the fuel to ignite spontaneously. Diesel engines are more efficient, with energy conversion ratios of 40 percent, compared to 30 percent for the best gasoline engines. Cheaper to run but heavier, they were quickly adopted for trucking, rail, and shipping, but not for aviation, where engine weight was at a premium. Because of high gasoline taxes in Europe, nearly half of European passenger automobiles ran on diesel fuel by the beginning of the twenty-first century.

In 1891, Émile Levassor established what remains the basic design of the automobile, down to the electrical ignition and the carburetor. The automobile industry benefited from the earlier development of the bicycle industry. Widespread use of bicycles followed the development of the safety bicycle by a British inventor, John K. Starley, in 1885. In 1888, an inventive veterinarian in Ireland, J. B. Dunlop, put the first pneumatic tires on the wheels of his son’s tricycle. First the spread of bicycling and then the use of automobiles produced a demand for modern roads and highways.

Internal combustion engines were soon used not only for automobiles but also for planes, boats, tractors, and small devices like lawn mowers. During World War I diesel engines were used in ships and submarines and became the basis of global shipping.

MAGICAL MATERIALS

Many other technologies were part of the second industrial revolution in the late nineteenth and early twentieth centuries. Often they served the most important technologies, as rubber served the electric industry and the oil industry served the automobile industry.

In 1859, Colonel Edwin Drake drilled a petroleum well in Pennsylvania; his original goal was to substitute kerosene for costly whale oil in lamps. As the oil fields of Pennsylvania were depleted, new fields were discovered in Texas and California and abroad, in Dutch Indonesia, the Baku fields on the Caspian Sea, Romania, Mexico, Venezuela, Trinidad, and Iran. After World War II, new oilfields were developed in the Middle East, Nigeria, Siberia, and Alaska. By 1960, oil surpassed coal as the primary fossil fuel in the world.18

When electric lighting replaced kerosene lamps, oil found a new use, as a fuel for cars, trucks, tractors, planes, and ships. Natural gas (methane), at first considered a worthless by-product of crude oil, began to be used for heating and transportation.

Rubber was another key technology of the second industrial revolution, important for electrical insulation as well as for its use in automobile tires. In the 1840s, the American inventor Charles Goodyear succeeded in using a blend of sulfur, latex, and white lead to create “vulcanized” rubber. In 1852, when Goodyear sued a rival in Trenton, New Jersey, for infringement of his patent, he was represented by Daniel Webster, while another great American lawyer, Rufus Choate, represented his opponent. Webster brought all his oratorical gifts to bear in describing the new substance: “It is hard like metal and as elastic as pure original gum elastic. Why, that is as great and momentous a phenomenon occurring to men in the progress of their knowledge, as it would be for a man to show that iron and gold could remain iron and gold and yet become elastic like India Rubber.” Webster contrasted Goodyear’s vulcanized rubber with the older kind, which tended to melt in heat and grew rigid with cold: “A friend in New York sent me a very fine cloak of India Rubber, and a hat of the same material. I did not succeed very well with them. I took the cloak one day and set it out in the cold. It stood very well by itself. I surmounted it with the hat, and many persons passing by supposed they saw, standing by the porch, the Farmer of Marshfield.”19 Goodyear won his case, but further patent litigation drained his resources and he died in debt.

In 1842, Goodyear gave some samples of his product to Stephen Moulton, a British businessman, and they made their way to the Scottish manufacturer Charles Macintosh, who had independently created the waterproof garment that bore his name. But it was in the late nineteenth century that the rubber industry grew rapidly, to supply tires first for bicycles and then for cars.

Goodyear Tire and Rubber Company, founded in 1898 and named in honor of Charles Goodyear, became the largest rubber manufacturer in the United States and the world. The Firestone tire business was founded by Harvey Firestone, a mechanic who worked at his cousin’s factory putting rubber tires on horse-drawn carriages. Henry Ford visited in 1895 and adopted Firestone’s solid rubber tires for the rims of the metal wheels of his cars. In later years, Ford, Firestone, and Edison vacationed together. The B. F. Goodrich Company, founded by Benjamin Franklin Goodrich, adapted the pneumatic tires devised by Michelin in France to American automobiles.

Until the early twentieth century, rubber continued to be derived from rubber trees. Seeking to avoid dependence on the British rubber plantations in Indonesia and Malaya, Firestone established his own rubber plantations in Liberia while Ford tried but failed to do the same in Amazonia in Brazil. Between World War I and World War II, American and German chemists learned how to make artificial rubber. This allowed the United States to make a million tons of rubber a year during World War II, even after Japan had conquered Southeast Asia.

Although steel was superior to wrought iron, in premodern times its cost limited its use to valuable implements like swords and plowshares. In 1856, Henry Bessemer discovered a method to make steel cheap. The Bessemer converter, followed by other innovations, radically reduced the cost of steel, benefiting existing industries like railroads and making possible entirely new uses for steel—in the framework of skyscrapers, for example.

Germany, with its superior system of state-funded research universities, led the world in the development of scientific chemistry and the chemical industry. German scientists and industrialists learned to create synthetic substitutes for natural dyes like indigo. Fritz Haber, Carl Bosch, and Alwin Mittasch devised the Haber-Bosch process for synthesizing ammonia, used in fertilizers and explosives. An earlier explosive, dynamite, had been developed by the Swedish chemist and engineer Alfred Nobel, who used his fortune to endow the Nobel Prizes. The Germans also learned to create artificial potash, a potassium compound used in fertilizers, as a substitute for the variety derived from plants. The use of fertilizers produced by the chemical industry rather than nature made possible a revolution in agricultural productivity, as did the falling costs of steel farm implements and the development of tractors and other machines using internal combustion engines.

Plastics were another transformative technology spawned by the chemicals industry. John Wesley Hyatt, an American, devised celluloid, the first plastic, in 1869, and Leo Baekeland, a Belgian immigrant in the United States, invented Bakelite in 1907.20 Applied chemistry also transformed medicine, by supplying disinfectants, anesthetics, and aspirin (discovered by Felix Hoffmann and manufactured by the German firm Bayer AG—thus Bayer Aspirin).21

Canned food first became important during the Civil War and later allowed growing urban populations to eat preserved meat, vegetables, and fruit. As early as 1870, refrigerated beef was shipped from the United States to Britain, and in 1876 Charles Tellier, a French engineer, devised the first refrigerated ship, the Frigorifique.22 The development of small-scale refrigerators for the home helped to revolutionize domestic life.

HOW GOVERNMENT MODERNIZED AMERICAN AGRICULTURE

The modernization of the American economy between the 1890s and the 1930s was not solely the work of the private sector. In agriculture, radio, and aviation, the federal government acted as inventor, entrepreneur, and investor, in a return to the mixed-enterprise tradition of the early American republic.

In the 1790s, George Washington had lobbied unsuccessfully for a national agricultural university devoted to improving American agriculture. His vision of federal support for agricultural research was realized during the Civil War. The Morrill Act of 1862 used federal lands to subsidize land-grant agricultural and mechanical (A&M) colleges in the states. American agricultural reformers were inspired by the success of Germany in applying government-sponsored research to agriculture. The 1887 Hatch Act provided each state with federal funds on the condition that it establish at least one central experiment station “to conduct original researches or verify experiments . . . bearing directly on the agricultural industry of the United States.”23 Subsequent acts—the Adams Act (1906), the Smith-Lever Act (1914), the Purnell Act (1925), and the Bankhead-Jones Act (1935)—also provided money for research.

By the early twentieth century, a sophisticated industrial policy had developed in American agriculture. State land-grant colleges and regional experiment stations worked on the problems of American farmers. New techniques were disseminated by extension agents, who by 1914 numbered more than two thousand and were found in three-fourths of the agricultural counties of the United States.24 County agents initiated the formation in 1919 of the American Farm Bureau, a private trade association that became the most important farm lobby in the nation.

THE FEDERAL GOVERNMENT AND AMERICAN AVIATION

The US government played a key role in the development of manned flight, although initially it backed the wrong inventor. Samuel Pierpont Langley was a brilliant astrophysicist, the director of the Smithsonian Institution, and a friend of Alexander Graham Bell, who witnessed the successful flight of Langley’s unmanned, steam-powered model aircraft above the Potomac River near Washington on May 6, 1896. In 1898, the War Department commissioned Langley to produce a manned military aircraft, giving him a grant of fifty thousand dollars—roughly $1.3 million in 2010 dollars. But in test flights over the Potomac on October 7 and December 8, 1903, the manned version of Langley’s machine, now powered by a gasoline engine, crashed and the pilot barely escaped each time. A little more than a week after the second attempt failed, on December 17, 1903, Orville and Wilbur Wright made the first successful flights of a manned heavier-than-air craft at Kitty Hawk, North Carolina. Humiliated and ridiculed, Langley died in 1906.

But the United States quickly lost the lead in aviation, as the great powers of Europe developed the new technology for military purposes. Between 1908 and 1913, the US government spent only $435,000 on aviation, compared to the $28 million spent by Germany, the $22 million spent by France, and the $12 million spent by Russia.25

As World War I approached, however, the government played a greater part. Between World War I and World War II, the federal government promoted the development of the American aviation industry by three methods: military procurement, public R&D in aeronautics, and airmail subsidies.

Although it had initially backed the wrong inventor, the US government was quick to get into the airplane business. The military was the first client of the company that the Wright brothers set up.

Patent wars among early aircraft companies ended with the advent of World War I, when the aircraft manufacturers established the Manufacturers’ Aircraft Association to coordinate wartime aircraft production in the United States and formed a patent pool with the approval of the US government. All patent litigation ceased automatically. Royalties were reduced to 1 percent, and inventions and ideas were exchanged freely among all the airframe builders. This government-encouraged pooling of patents set a precedent for similar enlightened technology-sharing arrangements that military and civilian agencies later imposed on contractors and federal grantees.

The federal government also promoted American aviation by means of a system of publicly funded R&D that resembled the American system of agricultural experiment stations. In 1915, as the possibility of US intervention in World War I increased, Congress used a naval appropriations bill to establish the National Advisory Committee for Aeronautics (NACA), the ancestor of the National Aeronautics and Space Administration (NASA). In the 1920s, NACA performed R&D at its Langley Field facility in Virginia. It was joined in the 1930s by other research centers, including one at Moffett Field near Sunnyvale, California, close to centers of aircraft manufacturing, and a center established in 1940 in Cleveland, Ohio, the hub of aircraft engine manufacturing, to pursue improvements in aircraft engines. Thanks in part to public R&D, productivity in US aviation grew by 8 percent a year between the 1920s and the 1960s, outstripping every other industry.26

In 1958, NACA was absorbed into the newly founded NASA, which was charged with the most spectacular state capitalist project in American history: the US space program, which culminated in the 1969 landing of the first astronauts on the moon.

Beginning in 1918, the federal government used the airmail program to subsidize the infant American aviation industry. The use of contract carriers for airmail encouraged the growth of aviation companies. The McNary-Watres Act of 1930 indirectly subsidized passenger flights by replacing payment by the weight of airmail carried with a fixed rate per mile, no matter how much space was given over to passengers.

President Herbert Hoover’s postmaster general, Walter Folger Brown, used his power to award airmail contracts to compel mergers that created a few large carriers. As a result, from the early 1930s until after World War II, the US airline industry was dominated by four airlines: American, TWA, Eastern, and United.27

The federal government’s investment in aviation paid off. By the time World War II broke out, the United States had the largest commercial airline system and the most advanced commercial airliner, the Douglas DC-3, which continued in service until the 1960s.28

THE FEDERAL GOVERNMENT AND THE CREATION OF AMERICAN RADIO AND TELEVISION

The federal government also shaped the radio industry, which later pioneered television. The US Navy was wary of Britain’s domination of global communications by means of its worldwide underwater cable system. While taking part in postwar negotiations at Versailles in 1919, President Woodrow Wilson identified three areas of economic rivalry with military implications between the United States and Britain: oil production, merchant shipping, and global telecommunications. The United States had a lead in oil production, but the British Empire led in merchant shipping, and the British lead in global telecommunications threatened to increase because the Marconi company was based in London.

Frustrated by the need to rely on the British government because of Guglielmo Marconi’s British patents, in 1919 the navy, led by Assistant Secretary of the Navy Franklin Delano Roosevelt, persuaded General Electric, Westinghouse, AT&T, and other companies to pool their radio-related patents and form the Radio Corporation of America (RCA), to ensure that interlocking American corporations controlled radio development in the US. GE bought out the patents of the American subsidiary of Marconi and gave its patents to RCA.29

The initial purpose of RCA was military and commercial, and large-scale radio broadcasting was delayed by the lack of a business model, since anyone could listen without paying. One proposal, a government station paid for by license fees on radio owners, was suggested by David Sarnoff and later taken up as the funding model for the British Broadcasting Corporation (BBC). In 1922, AT&T solved the problem differently, by selling advertising and linking several New York stations together in a network. Threatened by AT&T’s success, RCA, Westinghouse, GE, and the others in 1926 forced AT&T to sell its stations and agree to lease its long-distance lines to a new network, the National Broadcasting Company (NBC). In 1931, antitrust judgments separated Westinghouse and GE from NBC, and subsequent orders forced RCA to sell its Blue network, which became the American Broadcasting Company (ABC), in 1943.30

The modern age of television in the United States began on April 30, 1939, when antennas atop the Empire State Building in Manhattan broadcast live images of President Roosevelt at the opening ceremonies of the New York World’s Fair. On the same day, RCA’s affiliate NBC began regular US television broadcasts, which were limited at first to New York and other big cities in the Northeast.

RCA had delayed the evolution of American television by engaging in patent litigation with Philo T. Farnsworth, a brilliant Mormon from Utah who began dreaming of broadcasting images while studying at Brigham Young University in Provo, Utah. Helped by research engineers at the California Institute of Technology and investors after he moved to San Francisco, Farnsworth established the Farnsworth Television and Radio Company and obtained a patent in 1927. RCA, backing television research by Vladimir Zworykin, a Russian émigré engineer, fought Farnsworth over the patent in the courts. The nascent British television industry licensed Farnsworth’s technology and began regularly scheduled programming for a limited audience in 1936. The 1936 Berlin Olympics were the first to be televised. Only after World War II, however, did television transform society by reaching mass audiences.

THE TRANSFORMATION OF THE LANDSCAPE BY THE SECOND INDUSTRIAL REVOLUTION

The second industrial revolution created a distinctive pattern of production, work, and entertainment, based on the automobile and the electric grid. In the pedestrian city, the walkable area was about three square miles, with the edge of town limited to a mile from the center. Although mass transit is frequently advocated as an alternative to automobile-created suburbanization, the earliest suburban sprawl was created in the nineteenth century by mass transit. Taking advantage of the fact that horses could pull greater weights along rails, New York in the 1830s, followed by other American cities, adopted horsecar omnibus lines. The horsecar lines permitted the edge of town to extend 2.5 miles from the center, expanding the accessible area to about twenty square miles. The horsecar began the process of migration of middle-class and working-class Americans to less crowded and less expensive housing on the urban periphery.31

Because of the noise and pollution they produced, steam locomotives were opposed as a method of urban transportation. Electricity provided an alternative. Following many experiments, the first genuine electric streetcar system was created in the 1880s by Charles Van Depoele in Montgomery, Alabama. Frank Sprague, a former member of Edison’s Menlo Park team, created the Richmond, Virginia, electric streetcar system. Sprague’s streetcars drew power by means of a pole-mounted wheel that “trolled” along an overhead electric wire, hence the name “trolley.” Sprague’s version became the standard when Henry Whitney adopted it to replace the world’s largest horsecar rail system, the West End Railway of Boston.32 The Boston trolley system pioneered the use of a flat fare for any length of ride, which further encouraged the working class to disperse to suburban lodgings. Electric trolley systems were soon joined by electric interurbans. The difficulty of creating new overland routes through existing neighborhoods led Boston, New York, and other cities to invest in subway systems.

By World War I, middle-class and working-class suburbs were growing up on the edges of cities. Industrial plants followed them, to take advantage of lower rents and more space. Polluting industries were nudged out by means of zoning, a form of urban regulation that spread rapidly in the early twentieth century.

As the central city was emptied of residents and manufacturing, it evolved into the downtown, a district that specialized in retail businesses, including department stores that served entire metropolitan areas, like Macy’s and Gimbel’s in New York and Filene’s in Boston. The new downtowns were characterized by a distinctive skyline created by tall office buildings. In the preindustrial era, large buildings had generally been limited to five or six stories accessed by stairs. The elevator allowed buildings to grow taller. The earliest elevators were cargo-hoisting machines employed in warehouses, of a kind that would have been familiar to medieval and ancient engineers. Elisha Graves Otis, the founder of Otis Elevator, devised a safety brake that prevented a cable-drawn elevator from falling if its cable broke. In 1857, the first safe passenger elevator was installed in the E. V. Haughwout Store in New York.33

While the elevator had solved the problem of vertical transportation, the height of buildings was still limited by the nature of masonry construction, which required greater and greater thickness at the base as the height increased. This problem was solved first by cage construction and later by curtain-wall construction. A wrought-iron cage was used as the skeleton of the seven-story Harper’s Building of 1854 by James Bogardus, a pioneer of iron-frame construction. The first true skyscraper, however, was the Equitable Insurance Building in New York, built between 1868 and 1870. It was an office building that combined an iron cage with an elevator.34

But it was Chicago, not New York, that would develop the skyscraper. Architect William Le Baron Jenney’s ten-story Home Insurance Building used a steel cage that permitted more light and larger windows, producing a distinctive Chicago skyscraper style. New Yorkers were so wary of heights that the architect of the eleven-story Tower Building at 50 Broadway, completed in 1889, put his own office on the top floor to persuade Manhattanites of its safety.35

The offices in the new downtown buildings were filled with the clatter of adding machines and typewriters. The product of a long evolution, the familiar mechanical typewriter was first produced in 1874 by Philo Remington, a New York manufacturer of sewing machines and other devices, on the basis of a design patented in 1867 by a retired newspaper editor named Christopher Latham Sholes. Samuel Langhorne Clemens, who was better known under his nom de plume Mark Twain, wrote a testimonial:

Gentlemen:

Please do not use my name in any way. Please do not even divulge the fact that I own a machine. I have entirely stopped using the Type-Writer, for the reason that I never could write a letter with it to anybody without receiving a request by return mail that I would not only describe the machine but state what progress I had made in the use of it, etc, etc. I don’t like to write letters, and so I don’t want people to know that I own this curiosity breeding little joker.

Yours truly,

Saml. L. Clemens36

By the 1920s, electricity and the automobile were reshaping the geography of American production, distribution, consumption, and residence. Thanks to electrical power, factories no longer needed to be located near coal mines or waterways that carried the coal that powered steam engines. Railroads were eclipsed by long-distance trucking, as canals earlier had been eclipsed by railroads. And the migration of Americans from the farms to the cities gave way to the migration from the cities to the suburbs.

While the visible transformation of the American landscape by the second industrial revolution was dramatic, the transformation of the landscape of American business and politics would prove to be even more consequential.