Men have been hunting whales almost since the first ship put to sea. It would take Yankee ingenuity and daring to convert their occupation into big business.
Whales in colonial days were so abundant in the North Atlantic that it was not uncommon for them to wash up on New England beaches. By 1793, though, the enormous animals had become wary of whaling ships, and when the New Bedford whaler Rebecca pursued its prey ’round the Horn that year, whaling was transformed from a fishing expedition to an epic adventure. Expeditions lasted three or four years, sailing to the Sea of Japan or the western Indian Ocean, and might include wintering in the Arctic. (One ship, the Nile, was at sea eleven years before returning to its home port in New London, Connecticut, though that may have been just what the whalers told their wives.)
By the golden age of American whaling in the 1850s, it was the fifth-largest industry in the United States, a $70 million business employing more than seventy thousand people. Of the estimated 900 whaling ships worldwide, 735 came from America—from little river towns off the Hudson and the Connecticut, from Long Island, and from the tiny atoll known as Nantucket. Four hundred ships sailed from the port of New Bedford, which called itself “the City That Lit the World” and was for a time the richest place, per capita, in America.
US dominance of the industry flowed from the boldness of its capitalists, often syndicates of parsimonious Quakers, and the innovations and almost lunatic bravery of its men. The basic techniques of harpooning and rendering whales had been established centuries before by Basque and Dutch whaling fleets, but Americans dramatically improved them all. Lewis Temple, an escaped slave who started a thriving shorefront business in New Bedford, invented a toggle harpoon that became the industry standard, impossible to dislodge from a hooked whale. The six-man boats Yankee whalers used to pursue the leviathans were considered the finest in the world: sleek, narrow, brightly painted, thirty feet long and six feet wide, and equipped with a sail and oars. On board the whaling mother ships were state-of-the-art “trying pots” to boil whale oil out of the blubber and dozens of barrels to store it in so that the whaler never had to return to port until it was full.
A whaling trip was a $50,000 investment, and a risky one at that; sooner or later, 40 percent of the ships did not return. But profits could be tremendous. By the 1830s, it was whale oil that kept the lights burning and the wheels turning in America. Spermaceti, a liquid wax drawn from the massive heads of sperm whales, made the best candles ever: smokeless and odorless, with a high melting point and a low freezing point. It burned brightly in the household lamps of the rich, in city streetlights, in lighthouses, and in the headlights of the new trains thundering out across the West (see The Trains They Paid to See: Streamlined Trains). More common whale oil, extracted from the blubber, was an invaluable asset in the emerging industrial economy of the North, greasing the spindles in the cotton and wool mills and the gears of the trains and the steamboats. It made superior soaps, was ideal for softening leather and thickening paint, and worked as a cleaning agent, a cosmetic, and a medicinal ingredient.
Many men signed on for one whaling voyage; few did for a second one—unless compelled to do so by what they owed the shipowners, who deducted every possible expense from their meager earnings. The common seamen’s quarters were described by one inhabitant as “black and slimy with filth, very small and hot as an oven.” The men ate salted horse and pork, were whipped with a cat-o’-nine-tails for infractions, and were tormented by vermin drawn to the bloody, greasy ships. No matter how much they scrubbed the decks, their homes at sea stank of cooked whale blubber; whaling ships could be smelled downwind long before they were seen.
“A more heterogeneous group of men has never assembled than in so small a place as is found in the forecastle of a New Bedford sperm whaler,” one observer noted, and it was true. Whaling men came from everywhere: from Polynesia and Tahiti, Portugal and the Azores and Cape Verde, Peru and Colombia, and the West Indies. They included Maori from New Zealand; Native Americans from Cape Cod and Martha’s Vineyard, renowned for their ability as harpooners; and former slaves, escaped from their servitude on southern plantations.
When the famous cry of “Thar she blows!” sounded from the crow’s nest, a hundred feet above the deck, the men scrambled to launch the whaleboats and sailed toward their prey, shouting, “A dead whale or a stove boat!” What followed was a terrifying life-and-death struggle that could easily end with a whaleboat smashed or bitten to pieces by a wounded whale—and the whalers with it. Once the harpoon was in, the men might be pulled by the infuriated animal at twenty miles an hour across the open sea in what was called a “Nantucket sleigh ride.” The sailors dumped water on the harpoon line to keep it from bursting into flames as it swiftly unwound, and stood ready to chop it loose with a hatchet if it snagged and started to pull them all under.
Once the whale was finally killed, it had to be towed back to the ship while the men beat off swarms of sharks trying to devour their catch. When it reached the mother ship, all hands descended upon the carcass, chaining it up on the starboard side and peeling it like a gigantic apple with razor-sharp spades, knives, and axes—knowing that one slip could plunge them into the frenzy of sharks gathered below.
They extracted every ounce of value they could find in the gigantic mammals. From the head of the baleen whale they pulled its feeding plates, the plastic of the nineteenth century, used to make buggy whips and parasols and to torture the ribs of countless women as corset stays (see Embracing the Curves: Liberated by the Bra). From deep in the intestines of the sperm whale they scooped out a fragrant gray substance known as ambergris that made the perfume the ladies wore.
Whaling would hang on as an industry into the twentieth century, mainly due to the demand for those baleen corsets. But it was largely finished after the 1850s in the United States, thanks to Confederate raiders that devastated the North’s whaling fleet in the Civil War and, especially, a new source of oil, bubbling up from the ground in western Pennsylvania (see Black Gold: The Oil Rig). In 1927, the last Yankee whaler left New Bedford, and the terrible romance of the hunt departed with it.
the genius details
American whaling’s most profitable year was 1853, when eight thousand whales were killed to produce 103,000 barrels of sperm oil, 260,000 barrels of other whale oil, and 5.7 million pounds of baleen, which generated total sales of $11 million.
Whaling crews consisted of sixteen to thirty-seven men. They included a captain, three or four ship’s mates, boat steerers, harpooners, a blacksmith, a carpenter, a cook, a cooper, a steward, and foremast hands.
A whaling captain would commonly get a “lay,” or cut, of 1/8 of all profits, while an ordinary crewman would get as little as 1/350.
Herman Melville had shipped out on the whaler Acushnet in 1841 at the age of twenty-one. His book Moby-Dick was inspired in good part by the only known case of a whale attacking a mother ship, in 1820.
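The “lay” system mentioned above is easy to make concrete. In the sketch below, the lay fractions (1/8 for a captain, 1/350 for a green hand) come from the text, but the voyage profit is an assumed round figure used only for illustration:

```python
# Illustration of the whaling "lay" (profit-share) system. The lay
# fractions come from the text; the voyage profit is a hypothetical
# round number, not a figure from the historical record.
from fractions import Fraction

def lay_share(profit: int, lay: Fraction) -> float:
    """Return one crew member's share of a voyage's net profit."""
    return float(profit * lay)

voyage_profit = 50_000  # assumed net profit, in dollars

captain = lay_share(voyage_profit, Fraction(1, 8))       # $6,250.00
green_hand = lay_share(voyage_profit, Fraction(1, 350))  # about $142.86
```

Even before the owners’ deductions, the spread between the captain’s share and an ordinary crewman’s was more than fortyfold.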
They knew the oil was there, all over America. In many places it would simply seep up out of the ground, and it was used by Native Americans for decoration and as an insect repellent, a salve, a purge, and a tonic. They would soak it up in blankets or skim it right off the ground. White settlers found it welling up when they tried to dig out salt licks. They retrieved it to lubricate their sawmill machinery or to sell as bottles of patent medicine, but really, what other use was there for it?
One of those snake oil salesmen, an enterprising Pennsylvania merchant named Samuel Kier, discovered a process to extract kerosene from crude and in 1851 began selling this “carbon oil” to local miners for their headlamps. With whale oil, the preferred source of illumination in America (see Power Plant at Sea: The Whaling Ship), growing more expensive, Kier set up the first oil refinery in Pittsburgh.
Now there was a demand, and one Edwin Drake, a former train conductor, provided the supply in 1859, building the world’s first modern oil well in the small town of Titusville in the northwestern corner of Pennsylvania. The Chinese had been drilling with bamboo since at least the fourth century, and Europeans had been digging oil wells by hand in Poland and Romania a few years before Drake, but he was the first to use a steam engine to power his drill and the first to drill down through lengths of iron pipe, so as to keep the borehole from closing up.
After months of futility, Drake hit pay dirt near the end of August in 1859. Soon he was pulling up four hundred barrels a day, pumping the first load of oil into a convenient bathtub before he was able to build enough barrels to hold it. Drake set off the world’s first oil boom, with people pouring into the country around Titusville to start drilling.
Within another generation, oil had largely put paid to the American whaling industry and had set a young refiner named John D. Rockefeller on the road to amassing perhaps the greatest fortune in US history. But it had not yet assumed the central place it would in the world economy, in part because no one was sure just how much of the stuff there was.
This would change dramatically early in the new century, when a veteran wildcatting team headed by brothers Curt and Al Hamill drilled into a salt dome near Beaumont, Texas, called Spindletop.
By this time, offshore oil wells had been drilled in tidal zones all along the nearby Gulf Coast of Texas and Louisiana. Pattillo Higgins, a half-mad, irascible local “oil prophet” who had educated himself in geology, had identified the dome as a likely oil deposit at a time when most professional geologists thought the area was too sandy to contain oil. He in turn had called in an immigrant Croatian geologist named Anthony Lucas (born Antun Lučić), who confirmed his suspicions. But lacking the money to exploit Spindletop, Higgins was forced to sell out to a couple of big-oil money men from Pittsburgh named John H. Galey and James M. Guffey. It was Galey and Guffey who hired the Hamills, who they knew had already perfected a superior rotary drill. Now the Hamills hit upon a revolutionary new technique. To keep the sandy Gulf Coast earth from clogging up their works, they bought a small herd of cattle, let it cavort in a waterhole, then used the mud on their drill to clean out the sand. It worked, and mud would become a universal addition to the drilling process, not only washing debris out of the borehole but cleaning and cooling the drill bit and helping prevent gushers as the drill descended.
Even with the mud, the Hamills’ work was grueling and dangerous. The layers of rock between the sand wore away their drill bits. Sudden pockets of natural gas threw boulders up out of the earth and damaged their machinery until they kept the drill working around the clock, afraid another blowout would end their whole enterprise.
On January 10, 1901, just after the Hamills had put in a new bit, mud began to boil up out of the ground with such force that it pushed the pipe up through the derrick—something they had never seen before. As they watched, incredulous, the pipe knocked the crown block off the derrick, then fell to the ground “like crumpled macaroni,” taking out the boiler smokestack. Then came a deafening roar and a geyser of mud that coated the men, followed by more mud and blue gas flames that shot out of their six-inch drill hole with “a roar like a cannon.” When it was over, the team crept closer, peering down into the hole.
First they could hear it, gurgling beneath the earth. Then they could see it—oil forced up with such pressure that it frothed as it came. As the men ran for their lives, the oil blew a new volley of rocks sky high, then blasted out into a 160-foot geyser, twice the height of the derrick. It rained down on the Hamills, and on the crowds who flocked to see it, every time the wind changed. The gusher produced nine hundred thousand to one million barrels before it was finally capped—one of the most productive single oil wells ever found. By itself, it increased US oil production by 50 percent and world production by 20 percent. Its daily production was twice that of all the oil wells in Pennsylvania, then the nation’s leading oil state, and six times that in all of California.
And save for forty barrels (of forty-two gallons each), it was all wasted, spilled out on the surrounding ground, where a few days later, most of it burst into a raging fire. But by the end of the decade, more wells on Spindletop would triple total US oil production. That single Texas salt dome would still be producing as much as twenty-one million barrels a year by 1927 and would give up a total of 153 million barrels by 1985. Its abundance would lead wildcatters to confirm that oil was everywhere: all over Texas, and Louisiana, and Southern California, and even under the ocean. It was the perfect energy source for the new internal combustion engines that were soon powering automobiles everywhere.
For better or for worse—and it would become a highly problematic addiction—oil was here to stay.
the genius details
Spindletop’s salt cone dates back to the Jurassic period. Its name may have come from the way in which a grove of trees on top of the hill seemed to “spin” when heat rose from the prairie around it. Before the oil was discovered, many ghost stories were set around Spindletop, and St. Elmo’s fire used to play about the mound.
The population of Beaumont rose from eight thousand to over sixty thousand in the space of a year.
Texas’s state antitrust laws banned John D. Rockefeller’s massive, would-be monopoly, Standard Oil, from Spindletop, allowing the rise of several new major oil companies, including Texaco and Gulf Oil.
The city of Houston quickly became the country’s oil capital, and the United States surpassed Russia as the world’s leading oil producer.
As America’s economy raced toward becoming the largest in the world by the 1870s, there was only one thing it lacked: light. Light for its factories now churning twenty-four hours a day, for the streets and homes of its burgeoning cities, and for the libraries where its young men and women read on, long after their punishing workday was done.
Where was the light to come from? Only so many whales could be caught and boiled (see Power Plant at Sea: The Whaling Ship). Kerosene distilled from crude oil (see Black Gold: The Oil Rig) gave off a flickering, smoky light, while gaslight was dim, sinister, and dangerous.
The answer lay with an illumination that had first put America on the scientific map, and with the nation’s first wizard. The potential of electricity had fascinated the Western world since Ben Franklin’s experiments in the 1750s. Its power had been appropriated to invent the telegraph (see What Morse Wrought: The Electric Telegraph), the telephone (see Watson Was Wanted: The Telephone), and the phonograph (see “His Master’s Voice”: The Phonograph)—three world-changing devices Thomas Edison had already played a key role in either inventing or improving by 1878, when he decided to use electricity to light up the world.
The sticking points were how to transmit electricity to a dozen different lights—or a million—at the same time and how to get it to glow in a lightbulb without immediately burning out the filament. But beyond his brilliant and unorthodox mind, his tireless energy for experimentation, and his ability always to look to the practical applicability of any new invention, Edison had another ace in the hole: Menlo Park, New Jersey, a homey complex of laboratories, machine shops, library, and living quarters—the first in a long line of American industrial research centers. It took up two city blocks and included a miniature locomotive Edison liked to drive at up to 40 miles an hour around a 2.5-mile track.
There “the Wizard of Menlo Park” was able to attract some of the most talented young scientists, engineers, and craftsmen from around the world. Together they would invent the entire system of electrical energy almost from scratch, starting with a new dynamo with more than 90 percent efficiency—more than twice that of the best dynamos then extant. Next came a new, handcrafted glass lightbulb that sealed in an almost perfect vacuum.
The right filament proved elusive, requiring months of painstaking, dangerous work, with Edison temporarily blinded in one accident. The scientists tested everything from coconut hair to sassafras to a spider’s thread and a strand plucked from an assistant’s beard before finally settling on carbonized cardboard that lasted for forty-five hours without burning out. A few months later, they would switch to a carbonized bamboo filament, which lasted for 1,200 hours.
By New Year’s Eve 1879, Edison was ready for a typically canny public demonstration. All that day and evening, the special trains kept bringing people out to Menlo Park from New York, despite a growing snowstorm. Edison and his men lit up the little town from his lab to the rail station, four hundred lightbulbs in all, giving off what a reporter called “a bright, beautiful light like the mellow sunset of an Italian autumn.” As the old year sank away and the light grew in the darkness, delighted gasps of “Marvelous!” and “Wonderful, wonderful!” rang from the crowd.
“We will make electricity so cheap that only the rich will burn candles,” Edison informed them.
Still, installing the first electrical system, in Lower Manhattan, would mean designing and manufacturing six twenty-seven-ton dynamos; fourteen miles of insulated, underground wiring; a steam engine with four coal-fired boilers; electrical motors, meters, switches, sockets, fuses, and more. Edison had to find and buy a location for his central electrical station, market and advertise his new invention, sell it to the public and municipal authorities, and put up a majority of the capital—something that would cost him most of his shares in the new Edison Electric Illuminating Company. Much of the work he did himself, from jumping down into trenches in the streets of Manhattan to check the wiring, to hosting the mayor and the board of aldermen at a Menlo Park reception catered by Delmonico’s.
Finally, on September 4, 1882, Pearl Street Station, the first central power plant in the world, went on line in Manhattan, sending 110 volts of direct current to bulbs in 450 lamps belonging to eighty-five customers—including several of the city’s leading papers, another perfect Edison touch. The New York Times reported how the new light provided a glow that was “soft, mellow and grateful to the eye, and it seemed almost like writing by daylight.”
Within a year, Pearl Street Station had 513 customers and was lighting ten thousand lamps. The first electric sign appeared on Broadway in 1892, and by 1913 there were over a million lights along New York’s Great White Way. By 1904, the Times would be there, too, in a tower 375 feet high, and “said to be visible from eight miles away,” according to historian Jim Traub, “an ‘X’ that marked the center from which the great, glowing city radiated.”
Here was the perfect synergy of the twentieth century, a transportation hub anchored by a skyscraper, housing a newspaper, and all of it lit to the skies with electricity. Nobody knew it yet, but Times Square marked the start of the decades-long shift of the city’s main reason for being from manufacturing to the new business of entertainment.
Electricity was the third, vital element necessary to create the modern, urban experience, along with steel-frame construction (see Building Backward: Steel-Frame Construction) and Elisha Graves Otis’s safety elevators (see “All Safe, Ladies and Gentlemen, All Safe”: Mr. Otis’s Safety Elevator). Electricity lit homes and offices. It powered the underground and underwater commuter trains (see The Electric Underground: Building the New York Subway) enabling hundreds of thousands, then millions, of individuals to be assembled in, and dispersed from, the city in record time—thereby inventing the suburbs, too.
Electric light could seem terrifying, even surreal—“Squares after squares of flame set and cut into the ether,” as Ezra Pound would write. But “incandescent lighting,” as the cultural historian David Nasaw noted, “transformed the city from a dark and treacherous netherworld into a glittering multicolored wonderland.”
Where murky gaslight had only exaggerated the terrors of the city, clean, dazzling electricity made people—a whole new class of people, single, independent office workers, male and female, with some discretionary income in their pocket—want to linger after work. To meet their needs, entertainment became an industry as never before, giving birth to fantastic, electric-powered amusement parks (see Manufactured Fun: The Amusement Park) and sumptuous new palaces where electricity ran the projectors that spewed forth the new century’s boldest new art (see Wonders and Atmospherics: The Movie Palaces). The Wizard, unsurprisingly, would soon turn his energies to inventing the movies.
the genius details
Other substances tested for the lightbulb filament included rags, grasses, flour paste, leather, macaroni, fishing line, pith, cinnamon bark, eucalyptus, turnips, gingerroot, cedar shavings, hickory, maple, cork, twine, celluloid, flax, paper, and vulcanized fiber.
Soon after lighting up Menlo Park for New Year’s Eve 1879, Edison and his men lit up the mansion of one of his backers, J. P. Morgan. It required 385 bulbs—nearly as many as the four hundred that had lit all of Menlo Park.
Beginning in 1928 in New York City, Consolidated Edison gradually replaced Edison’s direct current with alternating current and completed the switch in 2007. Older buildings—and the New York subways—were fitted with converters to change alternating current, the only kind now generated by ConEd, into direct current for their internal use.
Moses Farmer, an electrical pioneer in Salem, Massachusetts, lit his home with incandescent bulbs in 1859, making it the first electrically lit home in history. But the bulbs quickly burned out.
The river had always seemed to have a mind of its own. It ran as wild as any water on earth, and each year its rampant flooding killed hundreds of people and drowned thousands of acres of farmland. Every attempt to master it had failed. It had smashed apart a canal designed to channel its waters into California’s Imperial Valley, causing massive damage and creating the vast, sixty-foot-deep lake known as the Salton Sea. Geologists predicted that if something wasn’t done to divert it back to its original course, it would plow itself a canyon a mile deep and hundreds of miles long.
The Colorado would have to be dammed.
It was as complicated and forbidding a project as any that had confronted Americans in the West, including cutting the Transcontinental Railroad through the Sierras and bridging the Golden Gate (see “The Golden Vein”: The Transcontinental Railroad and Suspended in the Clouds: The Golden Gate Bridge). President Herbert Hoover ordered construction sped up to begin in May 1931 in order to provide jobs in the face of the growing Depression. But a planned company town had not been built yet, and many workers and their families were forced to live in a wretched work camp known as Ragtown. By August they had gone on strike over the bad water, the awful food, a sudden wage cut, and work in weather that killed sixteen men from heat prostration.
A consortium of six construction companies was hired to do the work, and Frank “Hurry Up” Crowe, a tall, stately, fifty-year-old Quebecois who had been building dams all over the West for thirty years, was brought on as superintendent. Crowe’s motto was “To hell with excuses—get results,” and he lived up to it. He broke the strike by summarily firing everyone, then rehired the men he wanted, housing them in the model town of Boulder City that the Nevada authorities had finally gotten around to building. Even there he ran roughshod over his workers, but Hurry Up Crowe at least lived up to his own punishing work ethic. Every detail required—and got—his close attention.
What he saw was the incredibly variegated set of tasks necessary for building his dam. Just to begin it, Crowe’s men had to divert the mad river from Black Canyon, on the border of Nevada and Arizona, where it tore through with terrific force. This meant building an enormous cofferdam and diverting the river into four tunnels that had to be drilled and dynamited into the canyon walls. To build the cofferdam, “high-scalers,” suspended from the top of the canyon by rope, would remove loose rock with drills and dynamite to prevent water seepage, in between swinging about spectacularly to show off for the tourists. Inside the tunnels temperatures reached 140 degrees, and one man after another keeled over, some of them dying of carbon monoxide poisoning from the drilling machinery. But the tunnels were finished eleven months ahead of schedule.
Next, the men had to dig as deep as 150 feet into the bedrock, clearing out 1.5 million cubic yards of material—twice the amount displaced by digging the Panama Canal—and filling in any natural cavities they encountered with a “grout curtain.” Then came the enormous logistics challenge of how to run the worksite.
“We had 5,000 men jammed in a 4,000-foot canyon,” Crowe later explained. “The problem . . . was to set up the right sequence of jobs so they wouldn’t kill each other off.”
Crowe employed any number of techniques and devices he had improvised at previous dam sites. He designed and built the most sophisticated cable system ever seen, used to deliver both concrete and workers alike around the job site. It swung huge steel buckets—patented by Crowe, transported on special railcars, and capable of carrying twenty short tons of concrete apiece—to pour into the dam.
Still, there remained the problem of the dam’s sheer size, 45 feet wide at its rim and 660 feet thick at its base. It would require 6.6 million tons of concrete, enough to lay a 4-foot sidewalk around the earth at its equator. Done in a single pour, the amount of concrete needed would have taken 125 years to cool and harden—and would have cracked in the process, rendering it useless.
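The sidewalk comparison can be checked with a little assumed arithmetic: taking roughly 3.25 million cubic yards of concrete (the volume behind the cited 6.6 million tons), an equatorial circumference of about 24,901 miles, and a four-foot-wide walk, and solving for the slab’s thickness:

```python
# Back-of-the-envelope check of the equatorial-sidewalk claim. The
# cubic-yard figure and walk width are assumptions for illustration;
# the thickness of the slab is the unknown we solve for.
CUBIC_YARDS = 3_250_000
cubic_feet = CUBIC_YARDS * 27      # 27 cubic feet per cubic yard
equator_feet = 24_901 * 5_280      # equatorial circumference, in feet
walk_width_ft = 4

thickness_ft = cubic_feet / (equator_feet * walk_width_ft)
thickness_in = thickness_ft * 12   # comes out near 2 inches
```

Under those assumptions the figure holds up: the dam’s concrete would pave a four-foot walk roughly two inches thick all the way around the globe.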
The dam’s designer, John L. Savage, had the solution to this: 230 wooden box molds, each equipped with 1-inch pipes and thermometers—582 miles of pipe in all. The concrete was poured into each mold, 5 feet deep; then refrigerated water was pumped through the pipes to cool it down. When the thermometers indicated that it was cool enough, grout was pumped into the pipe sections—thereby further strengthening the dam—and the next section was poured. Sample cores taken from the dam in 1995 showed that “Hoover Dam’s concrete has continued to slowly gain strength,” and it will continue to do so for years to come.
The great dam was finished in 1936, two years, one month, and twenty-eight days ahead of schedule, and on its completion it was the largest concrete structure in the world, as well as the largest dam and greatest single generator of electricity. The power it makes and the water of its reservoir make possible the continuing existence of civilization as we know it in Southern California and throughout the Southwest.
Behind its concrete walls, the dam created a huge new body of water, Lake Mead, 500 feet deep and 110 miles long. It continues to serve as the largest reservoir in the country and is a highly popular recreation site.
Beyond all its practical uses, however, the dam is also a work of surpassing beauty. Architect Gordon B. Kaufmann’s monumental art deco designs for the roadway, the water intakes, and the generating plant at the top of the dam are perfectly complemented by the motifs of the Navajo and Pueblo tribes of the region—rain, water, lightning, clouds, lizards, serpents, birds, and the surrounding mesas—that designer Allen Tupper True embedded in the plant’s walls and the terrazzo floors. True—much like the designers of Pennsylvania Station or the Coney Island parks—also made the practical mechanics of the dam, the giant turbines and pipes, works of art. This was the work of a people supremely confident, even at the depths of the Depression, in its own ability, and its future.
the genius details
In his haste to complete the dam ahead of time, “Hurry Up” Crowe nearly caused an even greater disaster. Pushed by their superintendent to finish the dam’s foundation, the men simply left some fifty-eight cavities in the bedrock when they realized it would take “too long” to fill them with grout. Repeated water seepage revealed this defect, which was rectified only by nine more years of work filling in the grout curtain from 1938 to 1947.
One hundred and twelve people died building Hoover Dam. The consortium of construction companies listed forty-two of them as victims of “pneumonia,” but this was a cover-up of deaths caused by carbon monoxide, due to carelessness in the rush to build the diversionary tunnels.
Eventually, enough water to fill fifteen swimming pools every second would pass through the dam’s seventeen generators, producing 2,080 megawatts of power, more than enough to repay its construction costs ($49 million, about $833 million in today’s dollars) and its yearly maintenance expenses.
The Tennessee Valley Authority
No region of the country was harder hit by the Great Depression than the Deep South, which had never fully recovered from the devastation of the Civil War. But with the Depression would come opportunity at a bend in the river.
A tortuous twist in the Tennessee River, at a place called Muscle Shoals, Alabama, sent water plunging 140 feet over the course of thirty miles—and wherever water rushes and falls with such ferocity lies power (see Taming the Colorado: The Hoover Dam). During World War I, the US government built two plants there to process nitrates for use in munitions. They were supposed to be powered by a hydroelectric dam (named for President Wilson), but the war ended before the dam could be completed.
It was another setback to a region where the average annual income would still be just $639 by the end of the 1920s, and many families made as little as $100 a year. One-third of the population suffered from malaria. The soil was played out after decades of overuse, and deadly floods from the unruly Tennessee constantly swept away family farms. In their desperation for fuel or new land to farm, the locals chopped or simply burned down 10 percent of the area’s once capacious forests every year.
Private utilities saw no reason to provide this indigent area with electricity. While local cities were over 90 percent electrified by 1929, only one in ten rural families had electric power, the rest still living much as they had a century before.
Then manna seemed to drop from heaven. The unfinished dam project attracted the attention of Henry Ford, still a name to conjure with in American life. Ford brought his friend Thomas Edison down to Muscle Shoals near the end of 1921 and offered to transform the region.
“I will employ one million workers at Muscle Shoals, and I will build a city seventy-five miles long at Muscle Shoals,” he promised.
Who could doubt the man who had brought America the Model T (see Utility as Beauty: The River Rouge), especially with the wizard of invention (see “His Master’s Voice”: The Phonograph) at his side? Giddy Alabamans immediately started a land boom. But not everybody was convinced.
First among the doubters was one George Norris, an independent-minded Republican senator from Nebraska. Norris had been interested in the potential of Muscle Shoals since the war, even though he’d never been there. Now he went to see the region for himself and concluded that Ford’s offer was long on dreams and short on cash: only $5 million offered for a project that had already cost the government $130 million.
It may seem incredible today: a senator from one party, concerned about a region dominated by the other party, hundreds of miles from his home state. But then Norris, whom many historians name as the greatest senator in our history, never much cared for the conventions of party politics. One of eleven children, he was raised in poverty after his father died when he was four, and worked his way through law school. Elected to Congress from Nebraska as a Progressive Republican in 1902, he led a rebellion in the House that overturned the autocratic rule of his own party leaders. Elected to the US Senate in 1912, Norris, who looked and acted like a hero from a Frank Capra movie, was one of only six senators to oppose America’s entry into World War I. Throughout his long career he sided with whichever party he believed would help him in his ceaseless quest to “repudiate wrong and evil in government affairs.”
Norris convinced the Senate to turn down Henry Ford’s offer to buy the Wilson Dam. When he returned to Muscle Shoals, he had to bring an armed bodyguard. But he was not about to abandon the area. Twice in the 1920s, Norris got bills passed to finish the abandoned Wilson Dam and develop public power there. Not one but two presidents from his own party vetoed these bills as socialistic.
In 1933, the man and his moment were finally met. Franklin Delano Roosevelt saw a chance to put into place all of the ideas he’d long had regarding public power, planning, economic development, and conservation. FDR pushed the Tennessee Valley Authority (TVA) through Congress during the famous “First Hundred Days” of his administration.
The federal government completed the Wilson Dam—and a dozen others, over the next few years—in what constituted the largest hydropower construction program ever undertaken in the United States. But the TVA was much more than just a utility. Here was the genius of America not in a single, doughty inventor but in a comprehensive development plan for a region that had long lagged behind the rest of the nation. The new dams controlled the constant flooding, made whole rivers navigable, and created beautiful lakes for recreation and fishing. The power was used to bring the cheapest electrical rates in the country to everyone, on the farm or in town. (It also set a standard for rates nationwide, which drove the private utilities mad.) Electricity transformed daily life, bringing industry to the area and electric light, electric washing machines, and electric refrigerators to people’s homes. The Department of Agriculture used the nitrates produced at the old war plants to refurbish the soil and set up services for farmers, increasing local agricultural yields. The Forest Service and the Civilian Conservation Corps replanted the trees and made the valleys bloom again.
Trying to supply its workers with reading material, the TVA even started local public libraries throughout the region, housed in stores, post offices, and filling stations when necessary. Run by Tennessee librarian Mary Utopia “Topie” Rothrock, the libraries remained even when the dams were finished.
Advocates for private utility companies and limited government fought the TVA all the way to the Supreme Court. But the Court ruled that it was constitutional—and the TVA rewarded George Norris’s faith by quickly paying off its cost and becoming self-sustaining. By 1941, the authority was the largest producer of electrical power in the United States, and it would prove invaluable to the war effort, powering the production of aluminum and other vital metals, vast quantities of nitrates, and a top-secret program at Oak Ridge, Tennessee, that was part of something called the Manhattan Project.
This was a war effort that George Norris fully supported. Previously an isolationist, he had come out for US intervention after seeing pictures of Japanese army atrocities in China. But then, Senator Norris always did think for himself.
the genius details
Centered on the Tennessee Valley, the TVA serves customers across eighty thousand square miles, including most of Tennessee, much of Mississippi, Alabama, and Kentucky, and parts of Georgia and North Carolina.
Sixteen hydroelectric dams and a steam plant were constructed by the TVA between 1933 and 1944, and a total of forty-seven dams were built in six different states.
During the Great Depression, the TVA was a great tourist attraction. One thousand people a day came to see the Wilson, Wheeler, and Norris dams.
The TVA set up the Electric Home and Farm Authority (EHFA), which offered farmers low-cost financing so they could afford electric stoves, refrigerators, and water heaters.
The TVA is still America’s largest public power company, with seventeen thousand miles of transmission lines delivering nearly thirty-two thousand megawatts of power to 8.5 million people. Its revenue in 2013 was almost $11 billion and its operating income nearly $1.5 billion.
The rates at which scientific discoveries advance into practical application can seem capricious, determined as they are by so many outside factors: the availability of related enabling technologies, government support, or market economics. But one factor that the American experience with invention shows to have always been a boon is the gathering of brilliant minds in one place.
Such a place was Bell Labs, the now legendary New Jersey research center for AT&T that pioneered groundbreaking work in so many fields (see “A Computer on a Chip”: the Microprocessor). When its final history is written—many years from now—perhaps the invention of the modern solar or “photovoltaic” cell will be considered its most important breakthrough.
The concept of the solar cell had been around for a long time. In 1839, nineteen-year-old Alexandre-Edmond Becquerel invented the very first such cell, showing how light could be converted into electricity in his father’s Parisian laboratory. The American inventor Charles Fritts developed the first solid-state photovoltaic cell in 1883 by coating the semiconductor selenium with a thin layer of gold, but it proved only about 1 percent “efficient” in converting light to electricity.
The next big step forward wasn’t taken until Bell Labs engineer Russell Ohl discovered the “p-n [positive-negative] junction” or “barrier” in 1939. This boundary between two different types of semiconductor material serves as a diode, a device that allows an electric current to flow in one direction but blocks it in the other. Think of it as a sort of subatomic circuit breaker.
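The one-way behavior Ohl observed is conventionally described by the Shockley ideal-diode equation, which is not given in the text; the sketch below uses assumed, typical parameter values (a saturation current of one picoampere, room-temperature thermal voltage) purely to illustrate the asymmetry:

```python
import math

def diode_current(v, i_s=1e-12, n=1.0, v_t=0.02585):
    """Ideal-diode (Shockley) current in amps at bias voltage v (volts).

    i_s: reverse saturation current (assumed ~1 pA)
    n:   ideality factor
    v_t: thermal voltage at room temperature, ~25.85 mV
    """
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

# Forward bias (+0.6 V): current grows exponentially, on the order of milliamps.
forward = diode_current(0.6)

# Reverse bias (-0.6 V): current is pinned near -i_s, effectively blocked.
reverse = diode_current(-0.6)
```

The same junction that blocks current in one direction is what lets a solar cell separate light-generated charges and drive them through an external circuit.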
While transistors were beginning to transform our world after World War II, Daryl Chapin was working on a seemingly more mundane project at Bell Labs: finding an energy source to replace the dry-cell batteries Bell Telephone used in the tropics, where they degraded too fast in the intense humidity. But the only solar cells then available were selenium ones that converted just 0.5 percent of sunlight to energy—less than a tenth of the nearly 6 percent level they had to reach to be commercially viable.
Chapin turned to an old friend, Gerald Pearson, a fellow research physicist at Bell and former classmate of his from Willamette University, who was doing semiconductor work with Bell chemist Calvin Souther Fuller. Pearson and Fuller had already gained insight into how to transform silicon from a poor conductor of electricity to a superior one by introducing impurities into it. Now they discarded Chapin’s selenium cell, introduced gallium into silicon, then gave it a nice hot lithium bath.
Voilà! Pearson and Fuller had created the best solar cell yet devised, with 2.3 percent efficiency. But just then a completely different energy source—one that seemed about to make solar energy or, for that matter, any other kind of energy all but irrelevant—barged noisily onto the scene.
In January 1954, “General” David Sarnoff, the bombastic head of RCA, announced that his company had made a dramatic breakthrough: the “atomic battery.” At a press conference at Radio City, Sarnoff tapped out “Atoms for Peace” on an antique telegraph powered by his new device.
“Atomic batteries,” Sarnoff told reporters, “will be commonplace long before 1980,” predicting that they would power “ships, aircraft, locomotives and even automobiles. . . . Small atomic generators, installed in homes and industrial plants, will provide power for years and ultimately for a lifetime without recharging.”
Americans were eager to hear that there could be a great new, constructive role for atomic energy in peacetime. The New York Times pronounced Sarnoff’s predictions “prophetic” and forecast that the atomic battery would also fuel “hearing aids and wrist watches that run continuously for the whole of a man’s useful life.”
“Who cares about solar energy?” crowed the head of RCA Laboratories. “Look, what we really have is this radioactive waste converter. That’s the big thing that’s going to catch the attention of the public, the press, the scientific community.”
“Radioactive waste” was right. The atomic battery ran on strontium-90, one of the most hazardous elements of nuclear waste.
All that pesky radiation stuff would ultimately preclude development of the atomic hearing aid—or, for that matter, the atomic anything. At Bell Labs, meanwhile, Calvin Fuller suspected that a problem with his team’s solar energy cell was that the p-n junctions tended to be too far from the surface of the cell for enough sunlight to penetrate. Fuller cut the silicon for the cells into long, narrow strips, then added a smidgen of arsenic and coated it all with an extremely thin layer of boron in a furnace. Eureka! The arsenic created p-n junctions, and the boron kept them very close to the surface of the cell, making it possible to capture sunlight and convert 5.7 percent of it into electricity.
On April 25, 1954, just three months after “General” Sarnoff’s announcement, Bell Labs announced that its “solar cells,” linked together, delivered “power from the Sun at the rate of 50 watts per square yard, while the atomic cell recently announced by the RCA Corporation merely delivers a millionth of a watt” over the same area. The New York Times announced “the beginning of a new era, leading eventually to the realization of one of mankind’s most cherished dreams—the harnessing of the almost limitless energy of the sun for the uses of civilization.”
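The arithmetic behind Bell’s claim checks out. As a rough sanity check (not part of the original announcement), assuming the standard clear-day figure of about 1,000 watts of sunlight per square meter, a cell at Bell’s 5.7 percent efficiency lands close to the advertised 50 watts per square yard:

```python
# Rough check of Bell's "50 watts per square yard" figure.
SOLAR_IRRADIANCE_W_PER_M2 = 1000.0  # assumed peak clear-day insolation
SQ_YARD_IN_M2 = 0.8361              # one square yard in square meters
EFFICIENCY = 0.057                  # Bell Labs' 5.7 percent cell

# Power delivered by one square yard of cells in full sun.
power_per_sq_yard = SOLAR_IRRADIANCE_W_PER_M2 * SQ_YARD_IN_M2 * EFFICIENCY
# Comes out to roughly 48 W, close to the announced 50 W per square yard.
```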
Not just yet. The low cost and ready availability of other fuels constrained the further development of solar cells for decades. But constant improvements on the photovoltaic cell, plus the need to combat global climate change, have made it a competitive energy source and the most likely energy of the future. Best of all, we won’t glow in the dark ourselves.
the genius details
Far beyond the original Bell Labs goal of 6 percent efficiency, the most efficient photovoltaic cells today reach 46 percent efficiency, or more than twice the 20 percent efficiency of internal combustion engines. Most commercial solar panels in use today, though, routinely achieve 15 to 18 percent efficiency, with some above 23 percent.
Bell Labs’ announcement of its new solar cell in 1954 included a press conference in which the cell fueled a twenty-one-inch Ferris wheel and powered a radio transmitting a song.
The new solar cells were of immediate interest to the US military, which used solar panels to help power the Vanguard 1 satellite. Solar panels would go on to play a key role in powering geostationary communications satellites.
In 1954, the world had less than a single watt of solar cells capable of running electrical equipment. By 2004, over a billion watts of solar power were in active use, and solar energy was a $3 billion to $4 billion industry in the United States alone.
As of early 2015, some 784,000 American homes and businesses operated on solar power, and there were over 174,000 people employed in the solar power industry.