1

WEAPONS OF MASS
CONSUMPTION


And he shall judge among the nations, and shall rebuke many people:
and they shall beat their swords into plowshares, and their spears
into pruning hooks: nation shall not lift up sword against nation,
neither shall they learn war any more.

—ISAIAH 2:4

After total war... total living.
—AD IN LIFE MAGAZINE FOR REVERE COPPER AND BRASS
INCORPORATED, OCTOBER 1942

The view from atop Coventry Cathedral differs from that of many of Europe’s old towers. Rather than cobblestone streets and centuries-old edifices, Coventry’s oldest structure is surrounded by shopping malls. Off in one direction you can see a soccer stadium, in another a big blue Ikea store. There are few tourists scrambling for photos of landmarks here, just busy locals hurrying to get some shopping done during their lunch breaks as police cars zip by on the paved roads, sirens wailing. Radiating out from the cathedral in all directions are the hallmarks of twentieth-century construction: steel, concrete and glass buildings, both low- and high-rise, housing businesses and residences. There is little trace of antiquity in this city of more than 300,000 inhabitants, smack dab in the middle of England. Coventry resembles the new, manufactured metropolises of North America more than it does the Old World. But that wasn’t always the case.

Coventry has had three cathedrals in its history. The earliest, dedicated to St. Mary, was established in 1043 as a Benedictine community by Leofric, the Earl of Mercia, and his wife, Lady Godiva—the same woman who, according to legend, rode a horse naked through the streets to protest the excessive taxes her husband had imposed on residents. A statue commemorating her ride now stands in the middle of a pedestrianized outdoor shopping mall, just a stone’s throw from the current cathedral.

Over the next few centuries the settlement around the church grew, mainly on the strength of its textile trade, and by 1300 Coventry was England’s fourth-largest city. During the following two centuries, it became the kingdom’s temporary capital on several occasions when the monarchy relocated to avoid rebellions in London. St. Mary’s was also replaced by a grand Gothic church, St. Michael’s, but this fell into disrepair during the sixteenth century when Britain’s monasteries were dissolved. By the early twentieth century, Coventry had evolved into a major manufacturing centre, particularly for cars—the city was the Detroit of the United Kingdom, headquarters for heavyweights Jaguar and Rover—and the population had risen to a quarter of a million. St. Michael’s church, meanwhile, was elevated in status in 1918 to become the city’s second cathedral. With the outbreak of the Second World War, Coventry, touted by its government as the best preserved medieval city in England, also became one of the country’s top producers of airplanes and tanks, a status that made it a prime target for the Nazis.1

On the evening of November 14, 1940, German bombers commenced Operation Moonlight Sonata, Hitler’s most ambitious and vicious attack on England. Luftwaffe bombers pounded Coventry with wave after wave of high explosives and incendiary bombs from dusk till dawn, killing more than 550 civilians, injuring thousands, destroying more than 4,300 homes and damaging three-quarters of the city’s factories.2 St. Michael’s Cathedral, the city’s figurative and historical heart, was destroyed, save for the miraculous sparing of its spire. The German attack, intended to hurt England’s production capability and soften it up for an all-out invasion, was called “one of the worst bombardments from the air since the Wright brothers presented wings to mankind.”3 Hermann Göring, the Luftwaffe commander, boasted of the destruction and coined a new word to mark the occasion: Koventrieren, to “Coventrate” or raze to the ground.

In an editorial decrying the bombing, the New York Times pointed out that the horror of the Blitz had only happened because there was no defence against such a night-time assault. Anti-aircraft guns and mines strapped to balloons, a tactic seemingly borrowed from a Road Runner cartoon, brought down only a handful of attacking planes, which meant that “other great industrial centers and ports in England are exposed to the same fate whenever the moon is bright and the weather favorable to raiders.” Until a new defence could be developed, Prime Minister Winston Churchill’s warnings that “death and sorrow will be our companions on the journey, hardship our garment, constancy and valour our only shield” would continue to ring true.4

The development of such a defence was secretly underway in the ruins of Birmingham, which had been similarly “Coventrated.” Physicists John Randall and Harry Boot were experimenting with an improved version of the cavity magnetron, a copper tube that generated microwaves. At the magnetron’s centre was a cathode that pumped out electrons, which were spun around the tube by an attached electromagnet, a process that gave the device its name. The electrons spun past cavities drilled into the tube and produced radio waves. Those waves were then emitted and, if they hit something, bounced back to their source. This echo effect let the device operator know that something had been detected, and pinpointed the object’s position. Earlier versions of the magnetron, developed in the United States and Germany, were of limited use because they didn’t generate much power and had poor range. Randall and Boot boosted the tube’s power output a hundred-fold by drilling eight cavities into the magnetron instead of the standard two, and by adding a liquid cooling system. The result was a more powerful magnetron that was compact enough to be fitted into aircraft. The British government believed that giving planes the ability to see enemies at night would be a major advantage, perhaps enough to turn the tide of the war. The problem, however, was that Britain was cut off from its traditional European allies, now all under Hitler’s thumb, and lacked the production capacity and manpower to produce the device in the large numbers needed.
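
The arithmetic behind that echo effect is straightforward: the pulse travels at the speed of light, so the round-trip delay of the returning echo gives the target's distance directly. The short sketch below is purely illustrative (the delay values are invented, not taken from any wartime equipment) and simply converts an echo delay into a range:

    # Radar echo ranging: distance = (speed of light x round-trip delay) / 2.
    # Illustrative sketch only; the echo delays below are invented examples.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458

    def echo_range_km(round_trip_delay_s: float) -> float:
        """Convert a measured echo delay into a one-way distance in kilometres."""
        return SPEED_OF_LIGHT_M_PER_S * round_trip_delay_s / 2 / 1000

    # A bomber roughly 60 km out returns its echo in about 0.4 milliseconds.
    for delay_s in (0.0004, 0.001, 0.002):
        print(f"echo after {delay_s * 1000:.1f} ms -> target about {echo_range_km(delay_s):.0f} km away")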

Bring in the Yanks

Britain turned to its long-time pal, the United States, for help. With the Nazis pressing and time running out, Henry Tizard, head of British aeronautical research, set out on a voyage across the Atlantic in late September 1940, taking with him the nation’s most valuable technological secrets, including blueprints and diagrams for explosives, rockets, self-sealing fuel tanks, the beginnings of plans for an atomic bomb and, the main attraction, the magnetron. Tizard put the magnetron’s fate into the hands of Vannevar Bush (no relation to the presidents), an inventor, electrical engineer, entrepreneur and patriot who resembled a beardless Uncle Sam.5

In his early adulthood in the 1910s, Bush had supplemented his undergraduate studies at Tufts College near Boston by working at General Electric and as research director for American Radio, a small company started by his fellow students Charles Smith and Al Spencer. (The company achieved some minor success during the First World War with Smith’s invention of the S-Tube, which eliminated the need to use batteries in radios, but was all but wiped out by the Great Depression.) In 1917 Bush received his doctorate in electrical engineering from Harvard and the Massachusetts Institute of Technology (MIT), and by 1923 had become a professor at the latter. In 1922 Bush and fellow Tufts engineering student Laurence Marshall teamed up with Smith and set up the American Appliance Company to market another of Smith’s inventions, a refrigerator with no moving parts—its solid state making it less prone to breaking—but failed miserably when they found no takers. The trio’s backup plan was an improved version of the S-Tube. They brought Al Spencer back on board, along with his younger brother Percy, and by 1925 American Appliance was earning a profit.6 To avoid problems with a similarly named company operating out of the Midwest, the group renamed the business Raytheon Manufacturing—adding “theon,” Greek for “of the gods,” to the rays of light their tubes produced. For the beleaguered British, Raytheon proved to be a godsend indeed.

In his public service life, Bush had helped develop a submarine detector for the American government during the First World War, but the system was never used because of bureaucratic confusion between industry and the military. “That experience forced into my mind pretty solidly the complete lack of proper liaison between the military and the civilian in the development of weapons in time of war, and what that lack meant,” he later recalled.7 In 1932 Bush became vice-president and dean of engineering at MIT, then moved to the prestigious Carnegie Institution of Washington as president in 1939, to be closer to the corridors of government power. Lack of co-operation was something he would not tolerate during the new conflict. Along with a group of fellow science administrators, including MIT president Karl Compton, Bush pitched President Franklin D. Roosevelt on an organization that would oversee research and development work between industry and the military. Bush showed Roosevelt a letter that proposed his National Defense Research Committee and the president approved it on the spot. “The whole audience lasted less than ten minutes ... I came out with my ‘OK-FDR’ and all the wheels began to turn,” he later wrote.8 On June 12, 1940, the American military-industrial complex was born, with the patriot Vannevar Bush as its beaming father.

Bush was the chairman of the new NDRC while Compton was put in charge of developing radar. The first few meetings between Tizard’s delegation and the new American military-industrial brain trust were cautious, like a high-stakes poker game with neither side wanting to reveal its hand. Compton tentatively showed the British visitors the low-powered magnetrons developed by American scientists, which thawed the atmosphere between the two camps. After seeing that the two nations were on the same path, Tizard proudly demonstrated the high-powered British magnetron to the astonishment of his hosts, prompting the envious Compton to order the immediate establishment of the Radiation Laboratory at MIT to develop the device further. Large electronics manufacturers including General Electric, Bell Telephone and Western Electric were brought in to mass-produce the magnetron, but they encountered a problem: because the gizmo had to be machine-chiselled from a solid copper block, producing it in mass quantities was difficult, time-consuming and expensive.

Both Compton and Bush, who by now had extricated himself from the day-to-day operations of Raytheon but still held a seat on its board of directors, knew the company’s talented lead inventor, Percy Spencer, well. Raytheon was small compared with the likes of GE and Bell, but the company was just down the road from MIT in Waltham, Massachusetts, so Spencer was called in to take a look at the magnetron.9

Percy Spencer was an orphan and, as a child, poor as dirt. His father died when he was eighteen months old and his mother abandoned him soon after, leaving him to be raised by his aunt and uncle in Howland, Maine. More bad luck struck at the age of seven when his uncle died. Spencer spent his childhood doing country chores such as saddling horses and chopping wood, and was so poor he used to hunt to eat. From the age of twelve he worked at a spool mill, starting before dawn and continuing on until after sunset.

The enterprising youngster was extraordinarily curious, though, and when it came time to install electricity in the mill, he volunteered to do it. He learned by trial and error and emerged from the project a competent electrician. When the Titanic sank in 1912, his imagination was sparked by the heroism of the radio operators who had helped rescue survivors. So he joined the navy and learned wireless telegraphy: “I just got hold of textbooks and taught myself while I was standing watch at night,” he later recalled.10 His self-education went so well that the navy made him head of wireless production during the First World War. By 1940 the Raytheon engineer was renowned among scientists at MIT. “Spencer became one of the best tube designers in the world; he could make a working tube out of a sardine can,” one said.11

This reputation served Spencer well when he asked if he could take the magnetron, Britain’s most closely guarded technological secret, home for the weekend. It was like asking the Queen if he could borrow the Crown Jewels. But with the combined brain trust of MIT vouching for Spencer, Henry Tizard reluctantly gave his blessing. Spencer returned with what now seems like a no-brainer of a suggestion: rather than carving the magnetron out of a single lump of copper, why not create it piecemeal from several sections?

Western Electric had already been awarded a $30 million contract to manufacture the magnetron tubes, but was only managing to produce about fifteen a day using the machining method. Spencer promised he could outdo that production with his alternative procedure, so MIT gave Raytheon a contract to make ten tubes. Raytheon president Marshall then made a bet-the-company decision by investing in a new building and the special equipment required for the process, including a hydrogen oven.12 Within a month, Raytheon was making thirty magnetrons a day, twice Western Electric’s output. With Spencer’s promise fulfilled, the contracts started to roll in. Before long, the company was manufacturing the majority of the magnetrons for American and British forces. By the end of the war, Raytheon was pumping out nearly 2,000 magnetrons a day,13 about 80 percent of all the devices used by the Allies.14 Spencer and Marshall’s gamble had paid off handsomely. In 1945 Raytheon pulled in revenue of $180 million, a staggering jump from $1.5 million before the war.15

More importantly, the gamble paid huge dividends for the Allies. From early 1941, when the new magnetron-powered detection system began to be installed, British and American planes had air superiority over their German rivals. The new system, dubbed “radio detection and ranging” or “radar,” persuaded Hitler to permanently cancel his already-delayed invasion. Radar ultimately saved an inestimable number of lives. During the first two years of the war, German bombs killed more than 20,000 London residents. In 1942, after radar had been fully installed, the number of fatalities plummeted to a mere twenty-seven.16 The scale of the horror experienced in Coventry in the fall of 1940 was never seen again in Britain. The country’s remaining architectural treasures, including the massive Gothic edifices of Wells Cathedral and Winchester Cathedral, escaped the war largely unscathed; thanks to the RAF’s secret weapon, England’s storied past survived to be admired by future generations. A new, modernized Coventry Cathedral, also dedicated to St. Michael, was built right next to the old one after the war, becoming the city’s third cathedral.

In the later years of the war, Raytheon expanded beyond magnetron tubes into building whole radar systems, which were then installed on American ships in the Pacific. “With radar we could see the Japanese warships at night,” says Raytheon archivist and former vice-president Norman Krim, who has been with the company in various executive roles since its beginning. “They had no idea we could see them and that turned the war around.”17 Vannevar Bush shared that view in his memoirs, where he wrote that radar’s importance to ending the war was surpassed only by the atomic bomb.18 James Phinney Baxter III, the official historian of the U.S. Office of Scientific Research and Development, was no less effusive: “When the members of the Tizard Mission brought the cavity magnetron to America in 1940, they carried the most valuable cargo ever brought to our shores.”

The magnetron’s military impact is hard to overstate. The scientists who developed radar had an easy moral justification: they were working on a defence system for an unjust war fought against an evil enemy. As with all technology, however, radar also had its dark side. Just as it saved thousands of lives, it also helped end many more. Radar guided the Enola Gay to its destination, Hiroshima, where it dropped the atomic bomb that killed an estimated 140,000 people, and helped Bockscar find Nagasaki, where another 80,000 were killed by the second bomb.19 Radar has been installed in every guidance system, fighter jet and bomber used in every war since, bringing its total death count to date to an inestimable figure. Journalists who hailed the invention as “our miracle ally” in 1945 also correctly identified radar’s dual nature by tracing it back to its roots. “In a very real sense it represents the mythical death-ray by giving accurate precision so that the death stroke may be delivered,” said a New York Times editorial.20

Radar in the Kitchen

When the war ended, Raytheon’s fortunes sank just as fast as they had climbed. The American government had ratcheted up its defence spending as the war progressed, devoting almost 90 percent ($82.9 billion) of its entire 1945 budget to military expenditure. The following year, that spending decreased dramatically to just over three-quarters of the total budget, then plummeted to 37 percent ($12 billion) in 1947.21 Raytheon was scrambling. At the end of the war the company had employed 18,000 people but was down to 2,500 by 1947. Profit dropped to $1 million by 1956,22 from $3.4 million in 1945.23 Krim, a young engineer at Raytheon in its early days, remembers how dismayed Percy Spencer was. “He said, ‘What the hell am I going to do?’” Krim recalls. “No more war, no more radar, no more magnetrons. ‘I’ve got to find some use for these magnetrons to keep these people working.’ There was a mad rush for products we could make.”24

Raytheon was able to sell radar devices to commercial shipping operations, including public services such as ferries, but needed to find a broader civilian market for its invention if it was going to stay afloat. The company’s first real foray into the wider consumer market was the poorly thought-out Microtherm, a gadget that used the heating properties of the magnetron to treat a variety of ailments, including bursitis and arthritis. The equipment, sold only to doctors, medical suppliers and institutions, could heat “any area, allows temperature penetration of as much as two inches and increases blood circulation by 250%,” according to news reports at the time.25 As smart as Spencer was, frying away aches and pains with microwave radiation was simply not one of his better ideas. Doctors in the forties and fifties agreed and the Microtherm sold poorly. Krim, who by the sixties had risen through the Raytheon ranks to become the company’s “undertaker”—the person called in to dispose of unwanted assets—sold off the money-losing Microtherm business in 1961.26

The magnetron’s ultimate commercial use was found by accident. Near the end of the war, Spencer was experimenting with magnetrons in his lab and noticed that a chocolate bar in his pocket had melted. Curious about the device’s heating effects, he brought in some popcorn kernels, which popped after being exposed. The next day, he exploded an egg using its heat waves. (I remember making the same discovery at age eight, when I blew up an entire pack of hot dogs in our microwave, much to the dismay of my screaming mother.) Spencer knew he was on to something so he applied for and got a patent on microwave cooking. A team of engineers set to work on transforming the magnetron into a cooking device and before long, their efforts bore fruit: they created an oven that heated the water molecules in food but left moisture-free ceramic or plastic containers cool.
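
The heating at work in that first crude oven comes down to simple energy bookkeeping: the microwaves dump energy into the water in the food, and the water's heat capacity sets how quickly its temperature climbs. As a rough, hypothetical sketch (the oven power and absorption figures below are assumed for illustration, not Raytheon specifications):

    # Back-of-the-envelope: time for a microwave oven to heat a quantity of water.
    # energy = mass x specific heat x temperature rise; time = energy / absorbed power.
    # The 700 W power and 50% absorption figures are assumptions for illustration.
    SPECIFIC_HEAT_WATER_J_PER_KG_C = 4186

    def heating_time_s(mass_kg: float, temp_rise_c: float,
                       oven_power_w: float = 700, absorbed_fraction: float = 0.5) -> float:
        """Seconds needed to raise mass_kg of water by temp_rise_c degrees Celsius."""
        energy_j = mass_kg * SPECIFIC_HEAT_WATER_J_PER_KG_C * temp_rise_c
        return energy_j / (oven_power_w * absorbed_fraction)

    # A 250 ml cup of water warmed from 20 C to 80 C takes roughly three minutes.
    print(f"{heating_time_s(0.25, 60):.0f} seconds")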

The first microwave ovens were hulking behemoths. They stood nearly two metres tall, weighed three hundred kilos and were the size of a refrigerator. They weren’t cheap, either; Raytheon sold them mainly to large restaurants, hotel chains, ocean liners and railways for between $2,000 and $3,000, or the equivalent of $22,000 to $34,000 in 2010 terms.27 They were made of solid steel, with lead-lined interiors to prevent the microwaves from escaping. Their name, however, was perfect: the Radarange.

Large industrial customers loved the Radarange because it dramatically cut down on cooking times. The oven cooked a potato in four minutes, a ten-ounce sirloin steak in fifty seconds, hot gingerbread in twenty seconds and a lobster in two-and-a-half minutes. The highly competitive steamship industry—where cruise liners emphasized speed, style, luxury and, above all, cuisine—particularly prized them. Potato chip makers such as Lay’s also greatly preferred the microwave ovens to their traditional infrared counterparts for drying chips that had just been cooked in oil. Drying with infrared ovens took days while the Radarange did the trick in minutes.

Raytheon tried to expand its market with the first Radarange for the home in 1955, but its enormous expense—about $1,200, or the equivalent of $9,000 today—meant few sales.28 There were also Microtherm-like safety concerns; many families weren’t sure if they wanted to be near a radiation-emitting device. By 1957 only a few thousand had found their way into American homes.29 Five years later, the ovens had dropped in price to just under $800, but that was still beyond the means of most families, and only 10,000 units had sold.30 Still, some consumers recognized the irreversible hand of progress when they saw it. “This is not a trend,” one housewife said. “The only thing I don’t cook in my electronic range is coffee. It is a time saver because I can prepare dinner in a half an hour.”31

Raytheon’s new president Thomas Phillips shared that sentiment, even though the company had lost millions on the Radarange by 1965. He felt the only way to get a return on investment was to speed the oven’s adoption in the home, so he acquired Iowa-based Amana Refrigeration and transferred Raytheon’s knowledge of microwave ovens to the freezer maker. Krim recalls that Amana president George Foerstner’s plan to spur sales was simple. “He said, ‘I don’t give a damn what’s inside that box, it has to sell retail for less than $500.’” The home-appliance maker succeeded where the military contractor failed—by squeezing production efficiencies into the manufacturing process. Amana not only brought the Radarange’s price down to under $500, it also shrank the oven to fit on a countertop. Helped by government safety regulations that reassured consumers, sales boomed. Estimates pegged sales of microwaves in 1975 at 840,000, with Amana predicting that 10 percent of American homes would have one by early 1978. The ovens took off even faster in Japan, where safety concerns were less prevalent; about 1.5 million were sold in 1975, representing about 17 percent of households.32

The secret behind the ovens’ success was best summed up in a 1976 New York Times article. Estelle Silverstone, a New York attorney whose husband was a radiotherapist, was quoted reflecting on the new reality facing women—that of a double-income, dual-career family, short on time for meal preparation. “I’ve had a microwave for seven years. I don’t think I could live without it,” she said. “Leftovers don’t taste like leftovers anymore. I hate to clean up and there are no pots and pans. It’s not a substitute for a conventional oven, but I find it indispensable.”33

The age of cooking convenience had finally arrived in the home. The microwave oven was the perfect invention for a postwar society that put a premium on speed. With both parents going off to work in an increasing number of families, spare time was becoming more and more precious. The microwave helped facilitate those busy lives.

By the early twenty-first century, 96 percent of American homes34 and 87 percent of British homes had a microwave oven.35 Today, about 350 million are in use around the world and another twenty million are sold each year. The microwave has reached such a level of ubiquity that it is no longer considered the iconic aspirational purchase it once was. In Britain, where the magnetron was invented, the microwave was removed from the basket of goods used to measure the cost of living in 2008. The ovens, which can now be had for as little as $25, have become so cheap that their price no longer provides a useful indicator of consumer trends. “We have to make room for new items in the basket and microwaves are no longer different to any other household appliance,” a British statistician said.36 The microwave’s invasion of the home is complete, with the previously high-tech device now as mundane as a toaster or can opener.

The Microwave’s Sidekicks

The Radarange didn’t revolutionize home cooking on its own, though. It had lots of help in the form of new plastics such as Teflon and Saran that were also side effects of weapons development. Teflon, for one, was a direct by-product of the Manhattan Project.

In 1942 U.S. Brigadier-General Leslie Richard Groves, the military commander of the atomic bomb project, twisted the figurative arm of chemical and explosives maker DuPont to help. The company had wanted to steer clear of the conflict after being accused of profiteering during the First World War for selling munitions to Britain and France before the United States joined in. DuPont accepted Groves’s task reluctantly and limited itself to an official fee of one dollar37 after the general argued that the bomb would shorten the war and prevent tens of thousands of American casualties.38 His argument was likely strengthened by the fact that President Roosevelt’s daughter-in-law Ethel was also the DuPont family’s heiress. Appearances aside, DuPont took on the key responsibility of producing plutonium, the man-made element bred from uranium in nuclear reactors and then chemically separated out. The company embraced the mission with zeal and selected Hanford, a small, remote town along the Columbia River in Washington State, as the site of its main production facility. By late 1944, after an investment of several millions toward building nuclear reactors, chemical separation plants, raw material facilities, acres of housing and miles of roads, the once desolate town had grown to become the third largest city in the state, with a population of 55,000.39 Hanford was in fact the largest plant DuPont had ever constructed.40

Plutonium production was a laborious and expensive process that required miles upon miles of pipes, pumps and barriers. A speck of dust, grime or grease could ruin the entire system by entering through a tiny pinhole, yet a sealant that could perform a perfect patching job did not exist. DuPont decided to try out a substance that research chemist Roy Plunkett had accidentally discovered in 1938 at one of its labs in New Jersey. While experimenting with refrigerants, Plunkett opened a container of tetrafluoroethylene, only to find that the gas inside had solidified into a white resin. He found the new substance, which he dubbed polytetrafluoroethylene, to be extremely slippery and resistant to chemicals and heat. DuPont tested the substance as a sealant in its plutonium plant and found it plugged all the pipes and pumps perfectly. It was also put to use as a non-corrosive coating for artillery shell nose cones and as a high-performance lining for liquid fuel storage tanks, tasks at which it also excelled. The company patented the substance in 1941 and trademarked it just before the war ended under the name Teflon.

The substance was first sold in 1946 as a sealant for the electronic, chemical and automotive industries and took off in the late fifties once a home use was found. In 1954 French engineer Marc Grégoire invented a process for bonding Teflon with an aluminum frying pan, with which he launched his Tefal company. Consumers, happy about no longer having to fry their food in a pound of butter to stop it sticking to the pan, snapped up Tefal’s product (and the inevitable clones) in droves. By 1961 the company was selling a million pans a month.41 Teflon’s use expanded again in 1969, when American engineer Bob Gore discovered it could be stretched into a porous, super-strong filament. His new version of Teflon turned out to be an excellent transmitter of computer data and a good material for surgical supplies. Its ability to keep out liquid water while letting water vapour escape also meant it was the first material that could actually “breathe,” which made it ideal for waterproofing clothing. After several years of development, “Gore-Tex” clothes hit the market in 1980, and skiers would never again have to come home soaking wet.

Saran Wrap also had its origins during the war, and like so many good inventions, it too was an accident. In what reads like the origin story of a comic-book super hero, Ralph Wiley, an unwitting college student working at Dow Chemical’s labs in Michigan, was performing his chores one night when he found some beakers he couldn’t scrub clean. He dubbed the green substance stuck to them “eonite” after an indestructible metal that was supposed to save the world from the Great Depression in a Little Orphan Annie comic strip. Upon examining the goo, Dow researchers gave it the more scientific name of polyvinylidene chloride (PVDC). Wiley didn’t end up gaining super powers, but Dow did turn the substance into a greasy green film, which was dubbed Saran, and tested it during the war by spraying it on fighter planes. Saran did a good job at keeping out oxygen and water vapour and was perfect for protecting the planes on aircraft carriers from the spray of salt water. The substance saved the navy time and effort by allowing planes to be shipped on the decks of aircraft carriers, rather than disassembled and stored below decks in pieces. Guns were also wrapped in the protective plastic, like death-dealing lollipops. A wartime ad from Dow proclaimed that when “men on our fighting fronts throughout the world ... unpack a machine gun they find it protected from moisture with Saran Film. There are no coatings of grease to be removed—no time lost. The gun slips out of its Saran Film envelope clean, uncorroded, ready for action!”

After the war, Saran went commercial. By 1950 the plastic was being sprayed onto everything from bus seats to clothes to drapes, all of which it made more water-resistant. Dow’s revenue climbed steadily thanks to Saran42 and by 1952 the company was churning out more than fifty million tons of the stuff.43 The plastic’s real impact, however, came a year later, when it was turned into a clear, clingy film that could be stretched over food, allowing people to store leftovers in the refrigerator. Saran Wrap was the perfect partner for the Radarange; the plastic kept leftovers from spoiling long enough to be reheated in the microwave. Buoyed by sales of new plastic products, particularly Saran Wrap, Dow ended the decade with record sales and profits.44 Saran and other competing plastic wraps were becoming ubiquitous kitchen fixtures.

The most important plastic to come out of the war, however, was polyethylene. This highly versatile, variable-density substance was discovered, again by accident, before the war by British researchers working for Imperial Chemical Industries in London. In their search for new plastics, Eric Fawcett and Reginald Gibson found that a mixture of ethylene and benzaldehyde produced a white, waxy substance. The experiment yielded a usable result only because it had been contaminated with an unknown amount of oxygen, an accident that took the scientists years to recreate. Polyethylene wasn’t truly born until 1939, just in time for the war, when it became the primary material for insulating cables on another new British invention: radar. Although Germany developed its own detection system during the war, its scientists never did come up with polyethylene, which meant its troops faced a disadvantage in situations where moisture was a factor. German boats and planes travelling through rain and clouds often saw their radar malfunction.

DuPont licensed polyethylene from ICI, but aside from insulating radar and other telecommunications equipment, the company didn’t know what to do with it.45 Late in the war it gave one of its former engineers, Earl Tupper, a few tons of the plastic to play with, which he used to make gas masks and signal lamp parts. Tupper struck it rich after the war by using the plastic to create a range of storage containers with liquid- and airtight sealable lids.46 The containers, egotistically dubbed “Tupperware”—not that Gore-Tex was particularly modest—showed DuPont and others the way in plastics. Before long, polyethylene was everywhere: in dishware, furniture, bags, toys (including two of the biggest crazes of the fifties, the Frisbee and the hula hoop), shampoo and soda pop bottles, packaging, pens and even clothes. The plastic’s versatility and uses were limited only by manufacturers’ imaginations. ICI and DuPont proved to be the most imaginative and established themselves as the biggest plastics makers in the world. For his part, Tupper sold his Tupperware company to Rexall Drugs in 1958 for a reported $10 million, which he used to buy a small Central American island where he lived as a hermit until his death in 1984. This only sounds odd until you compare it with the fate of Richard James, the inventor of the Slinky, as we’ll see in chapter four.

The Dark Side of Plastic

For all the Allied advances in plastics, it was actually Germany that led the way in the development of synthetic materials. The Nazis had two good reasons for their accelerated research: Germany had experienced material shortages more acutely than any other nation during the First World War, and after that conflict it was prohibited by Allied sanctions from stockpiling resources that could be used for armaments. As a result, the German people were already familiar with synthetic or “ersatz” products by the twenties. In 1925 a group of chemical companies were brought together into the Interessen-Gemeinschaft Farbenindustrie conglomerate, better known as I.G. Farben, as part of a strategy to create materials that could circumvent treaty limitations and allow Germany to prepare for future wars. Over the next decade, the conglomerate hired the country’s best scientists, who then rewrote the book on polymers—chemical compounds made up of a large number of identical components linked together like a chain. With their government-approved mandate, Farben chemists synthesized an average of one new polymer every day for ten years.47 When the Nazis came to power in 1933, Hitler immediately recognized the value of plastic and put Germany’s scientific community at the disposal of the state. “It is the task of science to give us those materials nature has cruelly denied us,” he said. “Science is only free if she can sovereignly master those problems posed to her by life.”48 By the time the war broke out in 1939, Germany’s military machine was largely synthetic, and more than 85 percent of its strategic materials were being made by Farben.

Hitler knew Germany would be cut off from two key resources—oil and rubber—once the war began, so he urged Farben to come up with synthetic alternatives. The results were two big hits: a hydrocarbon made by mixing carbon dioxide, carbon monoxide and methane, which German tanks and other vehicles could use as fuel; and a new, plastic type of rubber. The synthetic rubber was created by bonding together two simpler compounds, butadiene and styrene, into a so-called “copolymer.” Farben called its new substance an elastomer for its elastic properties and officially dubbed it Buna, a contraction of butadiene and Na, the chemical symbol for sodium, which was the catalyst for the polymer reaction.49

Farben’s most notorious invention, however, was Zyklon B, the pesticide used by the Nazis to gas concentration camp victims. The chemical conglomerate profited heavily from its association with the Nazis and their concentration camps, both through the slave labour the camps provided and, even more horrifically, from the bountiful supply of test subjects. At its peak, Farben’s factory at Auschwitz in Poland alone made use of 83,000 slave labourers and an undocumented number of unwilling human guinea pigs.50 The conglomerate and many of its most zealous scientists faced justice in the Nuremberg trials that followed the war (thirteen of its directors were found guilty of war crimes and served prison time), but much of the work lived on through component companies after Farben was dismantled in 1951. Several of today’s largest multinational firms owe part of their post-war successes to the often ill-gotten intellectual property inherited from Farben, including film manufacturer Agfa-Gevaert, chemical maker BASF and pharmaceutical companies Sanofi-Aventis (derived from a merger of Farben spinoff Hoechst and France’s Rhône-Poulenc Rorer) and Bayer.

Plastics also caused their share of physiological damage. Through the fifties and sixties, research showed that workers producing synthetic substances were vulnerable to developing a host of medical conditions, including heart arrhythmia, hepatitis, gastritis, skin lesions, dermatitis and cancer. Worse still, some plastics—particularly polyvinyl chloride (PVC) and polystyrene—were found to leach chemicals into food and cause cancer in unsuspecting consumers. By the seventies, the public was wary of plastic and companies using it began to feel the backlash. Coca-Cola, for one, was set to introduce the world’s first plastic soft drink bottle in 1977, but had to pull the plug amid fears that the product could cause cancer. Coke’s bottle, made of acrylonitrile styrene, was designed in conjunction with chemical giant Monsanto at an estimated cost of $100 million. Monsanto’s product application stated that low levels of about fifty parts per billion of carcinogenic plastic “may form and migrate into the beverage”—a negligible amount, but it was enough for the Food and Drug Administration to reject the bottle.51 Monsanto closed its bottle manufacturing plants and Pepsi beat Coke to the punch with a polyethylene terephthalate (PET) bottle, designed by DuPont, that passed FDA muster.

Coke’s first attempt at a plastic bottle—it eventually introduced an FDA-approved product in 1978—was just one casualty of the public’s growing unease with plastics, a discomfort that became part of the battle between consumer advocates and food processors that continues today. In the past few years, for example, health authorities in several countries have banned the plastics chemical bisphenol A from baby bottles because tests have linked it to cancer and hormone imbalance.

In addition to health risks, plastics are also strongly associated with environmental damage. Most plastics degrade very slowly, which means that the ketchup bottle in your fridge is likely to be around long after Armageddon. By the eighties, this presented a huge problem for overflowing garbage landfills around the world. Late in the decade, large corporations started to feel pressure from consumer groups to limit their plastic waste output and institute recycling programs. In 1987 one group, the Citizens Clearinghouse on Hazardous Waste, found that McDonald’s alone was contributing more than a billion cubic feet of foam packaging waste each year.52 Along with another grassroots group, the Vermonters Organized for Cleanup, the CCHW pressured the fast-food chain into switching to recyclable paper packaging in 1990, a move McDonald’s said would reduce its waste output by about 90 percent.53 For environmentalists, the chain’s switch was a major win, but it was only a small victory in the battle against an overflowing tide of permanent, non-biodegradable waste, a struggle that continues today.

But health and environmental concerns were the furthest thing from the collective minds of people in the fifties and early sixties. They had made it through the worst economic crisis and wars in human history and they wanted a chance to stretch their legs and live it up. As the Life ad said, after total war came total living. The microwave oven freed up people’s time from chores like cooking, while plastics provided a veritable cornucopia of new products on which hard-earned salaries could be spent. These consumer goods kicked off a new lifestyle, one dedicated to instant gratification, prosperity and indulgence, the diametric opposite of life during the thirties and forties. These weapons of mass consumption, derived from inventions that helped perfect the art of killing, forever changed daily life and paved the way for the sort of total living that would come in later decades.