The road we have traveled to our current state of eating is actually a very long, interconnected highway. After World War II, American national security strategists decided that protecting the homeland required building a network of broad interstates that mirrored the German Autobahn. This monumental road-building project—now close to 47,000 miles long—was initially conceived as a way to efficiently move troops and military machinery, but it has also had dramatic peacetime consequences for the American landscape, and for the American diet.
Suddenly, big, safe interstates—and the millions of miles of ring roads, state roads, and town roads they encouraged—allowed people to live farther and farther from the cities where they worked. People moved out of cities in droves, looking for new places to live. Land prices outside cities skyrocketed, and small farmers occupying that land had a hard time resisting when real estate developers came to call.
Suburban development hit small American farms like a virus. In the 1950s alone, some 10 million people left family farms. Chances are, your grandparents (or even your parents) can tell you stories about all those farms in your area that over the last few decades have been turned into subdivisions and shopping malls. In Maryland, where I live, suburban development has replaced 900,000 acres of farmland (and 500,000 acres of forest) in just the last forty years.
All these new roads, and the suburbs and industries to which they gave birth, caused a second tectonic shift in American culture: in the way we came to eat. Car-friendly fast-food chains like McDonald’s and Carl’s Jr. and Burger King started popping up along the new highways like weeds. By the early 1960s, Kentucky Fried Chicken was the largest restaurant chain in the United States.
These restaurants did not cook, exactly; what they did was heat up highly processed, prepackaged foods that tasted exactly the same, whether you were in Dallas or Des Moines. The ingredients didn’t need to be fresh, they needed to be uniform, and storable, and—most important, given skyrocketing demand—they needed to be provided in vast quantities.
Fast-food joints didn’t need local asparagus from New Jersey or collard greens from Georgia or one-of-a-kind apples grown in small orchards in New York. They needed commodity grains to sweeten their sodas, fry their fries, and feed the animals that could be turned into hamburgers and hot dogs and fried chicken. What these restaurants needed was corn, and wheat, and soybeans. And lots of them.
As small family farms near population centers went bankrupt or sold their land to developers, and as the American diet started demanding processed meals, food production flowed like beads of mercury to the control of larger and larger industrial farm operations in the Midwest. As food production became centralized, companies that controlled the grains, chemicals, and processing factories became bigger and much more politically powerful. Thanks to intensive lobbying, tens of billions of dollars in federal farm subsidies began flowing to giant agribusinesses that were driving the development of the industrial food system. As early as the 1970s, farmers around the country were being told (in the words of President Nixon’s Agriculture Secretary Earl Butz) to “get big or get out.”
Most farmers got out. Nearly a century and a half ago, there were 38 million people living in the United States, and 50 percent of them worked on a farm. Today, we have more than 300 million people. How many work on farms? Two percent.
Today, if you drive across the grain belt—Pennsylvania, Ohio, Indiana, Illinois, Iowa, Nebraska, Missouri, Kansas—you will spend many, many hours crossing an ocean of just three crops: corn, wheat, and soybeans. They are being grown by farmers you will likely never meet, processed in factories you will likely never see, into packaged foods containing ingredients that look nothing like the crops from which they were made. You won’t see it, but your soda will be sweetened with high-fructose corn syrup, which replaced sugar in the 1980s. Your fries will be dunked in boiling soybean oil. And your burgers and nuggets and sliced turkey breast will all be processed from animals fed corn or soybeans, or both.
What you most likely won’t see, out along the great American road system, are regional food specialties, or the mom-and-pop diners and restaurants that used to serve them. New England clam chowder, New Orleans gumbo, Maryland crab bisque: all these foods require local ingredients, which (by definition) giant farms in Iowa or Kansas are unable to provide. Replacing them has been the food that these farms can provide: Fast food. Processed food. Soda. Pizza. Chicken nuggets. Cheap hamburgers. A vast culinary sameness, all essentially built out of two or three crops, controlled by a small handful of companies. All available twenty-four hours a day in any restaurant, dining hall, or gas station in the country.
It wasn’t just fast-food restaurants pushing this new food system. Food-processing giants like ADM, ConAgra, and Cargill learned to take monoculture corn and soybeans and turn them into the raw ingredients that could be made into just about anything a supermarket shopper wanted. Companies like General Mills or Coca-Cola could take a few cents’ worth of wheat or corn and process it into Cocoa Puffs or a two-liter bottle of soda and sell it for a few dollars. As food scientists became more creative, they learned how to take wheat and corn and soy and turn them (along with the secret “fragrances” and “flavors” whose provenance only the food scientists seem to know) into limitless quantities of foods sold in suburban supermarkets—as often as not built on top of former farms.
These new foods were cheap to make, enormously profitable, and consumers seemed to love them. Americans spent $6 billion a year on fast food in 1970. By 2014, they were spending more than $117 billion. Today, Americans drink about 56 gallons of soda a year—about 600 cans per person—and every month, 90 percent of American children visit a McDonald’s.
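That per-person number is simple arithmetic (the 12-ounce can and the unit conversion are my assumptions, not figures from the text):

\[
56~\text{gallons} \times 128~\tfrac{\text{fl oz}}{\text{gallon}} = 7{,}168~\text{fl oz}, \qquad 7{,}168~\text{fl oz} \div 12~\tfrac{\text{fl oz}}{\text{can}} \approx 597~\text{cans}
\]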
As industrial farms continued to grow, they gobbled up not just good land but marginal land, changing the face of millions upon millions of acres of forest, grasslands, hillsides, even wetlands. The strange thing was that the plants they grew—corn, soy, wheat—didn’t seem to mind this change. The plants could grow, weed-like, even in marginal soil.
So, for better or worse, could the animals. Industrial feedlots across the Midwest began buying trainloads of corn and soybeans to feed an industry that now slaughters 9 billion animals a year.
As farms consolidated and grew, and as industrial processors increased their demand for ingredients that could be turned into shelf-stable food, farmers responded by growing what the market demanded—and eliminating what the market did not. Over the course of the twentieth century, the varieties of fruits and vegetables being sold by commercial U.S. seed houses dropped by 97 percent. Varieties of cabbage dropped from 544 to 28; carrots from 287 to 21; cauliflower from 158 to 9; tomatoes from 408 to 79; garden peas from 408 to 25. Of more than 7,000 varieties of apples, more than 6,200 have been lost.
—
THE DEVELOPMENT of American highways and suburbs caused one of the most dramatic changes in land use in the history of the world. But running parallel to this was an equally momentous shift in agricultural technology, which grew up fast to supply the rapidly changing American diet. In the 1930s, a plant breeder named Henry A. Wallace began touting the benefits of crossbred or “hybrid” corn, which he had meticulously developed to produce unprecedented yields. Wallace himself knew he was on to something dramatic. “We hear a great deal these days about atomic energy,” he said. “Yet I am convinced that historians will rank the harnessing of hybrid power as equally significant.”
Wallace was right. Corn yields doubled—from around 25 bushels per acre to 50 bushels per acre—in ten years. From 1934 to 1944—even before the postwar boom in agribusiness—hybrid corn seed sales jumped from near zero to more than $70 million, and rapidly replaced the enormous variety of seeds farmers had saved and traded for generations. By 1969, yields were up to 80 bushels an acre, and fully 71 percent of the corn grown in the United States was being grown from just a half-dozen types of hybrid seed. Industrial monoculture had arrived. Wallace’s Hi-Bred Corn Company became Pioneer Hi-Bred International, America’s largest seed company.
Since the 1960s, corn yields have doubled again, and now stand, in some places, close to 200 bushels per acre—an eightfold increase in a single century. This phenomenal rise in production was dramatically accelerated by the invention, in the early twentieth century, of the Haber-Bosch process, which won its German inventors Nobel Prizes for discovering how to convert atmospheric nitrogen into ammonia. The ability to synthesize ammonia—routinely called the most important invention of the twentieth century—made it possible for industry to mass-produce two things that changed the world: explosives during the war and synthetic fertilizers after the war.
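The chemistry at the heart of the process fits on a single line. This is the standard textbook reaction (a detail supplied here for illustration, not one from this chapter), run over an iron catalyst at high temperature and pressure:

\[
\mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
\]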
By the late 1940s, the war over, American industries found themselves with an enormous surplus of ammonium nitrate, a key ingredient in wartime munitions and explosives. Since the synthetic compound also proved to be an excellent source of nitrogen for plants, the U.S. Department of Agriculture (USDA) started encouraging the use of these chemicals on American farmland.
Suddenly, farmers (and their crops) shifted from a reliance on energy from the sun (in the form of nitrogen-fixing legumes or plant-based manure) to a reliance on energy from fossil fuels. Liberated from the old biological constraints, farms “could now be managed on industrial principles, as a factory transforming inputs of raw material—chemical fertilizer—into outputs of corn,” Michael Pollan writes in The Omnivore’s Dilemma. “Fixing nitrogen allowed the food chain to turn from the logic of biology and embrace the logic of industry. Instead of eating exclusively from the sun, humanity now began to sip petroleum.”
A similar pattern emerged for the poison gases that industry had developed for the war: they were repurposed as agricultural pesticides and herbicides. Monsanto had begun the twentieth century making things like aspirin. In 1945, the company began making herbicides like 2,4-D, which would become a prime ingredient in Agent Orange and is now one of the most popular farm sprays in the world. Monsanto also spent decades making PCBs, a family of compounds used in both pesticides and electrical transformers (and long since banned as dangerous carcinogens). By the 1960s, Monsanto was making a whole host of pesticides, with tough-sounding cowboy names like Lasso, Lariat, and Bullet. But the company’s star product was Roundup, the glyphosate-based weed killer that is now the most popular herbicide in the world—and which, in a few short years, would become the star player in the growth of GMOs.
DuPont, Dow, Syngenta, Bayer, BASF—all the world’s largest chemical companies made fortunes manufacturing compounds like DDT, atrazine, and scores of other farm chemicals. Today, the six top chemical companies control nearly 75 percent of the world’s pesticide market.
This transition, from wartime chemicals to petroleum-based farm chemicals that now cover hundreds of millions of acres in the United States alone, has proven a double-edged sword for the world’s farmers, and for the rest of us. For one thing, it means that most of us, in the words of the Indian food activist Vandana Shiva, are “still eating the leftovers of World War II.”
True, it cranked up the amount of food farmers could grow, but it also (in the words of the Czech-Canadian scientist Vaclav Smil) “detonated the population explosion.” Farmers could now grow a lot more food, but suddenly—thanks in no small part to all this extra food—there were a lot more people to feed. Since the end of World War II, annual chemical fertilizer production has jumped from 17 million tons to more than 200 million tons. Excess fertilizers and pesticides that are not taken up by plants seep into rivers and bays, where they contaminate drinking water and cause algae blooms (and aquatic dead zones) so large they can be seen from space. They also escape into the air, much of the fertilizer as nitrous oxide, a potent greenhouse gas, making them major contributors to climate change.
And it’s not just plants that these chemicals fertilize. Since their advent, the human population has nearly tripled. Without these chemicals, Smil writes, billions of people would never have been born. The dousing of our crops with fossil fuels, in other words, meant we could now make unprecedented amounts of food. But now we had to.
For the large chemical companies, the global demand for more food provided a huge new market opportunity, not only for fertilizers and pesticides but for novel seeds to grow the crops themselves. By the late 1980s and early 1990s, the explosion of biotechnology—and especially scientists’ new ability to genetically engineer plants—meant that companies once devoted to chemistry began frantically shifting their emphasis to molecular biology. Chemical giants like Monsanto, DuPont, Syngenta, and Bayer began a frenzy of mergers and acquisitions, racing one another to dominate the world’s seed industry. Monsanto’s CEO Robert Shapiro moved especially aggressively in the mid-1990s, spending billions of dollars buying up seed companies and instantly making Monsanto the world’s biggest ag-biotech company. The company bought Calgene, the maker of the Flavr Savr tomato, mainly because the smaller firm had ideas about GM cotton and canola.
Similar changes were under way at Dow and DuPont, which started out making explosives and the chemicals behind them, like dynamite and phenol, and are now two of the biggest GM seed companies in the world. In 1999, DuPont spent $7.7 billion to buy Pioneer Hi-Bred, which controlled 42 percent of the U.S. market for hybrid corn and 16 percent of the country’s soybeans. The deal gave DuPont control of the world’s biggest proprietary seed bank, as well as a global seed sales force.
The consolidation of the agrochemical giants has continued. In late 2015, DuPont and Dow Chemical announced a $130 billion merger, and Monsanto made a $45 billion offer to buy Syngenta. The deal fell through, but Syngenta was immediately targeted by China National Chemical Corp., and Monsanto turned its attention to acquiring the crop science divisions of German chemical giants BASF and Bayer. Bayer responded in the spring of 2016 by offering to buy Monsanto for $62 billion. Monsanto rejected the bid as too low, but the companies remain in negotiations.
As late as the 1990s, the United States had hundreds of different seed companies; now we have a half-dozen. The biotech industry owns at least 85 percent of the country’s corn seed, more than half of it owned by Monsanto alone. “This is an important moment in human history,” Monsanto’s CEO Robert Shapiro said in 1999. “The application of contemporary biological knowledge to issues like food and nutrition and human health has to occur. It has to occur for the same reason that things have occurred for the past ten millennia. People want to live better, and they will use the tools they have to do it. Biology is the best tool we have.”
This, then, was the monumental shift that gave us GMOs. In a few short years, companies that had long known the power of chemistry discovered the power of biology. And the way we eat has never been the same.
—
GENETIC ENGINEERS are correct when they say that the fruits and vegetables we see in the supermarket look nothing like their wild forebears. The tomatoes we eat today—juicy and sweet, not bitter and toxic—are the result of thousands of years of human selection. So is the corn. The first cultivated carrots—typically yellow or purple—were grown in Afghanistan. It was only after traders carried them to Europe and the Mediterranean, where they were crossed with wild varieties, that their offspring gradually turned orange.
In the nineteenth century, the Austrian monk and scientist Gregor Mendel discovered how a plant passes its traits from parent to offspring. Removing the pollen-bearing anthers from one variety and dusting its flowers with pollen from another, he crossed some 10,000 plants: round peas with wrinkled peas; peas from yellow pods with peas from green pods; peas from tall and short plants. Every trait a plant’s offspring exhibited—height, color, shape—depended on what Mendel called “factors” that were either dominant or recessive. Cross purebred round peas with purebred wrinkled ones, and every offspring came out round: round was the dominant trait. But cross those hybrids with one another, and the recessive trait reappeared: roughly one out of every four offspring was wrinkled. That’s because these factors apparently came in pairs, one from each parent, and were inherited as distinct characteristics.
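Mendel’s famous 3:1 ratio is easy to check for yourself. Here is a minimal simulation sketch (my own illustration, with arbitrary allele names and trial counts, nothing drawn from this book) in which two hybrid parents each pass one of their two factors to each offspring at random:

```python
import random

# Each hybrid parent carries one dominant allele "R" (round)
# and one recessive allele "r" (wrinkled).
PARENT = ("R", "r")

def offspring_phenotype() -> str:
    # Each parent contributes one allele at random.
    alleles = (random.choice(PARENT), random.choice(PARENT))
    # "R" is dominant: a single copy is enough to make the pea round.
    return "round" if "R" in alleles else "wrinkled"

trials = 100_000
counts = {"round": 0, "wrinkled": 0}
for _ in range(trials):
    counts[offspring_phenotype()] += 1

print(counts)  # roughly 3 round for every 1 wrinkled
```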
DNA was known to be a cellular component by the late nineteenth century, but Mendel and other early geneticists did their work without understanding its role in heredity. By the late 1940s, most biologists believed that some specific kind of molecule held the key to inheritance, and they turned their focus to chromosomes, which were already known to carry genes. As agricultural research began moving from the field into the laboratory, scientists discovered a new way to speed up nature’s own engine of variation: by exposing plants to chemicals or radiation, they could scramble a plant’s genetic material. They could force it to mutate. By some estimates, radiation mutagenesis has introduced some 2,500 new varieties of plants into the world, including many that find their way onto our plates, like wheat, grapefruit, even lettuce.
With the flowering of genetic engineering in the 1970s and 1980s, scientists figured out how to go into an organism—a plant or an animal, a bacterium or a virus—remove one or more genes, and stitch them into the genetic sequence of another organism. This process became known as recombinant DNA technology.
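The concept is easier to picture than the wet-lab work. A toy sketch (with invented sequences and an arbitrary cut site, nothing drawn from real genomes) treats DNA as a string of letters and splices a donor gene into a host sequence, loosely the way restriction enzymes and ligases do in an actual lab:

```python
# Hypothetical sequences, for illustration only.
host_genome = "ATGGCCTTAGACCGGATTTAA"
donor_gene = "GGATCCGAAGTT"  # imagined trait gene from another organism

# "Cut" the host at a chosen position and "stitch" the donor gene in.
cut_site = 9
recombinant = host_genome[:cut_site] + donor_gene + host_genome[cut_site:]

print(recombinant)  # the host sequence now carries the donor gene
```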
The first commercially available product of genetic engineering was synthetic insulin. In humans, insulin is normally made by the pancreas and helps regulate blood glucose; in type 1 diabetes, the body produces little or none of it. Traditionally, supplying a diabetic with insulin meant collecting it from the pancreatic glands of pigs or cattle, a problem not only for the animals but also for patients who developed allergies to the animal insulin’s slightly different chemical structure.
In 1978, scientists at the company Genentech spliced the gene for human insulin into bacteria, creating a synthetic insulin known as Humulin, which hit the market in 1982. Today, this GM insulin is produced around the clock in giant fermentation vats and is used every day by more than 4 million people. Similar technology has been used to produce vaccines that combat hepatitis B; human growth hormone, which combats dwarfism; and erythropoietin (EPO), which helps the body produce red blood cells (and has been used illegally by riders in the Tour de France to boost racing performance).
In the late 1980s, genetic engineers turned their sights on cheese. Just a few years before the release of the Flavr Savr tomato, a single gene from a cow was stitched into the genome of a bacterium (or a yeast) to create rennin, a critical enzyme in the production of hard cheeses. Once obtained as a by-product of the veal industry, rennin was traditionally collected from the lining of a calf’s fourth stomach. GM rennin is now used in some 90 percent of the cheese made in the United States.
But compared with what was to come, these early experiments were, well, small potatoes. The real money, agrochemical companies knew, would come through genetically engineering the crops Americans ate most. Not cheese, but corn and soybeans. Control those crops, and you could dominate a fundamental part of the global economy.
Monsanto’s most important push was to create seeds the company could sell alongside Roundup, already the bestselling farm chemical in the world. Creating (and patenting) Roundup-resistant seeds would secure the company’s global share in seeds and herbicides. The world’s farmers wouldn’t buy just one. They would buy both.
“It was like the Manhattan Project, the antithesis of how a scientist usually works,” said Henry Klee, a member of Monsanto’s Roundup research team. “A scientist does an experiment, evaluates it, makes a conclusion, and goes on to the next variable. With Roundup resistance, we were trying twenty variables at the same time: different mutants, different promoters, multiple plant species. We were trying everything at once.”
It took four years, and a bizarre eureka moment, for Roundup Ready seeds to be born. Frustrated by their lab work, company engineers decided to examine a waste site 450 miles south of Monsanto’s St. Louis headquarters. There, at the company’s Luling plant on the banks of the Mississippi, the engineers found plants that had somehow survived in the soil and ponds near the contamination pools where the company treated its glyphosate waste. The hardiest survivors were collected, their molecular structure examined, their genes replicated and inserted into potential food crops.
When Roundup Ready soybeans were finally launched, in 1996, they instantly became an essential part of a $15 billion soybean industry. Roundup Ready soybeans covered 1 million acres in the United States in 1996; 9 million acres in 1997; and 25 million in 1998. Today, 90 percent of the country’s 85 million acres of soybeans are glyphosate resistant.
The first insecticide-producing corn plant was approved in 1996, the same year Monsanto released its Roundup Ready soybean. Today, the overwhelming majority of the GM crops grown in the United States—some 170 million acres of them—are still grown to feed the industrial food system. In Iowa, GM corn is grown to feed the numberless cows and pigs that enter into the fast-food system. In Maryland, GM soybeans are grown to feed the hundreds of millions of chickens on the state’s Eastern Shore, which will enter the same system. In Nebraska, GM canola is grown to make the oil to fry the french fries served in the country’s galaxy of drive-through restaurants.
Why are the crops genetically engineered? For the same reason the highways were built: they make everything faster, more uniform, more efficient. In the United States, GM crops are grown mainly for two reasons: to increase yields and—especially—to allow farmers to spray their crops with chemicals that kill insects, diseases, or weeds. By developing crops that can withstand regular pesticide dousing (or, like Bt corn, that can provide their own insecticide), scientists have enabled farmers to eliminate everything but the crops whose numbers they are trying to maximize. Gone are the weeds. Gone are the insects. The whole system works—in the most literal sense—like a well-oiled machine.
Food and chemical companies—and the farmers who grow for them—say that GM crops allow them to deliver a lot of food to a lot of people for very little money, and this is true, as far as it goes. Americans have become very comfortable spending relatively little money for their food. According to the World Bank, Americans spend a smaller share of their income on food than anyone else in the world. Food expenses are much higher in the UK (9 percent), France (14 percent), South Africa (20 percent), and Brazil (25 percent). And our food is cheap not just compared with other countries’; it’s cheap compared with the food we used to eat, before all our small farms moved to the Midwest. In 1963, the year I was born, Americans were spending close to a third of their income on food. Now we spend about 6 percent.
—
SO HERE WE ARE. Genetic engineering did not create any of the structures that hold up our current food system. It merely added a set of tools—very powerful tools—to keep the whole machine running. The fact that these tools arrived on the scene at the very moment that the American food economy was becoming so intensely industrialized has created both enormous profits for the companies and enormous health and environmental problems for the rest of us. Had genetic engineering come about at a different time—were we still a nation of small farmers, for example, and were biotech companies making seeds to help local farmers grow nutritious produce—things might have turned out entirely differently.
But that’s not what happened. When it comes to GMOs, it’s impossible to separate science from industry, or industry from politics. It’s all tangled up together, and we are eating all of it. The argument that genetic engineering is just another step in a tradition of plant breeding that goes back 10,000 years is absolutely true. But it is also true that biotechnology has developed at a time when its primary use has been to fuel a food system that is far bigger, more complex, and more destructive than anything the world has ever seen.
Because this system has become so profitable, companies have gone to great lengths to cement their control over it in all three branches of the federal government. Through the White House, they push their own people to the top of federal regulatory agencies. In Congress, they use lobbyists and political muscle to influence policy, and to keep federal farm subsidies flowing. In the courts, beginning in 1980, they have repeatedly convinced judges that they deserve patents (to quote a famous court decision) on “anything under the sun that is made by man.” To date, tens of thousands of gene patents have been awarded to biotech companies, and tens of thousands more wait in the wings. This means, in the most fundamental way, that our food supply is owned and controlled by a very small handful of companies.
This is nowhere more evident than in the hundreds of billions of taxpayer dollars that move through federal regulatory agencies into the hands of companies these same agencies are supposed to regulate. Between 1995 and 2010, large agricultural companies received $262 billion in federal subsidies, a great percentage of it going to companies developing GM food products.
It is also evident in the way federal agencies view their relationship with the companies they are charged with overseeing. Since the 1980s, regulation of GMOs has been handled through a complex web involving three vast federal agencies. A genetic engineer has to get a permit from the USDA to field-test a GM crop. Then—after several years of trials—the engineer must petition for the deregulation of the crop. If the crop has been designed to be pest-resistant, the EPA will regulate it as a pesticide and demand more data. Finally, the FDA evaluates the plant to make sure it is safe for consumption by people or animals.
But in reality, safety testing of GMOs in the United States is left to the companies that make them. This is very much in line with much of American regulatory policy and is dramatically different from the approach taken in Europe, where regulators require that the introduction of GM foods should be delayed until the long-term ecological and health consequences of the plants are better understood. In the United States, industry and government have decided that GMOs are “substantially equivalent” to traditional foods, and therefore should not be subjected to new federal oversight.
U.S. policy “tends to minimize the existence of any risks associated with GM products, and directs the agencies to refrain from hypothesizing about or affirmatively searching for safety or environmental concerns,” legal scholar Emily Marden writes.
The shift in federal policy from “regulating” GMO foods to “promoting” them was subtle, and to most of the country, entirely invisible. Back at the beginning, in 1974, Paul Berg, often called the father of genetic engineering, persuaded other molecular biologists to be cautious in the pioneering work they were doing in their laboratories. “There is serious concern that some of these artificial recombinant DNA molecules could prove biologically hazardous,” Berg wrote at the time. To address these questions, Berg and his colleagues at the National Academy of Sciences urged caution in the development of genetic engineering technology until scientists could form standards for biological and environmental safety. Addressing the technology itself, rather than its application to food production, the now famous “Berg Letter” acknowledged that such a cautious approach was based on “potential rather than demonstrated risk,” and might well mean the “postponement or possible abandonment” of some ongoing experiments.
“Our concern for the possible unfortunate consequences of indiscriminate application of these techniques,” Berg wrote, “motivates us to urge all scientists working in this area to join us in agreeing not to initiate experiments until attempts have been made to evaluate the hazards and some resolution of the outstanding questions has been achieved.”
After Berg’s letter was published, a group of scientists organized a closed-door conference at Asilomar, California, in February 1975 to formulate research guidelines that would prevent health or ecological trouble from rippling out from this new technology. But the letter also made it very clear that scientists themselves, and not the government, would be in charge of keeping an eye on things. No new legislation was needed, the letter noted. Scientists could “govern themselves.”
James Watson, one of the discoverers of the double helix structure of DNA and an attendee at the Asilomar conference, made it clear that scientists were not interested in ethical guidance from outside the profession. Although some “fringe” groups might consider genetic engineering a matter for public debate, the molecular biology establishment never intended to ask for guidance. “We did not want our experiments to be blocked by over-confident lawyers, much less by self-appointed bioethicists with no inherent knowledge of, or interest in, our work,” Watson wrote. “Their decisions could only be arbitrary.”
Watson had nothing but contempt for those who would stand in the way of scientific research; he once referred to critics of genetic engineering as “kooks, shits, and incompetents.” The risks from this technology, he wrote, were about the same as “being licked by a dog.”
The National Institutes of Health quickly adopted the Asilomar conclusions and turned them into a national research standard: biotechnology research would be largely self-regulated and should be encouraged, not hampered, by federal oversight.
At first, most biotechnology research had to do with medicine, not food production, and given the lack of public debate on the issue, few health or environmental groups paid much attention to genetically engineered food. But within a few years, the potential applications—and the potential profits—in agriculture became obvious. The question was: what would happen once this technology escaped the laboratory and was scaled up to reach all our dinner tables?
“In the 1970s, we were all trying to keep the genie in the bottle,” said Arnold Foudin, the deputy director of biotechnology permits at the USDA. “Then in the 1980s, there was a switch to wanting to let the genie out. And everybody was wondering, ‘Will it be an evil genie?’”
The genie was released in the 1980s and 1990s by the Reagan and Bush administrations, which had long made industrial deregulation a national priority. To their eyes, the burgeoning biotech industry was a perfect merging of business and science that—if left alone—would generate colossal corporate profits for American agricultural conglomerates.
“As genetic engineering became seen as a promising investment prospect, a turn from traditional scientific norms and practices toward a corporate standard took place,” sociologist Susan Wright observes. “The dawn of synthetic biology coincided with the emergence of a new ethos, one radically shaped by commerce.”
If nothing else, all this grain would supercharge the meat industry: Reagan’s first secretary of agriculture was in the hog business; his second was president of the American Meat Institute. George H. W. Bush later appointed the president of the National Cattlemen’s Association to a senior USDA position.
The trick was to come up with federal policy that would allow this new technology, and the products it generated, to enter the marketplace without regulatory hassles—and without worrying the public that the foods they made were somehow different from traditional foods.
Creating these rules required some fancy bureaucratic footwork. Since 1958, Congress (through the Federal Food, Drug, and Cosmetic Act) had mandated that “food additives”—typically chemical ingredients added to processed foods—undergo extensive premarket safety testing, including long-term animal studies. Commonly used ingredients, like salt and pepper, were considered GRAS (for “generally recognized as safe”) and were exempted from further testing.
The billion-dollar question was: Should genetically altered foods be considered a “new” food additive—and thus be forced to undergo extensive testing—or “safe,” like salt and pepper?
In the early 1990s, the FDA put together a scientific task force to study this question. A consensus quickly emerged that these new products should be developed cautiously, and should be tested to see just what impact they might have on the health of people and animals who eat them.
“The unintended effects cannot be written off so easily by just implying that they too occur in traditional breeding,” wrote Louis Pribyl, a microbiologist at the agency. “There is a profound difference between the types of unexpected effects from traditional breeding and genetic engineering.”
Pribyl said applying the GRAS label to GMOs was not scientifically sound. It was, instead, “industry’s pet idea”—a way to apply a formal stamp of government approval on foods that were, in fact, a completely new thing under the sun.
The director of the FDA’s Center for Veterinary Medicine went further, warning that using GMOs in animal feed could introduce unexpected toxins into meat and milk products. The head of the FDA’s Biological and Organic Chemistry Section emphasized that just because GMOs had not been proven to be dangerous did not confirm their safety. Saying that GMOs were as safe as traditional foods “conveys the impression that the public need not know when it is being exposed to new food additives.”
Likewise, deep inside government and university laboratories, enthusiasm for genetically engineered food was not nearly as uniform as its promoters in government or industry claimed. “This technology is being promoted, in the face of concerns by respectable scientists and in the face of data to the contrary, by the very agencies which are supposed to be protecting human health and the environment,” said Suzanne Wuerthele, a toxicologist at the EPA. “The bottom line in my view is that we are confronted with the most powerful technology the world has ever known, and it is being rapidly deployed with almost no thought whatsoever to its consequences.”
But given the revving engines of industry, it was tough for GMO skeptics in the scientific community to have their voices heard. University scientists applying for grants to look more closely at potential dangers of GMOs were routinely underfunded, squashed, or simply shouted down. Government scientists were stymied by the influence of the food and chemical industries, whose former executives were routinely placed at the top of the very agencies charged with regulating products made by the companies they used to work for.
It was no secret that the Reagan and Bush administrations had made subsidizing (and deregulating) these companies, and this technology, a national priority. There was no way a regulatory agency could fairly scrutinize an industry it was also funding with so much money, said Philip Regal, a professor at the University of Minnesota’s College of Biological Sciences.
“The more I interacted with biotech developers over the years, the more evident it became that they were not creating a science-based system for assessing and managing risks,” Regal said. “And as momentum built and pressures to be on the bandwagon mounted, people in industry and government who were alerted to potential problems were increasingly reluctant to pass the information on to superiors or to deal with it themselves. Virtually no one wanted to appear as a spoiler or an obstruction to the development of biotechnology.”
From biotechnology’s earliest days, “it was clearer than ever that the careers of too many thousands of bright, respected, and well-connected people were at stake—and that too much investment needed to be recovered—for industry or government to turn back,” Regal said. “The commercialization of GE foods would be allowed to advance without regard to the demands of science; and the supporting rhetoric would stay stretched well beyond the limits of fact.”
Despite this backbeat of scientific concern, the administration of George H. W. Bush—with considerable input from policy executives at companies like Monsanto—ruled that GMOs would not be subjected to any more testing than traditional foods. The GRAS policy would be explicitly designed not to test new food science but to assure “the safe, speedy development of the U.S. biotechnology industry,” Bush’s FDA Commissioner David Kessler wrote.
The FDA policy made it official: the agency was “not aware of any information showing that foods derived by these new methods differ from other foods in any meaningful or uniform way, or that, as a class, foods developed by the new techniques present any different or greater safety concern than foods developed by traditional plant breeding.”
Genetic manipulation was no different from breeding techniques farmers had been using for centuries, Kessler argued. Properly monitored, GM foods posed no special risk and should not require advance federal approval before being sold. “New products come to our kitchens and tables every day,” Kessler said. “I see no reason right now to do anything special because of these foods.”
Beyond giving a green light to the technology itself, the FDA even left the decision about whether new GM foods were GRAS to the companies themselves.
The victory of agribusiness over the FDA’s own scientists furthered a decades-long tradition, in which government agencies “have done exactly what big agribusiness has asked them to do and told them to do,” a fifteen-year veteran of the FDA told The New York Times. “What Monsanto wished for from Washington, Monsanto—and by extension, the biotechnology industry—got.”
Genetic engineering now had the full-throated support of the U.S. government. Administration officials and food industry groups of all kinds lined up to tout the benefits of biotechnology and celebrate the wall that had been erected to protect companies from federal oversight. The hands-off approach was framed as a fine example of what Bush administration officials called “regulatory relief.”
Such policy “will speed up and simplify the process of bringing better agricultural products, developed through biotech, to consumers, food processors, and farmers,” Vice President Dan Quayle announced. “We will ensure that biotech products will receive the same oversight as other products, instead of being hampered by unnecessary regulation.”
Quayle’s declaration put the full weight of the federal government behind a policy that had largely been dictated by agribusiness—especially Monsanto, which by this time had become the world’s largest developer of GM seeds.
“What Monsanto wanted (and demanded) from the FDA was a policy that projected the illusion that its foods were being responsibly regulated but that in reality imposed no regulatory requirement at all,” writes Steven Druker, an environmental attorney and author of the book Altered Genes, Twisted Truth. The FDA, he argues, “ushered these controversial products onto the market by evading the standards of science, deliberately breaking the law, and seriously misrepresenting the facts,” with the result that “the American people were being regularly (and unknowingly) subjected to novel foods that were abnormally risky in the eyes of the agency’s own scientists.”
In the twenty-five years since the GRAS decision, the FDA has never overturned a company’s safety determination and, thus, has never required food-additive testing of any transgenic crop.
Allowing industry to regulate itself has led to a great deal of criticism, of course, since it’s rarely in industry’s best interest to reveal problems with its products, even when they are well known. This is not a new game, as users of countless other products—from Agent Orange to cigarettes to opioid pain relievers—have learned. In those cases, industry scientists knew their products were harmful, but companies continued to promote them and withhold conflicting data for years.
In 2002, a committee of the National Academy of Sciences, the country’s premier scientific advisory body, declared that the USDA’s regulation of GMOs was “generally superficial”: it lacked transparency, used too little external scientific and public review, and freely allowed companies to claim that their own science was “confidential business information.” The committee itself complained that it was denied access to the very information it needed to conduct its review—and not just by the companies. The amount of information kept secret by the USDA itself “hampers external review and transparency of the decision-making process.”
The EPA has also come under intense criticism for—among other things—the way it has handled the staggering population declines of bees and monarch butterflies, both of which have been linked to chemicals sprayed on hundreds of millions of acres of GM crops. The monarch is now as much a symbol for the anti-GMO movement as the polar bear is for climate change activists.
And the FDA? The agency’s own policy states that “it is the responsibility of the producer of a new food to evaluate the safety of the food.” Denied essential proprietary data by the companies it is supposed to oversee, the FDA “is unable to identify unintentional mistakes, errors in data interpretation or intentional deception, making it impossible to conduct a thorough and critical review,” a study by William Freese and David Schubert at the Center for Food Safety found.
Such voluntary self-regulation means “government approval” amounts to little more than a rubber stamp, Michael Hansen of the Consumers Union told me. Even though companies test their products, they have a way of doing tests over and over until they get the results they like, Hansen said, and show only favorable results to the agencies overseeing them. Companies are not always forthcoming with the data they do accumulate, and sometimes actively refuse to turn over research even when federal regulators ask for it.
For four decades, the American legal system has repeatedly upheld the industry’s right to control the seeds underpinning our food. Monsanto alone filed 147 seed patent infringement lawsuits in the United States between 1997 and April 2010, settling all but nine out of court. The cases that went to court were all decided in favor of the company.
North of the border, where GMOs are considerably less popular, it has been a bit more complicated.
Wandering around his canola farm in Saskatchewan in the late 1990s, Percy Schmeiser noticed plants growing not just in the fields, but in a nearby drainage ditch. He did what many farmers would have done to get rid of an unwanted infestation: he sprayed the plants with glyphosate.
Nothing happened.
The canola plants, it turned out, had sprouted from genetically modified Roundup Ready seeds that had floated in from nearby farms. Schmeiser’s neighbors—and 30,000 other Canadian farmers—had paid Monsanto $15 an acre for the right to use these GM seeds; their harvests constituted more than 40 percent of Canada’s canola crop. Monsanto was keen to protect its product: it had farmers sign contracts agreeing not to save or replant the seeds, and it sent out inspectors to make sure the farmers were complying with their seed contracts.
Schmeiser had not been part of this deal. For several years, he had planted his own (non-GM) canola fields with seeds he had saved from his own plants. After discovering Roundup-resistant plants in his ditch, he wondered just how much of his farm had been contaminated by Monsanto’s seeds. He sprayed three acres with glyphosate; 60 percent survived.
When word got out that Schmeiser had acres of Monsanto-patented seeds growing in his fields, someone called the company, using an anonymous-tip line the company had set up for farmers to turn in their neighbors. Monsanto sent private investigators to patrol the roads near Schmeiser’s farm. They took crop samples from his fields, and in 1998, Monsanto notified Schmeiser that he was using the company’s seeds without a license.
Undaunted, Schmeiser saved seeds he had harvested from plants that had survived spraying, and planted them on about 1,000 acres. Later tests would confirm that nearly 98 percent of these plants were Roundup resistant.
Monsanto sued Schmeiser for patent infringement. “We’ve put years, years, and years of research and time into developing this technology,” said Randy Christenson, Monsanto’s regional director in Western Canada. “So for us to be able to recoup our investment, we have to be able to pay for that.”
Schmeiser had a different take. “I’ve been farming for fifty years, and all of a sudden I have this,” he said. “It’s very upsetting and nerve-racking to have a multi-giant corporation come after you. I don’t have the resources to fight this.”
In court, Schmeiser argued that the GM seeds on his field had arrived the same way seeds have always arrived—they were blown in on the wind. “You can’t control it,” he said. “You can’t put a fence around it and say that’s where it stops. It might end up 10 miles, 20 miles away.” Furthermore, he argued, a company should not be allowed to patent a higher life form, like a canola plant. Plants were part of the natural order of things, not widgets that came off a company’s factory floor.
This was not the first time Canada’s courts had to wrestle with whether a company could own a life form. The country’s supreme court had previously ruled that Harvard University did not have the right to patent a genetically altered “OncoMouse” (a rodent genetically designed to rapidly develop cancer) even though it had taken university scientists seventeen years to develop it. Courts in Europe and the United States had sided with Harvard, but Schmeiser argued that the Canadian ruling—that an advanced life form could not be patented—ought to apply in his case too.
In 2001, a trial judge rejected Schmeiser’s argument and fined him $20,000 for infringing on Monsanto’s patent. On appeal—and in a show of just how complex biology and patent law can be—Canada’s Supreme Court agreed, but only by a 5–4 margin. The court’s minority argued that Monsanto had claimed patent protection over the gene and the genetic process, not the life form (that is, the plant itself), and that Schmeiser should not be held liable for using an (unpatentable) plant. Interestingly, the majority agreed that plants could not be patented in Canada. But the five justices also ruled that a plant’s genes could be: by “using” the plant, Schmeiser had in effect “used” the patented gene.
The ruling forced Schmeiser to turn over any Roundup Ready seeds or crops on his property. In a small consolation, the court ruled that Schmeiser did not have to pay Monsanto for the profits he had made from his crop. Monsanto, for its part, made sure the world knew its point of view. “The truth is Percy Schmeiser is not a hero,” the company says. “He’s simply a patent infringer who knows how to tell a good story.”
In effect, the Canadian court gave Monsanto legal control over something it could not patent—Roundup Ready canola plants—by giving it legal control over something it could patent—the plant’s genes. In Canada at least, plants themselves are still not patentable, and farmers are allowed some protection if they don’t intentionally use patented seeds. Canadian growers also (for the moment, at least) still enjoy a “farmer’s privilege”—protected by the national Plant Breeders’ Rights Act—that allows them to save and replant traditionally bred seeds.
This is in direct contrast to laws in the United States, where the Supreme Court has ruled that plants can be patented in spite of laws protecting a farmer’s right to save seeds. This position—promoting the rights of large companies over the rights of small farmers—is very much in keeping with the American government’s longstanding and unwavering support of the biotech industry.
In the end, the early (and ongoing) rush to develop, plant, and profit from GM seeds has simply outpaced and overwhelmed our ability to understand their impact on our lives, John Vandermeer, the agroecologist at the University of Michigan, told me.
“I would be far less negative about GMOs had the people developing [them] taken the same approach as they did at Asilomar,” Vandermeer told me. “They could have said, ‘Let’s have a moratorium on selling them until we can be sure that they are safe.’ But partly because of the profits involved, that was never done. Had we done this, it’s my guess that Bt and Roundup transgenic crops never would have been developed and spread throughout the landscape. What we are discovering is that they should probably never have been used.”
It’s not just that such company-directed testing might miss (or cover up) a dangerous product. Chemical-intensive farming has also led to a tremendous loss of biodiversity—both above and below ground, Vandermeer says. From the massive genetic biodiversity of traditional agroecosystems, we now have millions upon millions of acres planted with the same hybrid corn variety. Soil, a fantastic ecology of interdependent living organisms, has been reduced to a medium “as devoid of life as possible,” Vandermeer writes.
This self-reinforcing loop—vast acreage planted with single crops, propped up by rivers of chemical fertilizers that keep the monocultures flourishing—has also dramatically increased the potential for collapse. All it takes is for an insect (or a virus, or a fungus) to pick the lock of a plant’s defenses, and an entire crop can disappear. A nineteenth-century blight in Ireland ruined the potato crop, and fully a million people starved to death. In the 1950s, the Gros Michel banana—planted on monoculture plantations across Latin America—was virtually wiped out by a fungus. Today, the Cavendish—the Gros Michel’s successor and likely the only banana you have ever eaten, whether you’ve eaten it in Los Angeles, New York, London, or Hong Kong—is grown on vast plantations in Asia, Australia, and Central America. And a fungus, called Tropical Race 4, has picked its lock. Unless breeders (including geneticists) can figure out a way to get banana plants to develop resistance, there will likely come a day very soon when we—outside the tropics, at least—will have no more bananas.
Here at home, this system also means that companies get to decide what products to create. In the United States, GMOs are designed more to make corn for cheese puffs and cheap hamburgers than to develop nutritionally dense food for people either here or in developing countries. Such uses cheapen the promise of food technology by using it to create empty calories and poor nutrition, serving industry profits but not the general welfare of either people or the planet.
Without broader research conducted outside the food industry itself, the editors of the scientific journal Nature say, the development of genetic engineering “will continue to be profit-driven, limiting the chance for many of the advances that were promised thirty years ago—such as feeding the planet’s burgeoning population sustainably, reducing the environmental footprint of farming and delivering products that amaze and delight.”
Leaving the power of GM technology to a group of global food conglomerates is plainly problematic for a whole array of reasons. But there are small pockets out there, mostly in university and other nonprofit research labs, where an entirely different approach to genetic engineering is taking place. Because while most GMOs currently bolster the production of cheap, unhealthy, processed food, there are scientists at work developing foods that could actually change the world for the better.