13

We have come this far without having examined the current structure of incentives surrounding America’s agricultural system. The way we encourage the production of certain foods (but not others) via payments to farmers might seem somewhat disconnected from the issue of food safety, but as I’m about to argue, it has led to exponentially more pain, suffering, and even death than all strains of pathogenic bacteria combined. Indeed, if we are ever to realize a truly comprehensive definition of “safe food,” it is essential that we understand how our nation’s ag-related policies have led to sweeping changes in the businesses of food production, processing, distribution, and even eating. The last item on this list might, at first glance, seem unrelated to the agricultural policies implemented by our government, but as you’ll see, it is very much connected in ways that might surprise you.

First, let us briefly examine the chief agency responsible for disseminating the subsidies that have played a massive role in shaping the American food landscape and, in the process, the actual American landscape. The United States Department of Agriculture is perhaps the best known of our food and ag-related agencies and, befitting its place at the top of the heap, has its hands in numerous facets of agriculture and its byproducts, along with a few other things you probably weren’t aware of. The USDA’s annual budget totals $149 billion for 2011, which I think you’ll agree is a nice little chunk of change. That reflects a rather substantial increase over the past few years: In 2008, the budget was a “mere” $93 billion. To put the USDA budget in context, consider that the Department of Homeland Security chews up about $55 billion annually, while the Department of Defense is good for more than $700 billion.

The USDA does a lot of things with its loot; as I skimmed through its 157-page 2011 budget report, I was frankly stunned by the depth and reach of the agency into facets of American and even global life that I’d never considered. The United States is a major exporter of agricultural products, totaling nearly $80 billion annually, and agriculture accounts for 1.2 percent of the US gross domestic product. So I suppose it only makes sense that one of the strategic goals laid out in the agency’s budget is to “help America promote agricultural production and biotechnology exports as America works to increase food security.” In other words, help us sell more food and, in particular, food that is derived from new technologies, including genetic engineering. That’s not me twisting the USDA’s words, by the way. Under the “key efforts” heading, the number-two goal is to “enhance America’s ability to develop and trade agricultural products derived from new technologies.” If I were in the business of big ag, I’d really appreciate that effort.

Cynicism aside, the USDA does some good work with our money. For one, it oversees the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC Program) and provides soil and water conservation technical assistance for farmers and property owners. For 2011, it is requesting $30 million for its Sustainable Agriculture Research and Education program, which provides competitive grants to scientists and producers experimenting with sustainable farming techniques.

Particular to the issues of foodborne illness that fall under its purview, the USDA oversees the Food Safety and Inspection Service, a $1 billion program that, according to the mission statement on its Web site, “is the public health agency in the US Department of Agriculture responsible for ensuring that the nation’s commercial supply of meat, poultry, and egg products is safe, wholesome, and correctly labeled and packaged.” (FSIS is also the agency that Bill Marler suggested to me should mandate the tattooing of its mission statement on every employee’s body. Presumably somewhere they will see it frequently.)

The USDA does something else with our money that arguably has more impact on the health and well-being of Americans than all its other programs combined. (Notice how I keep calling it “our money”? Well, that’s because it is; we are talking about a taxpayer-funded agency, after all.) The USDA hands out billions of dollars’ worth of agricultural subsidies each and every year as part of the farm income stabilization portion of the US farm bill, a 5-year ag-policy bill that was passed into law on June 18, 2008 (the first multi-year US farm bill was passed in 1965, and a revised version has been passed approximately every 5 years since). Most of these dollars are earmarked for farmers cultivating America’s five most prolific crops: corn, cotton, rice, wheat, and soybeans.

In 2009, for example, growers of these five crops pulled in $15.4 billion, while growers of “specialty crops,” which include fruits, nuts, and vegetables, received a mere $825 million.

As if that imbalance weren’t pernicious enough, consider that agricultural subsidies disproportionately benefit a tiny number of large-scale operations, a fact that stands in contrast to the popular myth that subsidies exist to keep small farms afloat. In fact, fully 60 percent of American farmers don’t receive any subsidies at all; of the 40 percent who do, the top 10 percent collect nearly three-quarters of the cash that’s handed out. This has helped create an emerging generation of megafarms. In 1982, farms with at least $1 million in sales accounted for 25 percent of US agricultural output; today, they are responsible for over half.
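
To put that concentration in concrete terms, here is a minimal back-of-envelope sketch. It simply applies the shares above to the 2009 commodity payout; the count of roughly 2 million US farms and the rendering of “nearly three-quarters” as 74 percent are assumptions made for illustration, not figures from this chapter.

```python
# Back-of-envelope sketch of subsidy concentration, using the shares cited above.
# The farm count (and the 74 percent figure) are illustrative assumptions.

TOTAL_FARMS = 2_000_000      # assumed: roughly two million US farms
TOTAL_PAYOUT = 15.4e9        # 2009 payout to the five big commodity crops (from the text)

recipients = 0.40 * TOTAL_FARMS        # 40 percent of farmers receive something
top_slice = 0.10 * recipients          # the top 10 percent of those recipients...
top_share = 0.74 * TOTAL_PAYOUT        # ...collect roughly three-quarters of the money
rest_share = TOTAL_PAYOUT - top_share

avg_top = top_share / top_slice
avg_rest = rest_share / (recipients - top_slice)

print(f"Top recipients: {top_slice:,.0f} farms averaging about ${avg_top:,.0f} each")
print(f"Everyone else:  {recipients - top_slice:,.0f} farms averaging about ${avg_rest:,.0f} each")
```

Under those assumptions, the top tier averages well over $100,000 per farm per year, while the typical recipient collects a few thousand dollars.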

Making sense of American agricultural subsidies in the early part of the 21st century is a daunting task, in part because they are both complex and convoluted and in part because they simply don’t seem to make much sense at all. “The US agricultural subsidy program is a labyrinth,” Don Carr told me when I called him at the Environmental Working Group, a nonprofit that monitors and tracks subsidy payouts. “There are people who spend their careers studying the program and can still barely keep up with it.”

As Carr explains it, more than 20 programs fall under the rubric of “commodity subsidies.” They range from the $5 billion annually in direct payments, which are not based on need and are paid to farmers regardless of market conditions or other factors; to the Conservation Reserve Program, under which our current secretary of agriculture, Tom Vilsack, collected $62,805 over the past decade; to the Crop Insurance Program, whereby the USDA foots 50 percent of the premium for catastrophic coverage. The Conservation Reserve Program, by the way, is intended to help landowners address natural resource conservation issues, such as soil erosion or water quality.

It was not always this complicated (and it is currently much more so than what I’ve just described). Ag subsidies as we currently know and define them were enacted in the years following World War I, when commodity prices collapsed on the global market. This was a direct response to Europe’s war-diminished economy, which could no longer afford to import massive quantities of food from the United States. Like most commodity crashes, it came on the heels of a production bubble: American farmers had recently expanded to meet the needs of hungry Allied forces, utilizing the rapidly evolving agricultural technology of the day to churn out more food than ever. Land that had been grazed by animals since the days of abundant buffalo populations was quickly converted to grain production to cash in on the boom.

Things might’ve worked out okay, at least in the short term, if not for the drought that followed directly on the heels of this unprecedented expansion of American crop-growing acreage. In the southwestern United States, in a region commonly known as the “Staked Plains,” cultivated acreage more than tripled between 1925 and 1930, and this sort of increase was common throughout our nation’s breadbasket. Native prairie grasses were plowed under, leaving the fragile topsoil exposed and vulnerable. What little rain did fall quickly evaporated in the hot midwestern sun, and when the winds blew, countless particles of dust were carried aloft. The Dust Bowl had begun.

As if it weren’t enough that American farmers had just lost a significant chunk of their global business and were now being chased off the farm by the endless dust storms, the Great Depression was establishing its crushing grip. Suddenly, millions of Americans couldn’t afford food basics, and farmers found themselves sitting on massive piles of grain they couldn’t sell even as their fellow countrymen and women went hungry. To put it mildly, it was not a good time to be farming in America.

So the government stepped in. This would not be the first time the US government came to the aid of its farmers (remember the government-run seed program?), but it marked a dramatic shift in American agricultural policy. Previous efforts, such as the Homestead Act, had all been aimed at developing the infrastructure and providing the inputs necessary to enable growth and success. But now, having done that job all too well, the country was faced with an entirely different problem: how to support an industry that had grown too big, too fast, and was now floundering in the face of many and varied forces.

Thus came the aptly named Agricultural Adjustment Act of 1933, a nifty little piece of New Deal legislation that paid farmers to reduce production in hopes of bringing some equilibrium to the supply-demand equation. In other words, it was now the law of the land that farmers could be paid not only for what they grew but also for what they didn’t grow, which sounds like pretty nice work if you can get it.

That’s a bit flip, of course: The producers who were getting paid to allow their fields to lie fallow weren’t exactly laughing all the way to the bank. Since 1929, farm income had fallen by over 50 percent, and the nation was mired in the Great Depression. Many farmers were suffering under the oppressive clouds of dust and relentless drought that characterized the Dust Bowl; the implementation of the Agricultural Adjustment Act didn’t exactly mark a U-turn in their fortunes.

But it did unquestionably mark the dawn of a new agricultural era in the United States, one in which the forces that had forever dictated the farm market—supply, demand, weather, disease, and so on—were manipulated and made to yield to federal oversight. For the next 40 or so years, US farm policy spun on an axis of “supply management” that attempted, with at least partial success, to smooth over the boom and bust cycles inherent to commodity markets—and in particular, commodity markets that are affected by forces beyond human control. Over this period, American farm policy was relatively equitable, offering a degree of stability to large-scale and family farmers alike.

That did not last, and the reason it did not last can be summed up in one name: Earl Butz. One can only imagine the childhood cruelty aimed at a boy named Butz, but Earl wasn’t about to let the bullies get him down. Instead, he applied himself to the business of agriculture, first on his parents’ 160-acre Indiana farm, then in a variety of ag-related associations, before serving as assistant secretary of agriculture in the Eisenhower administration and becoming dean of agriculture at Purdue University. But it wasn’t until 1971, when President Nixon appointed him secretary of agriculture, that Butz began to shape the American diet.

The way he did this was disarmingly simple: Rather than attempt to balance supply and demand via the lever of government policy, typically by encouraging supply restrictions, Butz did everything he could to promote what he called “fence row to fence row” farming. Dissenters worried that the scheme would result in massive surpluses and a subsequent price collapse, but Butz insisted that global markets and a growing population would absorb surplus production, thereby avoiding the bust component of previous boom and bust cycles.

Just as conditions of the late 1920s and early 1930s had converged to give rise to the original ag-subsidy program, the early ’70s saw a chain of events that made Butz look like a prophetic genius. In 1972, Russia suffered an abysmal grain harvest; to meet demand, the Russians turned to the United States, where they were able to purchase two-thirds of America’s wheat reserves at a bargain price. (In order to ensure a favorable price, Russia had concealed the truly desperate nature of its crop failure. The deal would come to be known as the Great Grain Robbery.) Thus Butz’s assertion that international markets would buy up America’s grain surplus was proved—at least for one season—correct.

On the heels of the Russian deal, US food commodity prices went ballistic. Farmers were soon enjoying tremendous profits, and Earl Butz’s standing rose exponentially. But before long, something else began rising: domestic food prices. Sitting in the political hot seat, Nixon commanded Butz to orchestrate an immediate increase in grain production. Butz, who must have felt as if he’d just been handed the keys to America’s agricultural kingdom, used his newly minted credibility with the nation’s farm industry to make good on Nixon’s order. Thus began the era of never-ending American agricultural plenty.

What does any of this have to do with food safety, you may ask? Imagine a farm industry that is suddenly being encouraged to produce more grain (yes, corn is a grain) than ever before, more grain, in fact, than food producers know what to do with. After all, of the $113.6 billion in taxpayer-paid commodity subsidies doled out by the USDA between 1995 and 2004, $41.8 billion went to corn growers. That’s more than cotton, soy, and rice growers received combined.

No wonder that corn has become the poster grain of America’s agricultural industry; no wonder that, with subsidies flowing like corn kernels through a grain chute, this abundance quickly became the new normal. No wonder that the never-ending flow of commodity food ingredients dragged prices steadily lower, thereby encouraging food producers to dream up ways to utilize these bargain inputs. And no wonder that the primary building block of these new foodlike products is corn—most frequently, corn that has been refined into high fructose corn syrup (HFCS). The result: Between the early 1970s and 2008, annual US consumption of HFCS per capita rose from practically nothing (to be fair, it wasn’t invented until 1971) to 38 pounds.

Finally, consider this: In 1971, when Earl Butz took command of US agricultural policy, the obesity rate in the country was about 15 percent. Today, more than a third of Americans are classified as obese, and rates of type 2 diabetes have doubled since Butz’s appointment. Coincidence? Perhaps, but I rather think not, because it just so happens that type 2 diabetes is associated with the consumption of refined carbohydrates, and HFCS is about as refined as carbohydrates get. High fructose corn syrup does not deserve all of the blame for the doubling of these afflictions, which have killed and sickened millions of Americans over the past few decades (far more, I’ll note, than pathogenic bacteria have), but it sure as heck deserves a lot of it, a fact that is becoming clearer by the day.

In 2007, for instance, a Rutgers University study found that soft drinks sweetened with HFCS contained “astonishingly high” levels of reactive carbonyls, which are highly reactive (hence their name) compounds associated with unbound fructose and glucose molecules. Reactive carbonyls are not present in table sugar, where the fructose and glucose components remain bound together and are chemically stable. None of this would be a problem if not for the fact that reactive carbonyls are widely believed to contribute to diabetes—particularly diabetes in children.

More recently, a 2010 study by Princeton University demonstrated that male rats given access to water sweetened with HFCS became obese, while those fed water sweetened only with table sugar did not. The Princeton researchers could not say with certainty why HFCS appeared to cause obesity while table sugar did not, but the answer may again lie in the reactive carbonyls, which are more readily absorbed and utilized than the bound fructose and glucose molecules in sugar.

The upshot is that the trend of skyrocketing diabetes and obesity in this country is not expected to level off anytime soon. In fact, the total number of Americans with type 2 diabetes is expected to double in the next 25 years (but then, considering the number has already doubled from where it was a decade ago, we might well beat that projection by a wide margin). For most of us, that’s terrible news, because type 2 diabetes is associated with blindness, kidney failure, heart disease, and stroke, among other things you don’t want to deal with. In the boardrooms of the pharmaceutical companies that manufacture the drugs used to treat diabetes, however, it’s a case of turning lemons into lemonade. Really, really sweet lemonade: Already, diabetes drugs add $5 billion annually to the pharmaceutical coffers (never mind the loot those companies collect from selling the medications used to treat the disease’s complications).

And there’s this: Thanks to agribusiness subsidies, we’re actually paying many times over for the privilege of being fattened and sickened by corn. According to research conducted by the Cato Institute in 1995, every dollar of profits earned by Archer Daniels Midland’s corn sweetener operation costs consumers $10 in subsidy payouts.

The good news is that consumers seem to be catching on: Over the past decade, per capita consumption of HFCS has actually declined from its peak of 45.4 pounds per year, a decrease that can almost surely be credited to the rash of studies linking the sweetener to disease. The industry response has been nothing if not predictable: In September 2010, the Corn Refiners Association petitioned the FDA for approval to change the sweetener’s name to “corn sugar.”

Despite the slight drop in consumption of HFCS, the number of American deaths attributable to obesity and diabetes is staggering. In 2005, the latest year for which statistics are available, diabetes contributed to a total of 233,619 deaths; according to the Office of the Surgeon General, an estimated 300,000 American deaths can be attributed to obesity each and every year. That is 295,000 more than are believed to die from foodborne illness annually.
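
It is worth pausing on that arithmetic. The sketch below simply works backward from the figures above; the roughly 5,000 annual foodborne-illness deaths it arrives at is the value implied by the “295,000 more” comparison, not an independent statistic.

```python
# Quick arithmetic behind the comparison above. The foodborne-illness figure is
# simply what the "295,000 more" statement implies, not a separate estimate.

OBESITY_DEATHS_PER_YEAR = 300_000   # Surgeon General estimate cited in the text
DIFFERENCE_CITED = 295_000          # "295,000 more" than annual foodborne-illness deaths

foodborne_deaths_implied = OBESITY_DEATHS_PER_YEAR - DIFFERENCE_CITED   # 5,000
ratio = OBESITY_DEATHS_PER_YEAR / foodborne_deaths_implied              # about 60

print(f"Implied foodborne-illness deaths per year: {foodborne_deaths_implied:,}")
print(f"Obesity is blamed for roughly {ratio:.0f} times as many deaths annually")
```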

And there’s this: A peer-reviewed study published in the January 26, 2009, edition of the journal Environmental Health showed that nearly half the samples of HFCS tested contained mercury. That’s because in the HFCS manufacturing process, lye is used to separate cornstarch from the kernel, and one of the popular methods for making lye involves pumping salt brine through mercury cells. A second study, conducted by the industry watchdog group the Institute for Agriculture and Trade Policy, found that nearly a third of the 55 brand-name foods tested contained mercury. Of the products that tested positive, most contained HFCS.

Of course, it is impossible to prove that our nation’s agricultural policy has directly contributed to what is arguably the most deadly aspect of our food system: its utter lack of nutritive value. Correlation doesn’t mean causation and all that. But as Don Carr put it to me, “If they hadn’t built the subsidy system, it would be much harder to produce these cheap calories. We have essentially taken the risk out of producing certain raw ingredients, while at the same time accelerating the scale and consolidation of our food system in ways that favor the production of these crops.”

For the literally hundreds of thousands of Americans who suffer and die from diet-related afflictions, E. coli and salmonella are not the most palpable threats in their food. No, the most palpable threat in their food is the policy behind it, a policy that has given rise to a system of constant abundance that, even as it fills our stomachs to bursting and offers a false promise of wellness and short-term satisfaction, starves us of our long-term health.

This is the unspoken truth about food safety in the United States: Our food doesn’t even need pathogenic bacteria to sicken us. It does just fine on its own.