Upon a thousand hills the corn
Stands tall and rank and glossy green;
Its broad leaves stir at early morn,
And dewy diamonds drop between.
A myriad banners wave o’erhead,
And countless silken pennons fly;
The tasseled plumes bend low, ’t is said,
And only silken ears know why.
—from “The Growing Corn”
by Frederick J. Atwood
Perhaps because the United States is a nation of immigrants and pioneers, there is something in the American spirit that has tended toward exploration and innovation. And perhaps the immense openness of the prairies and plains made it natural that big ideas and big changes would be at home in the Heartland. Never before in the history of the world had so much changed so fast—and in the Midwest, a lot of that change was in some way related to corn. Trains, cities, stockyards, grain elevators, disassembly lines, crops becoming commodities, new machines and technologies: all these contributed to changing lives—and to greater and greater demand for corn. That demand called for additional changes, revolutionary ones in some cases. These changes essentially fall into the categories of breeding, feeding, weeding, killing bugs, altering nature, legislating, and questioning the changes.
As for questioning the changes: that refers to the pushback from those who have adopted a different paradigm, whether quality over quantity, sustainability, or some other approach that challenges the trends. Even within this group there are differences, such as between those who use no chemicals at all and those who use natural chemicals accepted by the government. Interestingly, this “counterrevolution” started well before the developments that changed farming so dramatically after World War II. Sustainable, organic, and alternative farming all have their roots in the early 1900s—because using chemicals was not the first farming practice to come under criticism. The destructive plowing practices that led to the Dust Bowl in the 1930s were being attacked even before the topsoil blew away in legendary dust storms. People have, in fact, been trying to improve and/or reform farming practices ever since settlers first spread across the Midwest.1 With so much land, “the way we’ve always done it” was suddenly not good enough. As a result, many of the problems farmers have today are improvements on the problems farmers had back then—but, more importantly, the history of trying to improve things suggests that people will keep trying to make things better. It’s just that not everyone has the same definition of “better.”
For millennia, and on all the inhabited continents, people have been exploring how things work and how they might be changed—most especially in the arena of food plants, since they are so vital to survival. However, during the 1800s, the application of science to agriculture exploded, both in Europe and in the United States. New scientific approaches to farming were explored. As had always been the case, observation was a key element of research, but now experiments were systematized—and done over and over. There was also far more sharing of information, so that more people could benefit from others’ work.
Isaac Funk, a German immigrant who came to Illinois in 1824 and worked his way up from poverty to become a successful farmer and landowner, made it something of a personal mission to spread scientific concepts that he believed would help both farmers and society as a whole.2 As a farmer, proponent of science, and later a legislator, Funk had an impressive impact on agriculture. However, his heirs would have an even greater impact on the Midwest. His son, Lafayette Funk, helped found the Chicago Union Stock Yards,3 and his grandson, Eugene, would become a key player in promoting the hybridizing of corn—the first major change discussed below.
Funk was not alone in his effort to spread new ideas about farming. John S. Wright was a dedicated urban booster, but he believed the growth of Chicago was tied to growing and improving farms, so in 1841 he began publishing a magazine called Prairie Farmer, which was filled with experts’ writings about irrigation, plant disease, fertilizer, pest control, and other issues key to farmers’ success. Wright’s magazine became well respected among farmers, and while it contributed to the farming boom after the Civil War, it also had the effect Wright had originally hoped for, which was making Chicago a center of trade.4
The drive for more knowledge about agriculture led to the creation of colleges across the Midwest that focused on training future farmers. These schools were often known as “land-grant colleges,” because they were funded by grants of land given to the states by Congress. The Morrill Act of 1862, signed into law by Abraham Lincoln, granted each state 30,000 acres of public land (or the equivalent in scrip, if land was not available) for each member of the Senate or House. States were to sell the land or scrip in order to build schools that would teach agriculture and mechanical arts, to meet the nation’s need for scientifically trained and forward-thinking agriculturalists and technicians.5 (Perhaps because the Civil War had started by this time, the grants also required military studies for the students at these schools, which is where the Reserve Officers Training Corps, or ROTC, had its genesis.)6
Sixty-nine land-grant schools were created as a result of this act, and while these agricultural schools span the nation, some of the best known of them are in the Midwest.7 Kansas State was one of the first of the land-grant universities established. Originally named Kansas State Agricultural College, it was founded in 1863.8 The Illinois Industrial University opened its doors in 1868—and welcomed women by 1871. The school’s offerings soon expanded beyond the purely practical, and by 1885 it was renamed the University of Illinois (Urbana).9
Iowa had been planning a college before the Morrill Act was passed, and it was the first state to accept the terms of the law, in September 1862. The Iowa Agricultural College and Model Farm welcomed its first students (twenty-four men and two women) in 1869. The school was renamed Iowa State University of Science and Technology in 1959.10 Classes at the Ohio Agricultural and Mechanical College began in 1873, with twenty-four students. The school changed its name in 1878 to Ohio State University.11 Purdue University in Indiana was founded in 1869, named for major donor John Purdue. Classes began in 1874, with thirty-nine students and six instructors.12 The University of Wisconsin (Madison) existed before the Morrill Act but later became a land-grant school. It was founded in 1848, women were admitted in 1863, and the school became a land-grant college in 1866.13 These are by no means all the midwestern colleges founded at this time, but they are perhaps enough to suggest the impact of the Morrill Act. It was, in fact, revolutionary, putting a college degree within reach of all high-school graduates who wished to pursue higher education.14
With the opening of so many agricultural schools and with so many students now taking a scientific approach to agriculture, things began to change. Several of the developments in farming and processing corn have already been described. But a lot of new “revolutions” still lay ahead.
In the 1850s, Gregor Mendel, a monk living in what is now the Czech Republic, began experimenting with inherited traits in plants, and so was born the field of genetics. It took a few decades for news of Mendel’s research to reach the scientific community on the far side of the world, but once it did, corn immediately became the darling of American geneticists. As scientist and corn researcher Dr. Stephen P. Moose notes, “Technically, genetics had been done with corn all along. That’s how we got so many varieties. But with Mendel, we entered an age of scientific genetic research. Corn was perfect for this type of work, because the male tassels and female silks are separated. You can control pollination. As a result, corn led the way in plant genetics research. For more than a century now, we’ve made one improvement after another.”
Serious research began at the University of Illinois in 1896, when scientists began discussing how to improve the protein content of corn for livestock feed. This was one desired change, but the improvement most sought as the 1900s began was making corn more reliable—and more abundant. Corn farmer and researcher Eugene D. Funk felt it was the responsibility of those who farmed to “do something for the rest of the world,” and feeding the world was where he wanted to start.15 After more than a decade of research, Funk wrote, “That little kernel, corn, capable of springing forth into a beautiful living plant and growing to a height of twelve or more feet within the short period of ninety days, and what is greater still, to be able to reproduce itself over 1,000 fold during one short season, surely we ought to talk more about it, to study its characteristics and habits until we have learned many things yet unthought of. The farmer of the corn belt has scarcely begun to realize the possibilities and necessities that lie before him in order to meet the future demands of corn.”16 He then went on to outline his own observations, including the fact that no two ears of corn were really alike and could, in fact, be quite variable—which made corn less predictable than one might wish. Was it possible to make ears of corn not only better, but also more reliably similar, with each ear reflecting the desired improvements? That was the question researchers now faced.
As is usually the case with major developments of any kind, a number of people were involved in pursuing the goal of improved corn and greater yield. Because of this, different sources often cite different people as being responsible for the next dramatic step to occur. All of them were involved, but there were a few key players behind an innovation that would change corn farming—along with everything touched by corn—dramatically.
That innovation was hybridizing. The discussion about corn that began at the University of Illinois in 1896 had included Dr. Cyril G. Hopkins. Hopkins hoped to determine whether corn could be developed in which the quantities of starch and protein differed from one variety to another. So he proposed a series of experiments to explore this idea.17 Everyone understood that, because corn is wind-pollinated, pollen could come from anywhere, which would explain why even with very careful selection, a farmer couldn’t really be certain what he’d get when he planted corn. Inbreeding had been attempted, with discouraging results. A young chemist at the Illinois Experimental Station, Edward M. East, told Hopkins he thought he knew how the problem could be solved. East put forth the idea of doing more than simply inbreeding maize; he thought they should go further and inbreed the inbreds. Tassels had to be enclosed in paper bags, to collect the pollen. Silks were protected by other bags, to keep them from being pollinated. Then each plant was pollinated with its own pollen. East wanted to establish several inbred lines, and then cross the inbreds with each other. The experts laughed.18
In 1905, East packed up his collection of Illinois corn and moved to Connecticut, where he tested his theories (and apparently benefited from the ideas of geneticist George Shull, who began inbreeding corn in 1906). East developed some solid theories, but it was one of his students, Donald Jones, a Kansas farm boy who moved to Connecticut to work with East, who took the corn experiments to the next level.19 Jones increased the level of inbreeding and produced an improved corn, but unfortunately, only about one in a thousand of his crosses would grow corn exhibiting the desired improvements. But at least inbreeding had been shown to be capable of producing vigorous corn with the most desirable traits preserved.20
As part of the pursuit of better corn, Eugene Funk, along with several family members, had founded the Funk Bros. Seed Company in 1901.21 Funk had the energy, passion, land, and financial resources to pursue improved grain, but he was not a scientist. So he set about interesting a scientist, Dr. James Holbert, in dedicating himself to corn research.22 In 1916, at the same time Jones was working in Connecticut, Holbert began the first corn-inbreeding experiments in the Midwest. He hoped not only to create higher-yielding corn, but also to produce corn that would be resistant to drought and diseases such as root rot. By the early 1920s, Holbert had developed his own double-cross hybrids at the Funk Farms Federal Field Station. This was the first hybrid corn grown in Illinois.23
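For readers unfamiliar with the term, here is a minimal sketch of what a “double cross” involves; the letters are generic stand-ins for four distinct inbred lines, not the actual lines used at Funk Farms. Two pairs of inbreds are first crossed to produce two single-cross hybrids, and those single crosses are then crossed with each other to yield the seed that is actually planted:

\[
\underbrace{(A \times B)}_{\text{single cross}} \;\times\; \underbrace{(C \times D)}_{\text{single cross}} \;\longrightarrow\; \text{double-cross hybrid seed}
\]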
George Hoffer visited Donald Jones in Connecticut in 1918 and was so impressed that he took some of the seeds Jones had developed back to Purdue College Farm. This would be the first hybrid corn grown in Indiana. Encouraged by the results, Hoffer launched new research.24
The first hybrid corn grown in Iowa was developed by Iowa native son Henry Wallace, who later became FDR’s vice president. Wallace’s father and grandfather had founded Wallace’s Farmer, an agricultural magazine, so it is perhaps not surprising that Wallace was interested in farming, but he later attributed at least part of his passion for experimentation with corn to his friendship with plant scientist George Washington Carver, who lived in the Wallace home while pursuing his master’s degree at Iowa State (where Carver was the school’s first African American student and faculty member).25 In 1919, Wallace visited Jones and East, and then kicked off his own intensive program of corn inbreeding.26 He was so enthusiastic about the potential shown by hybrid corn that in 1926 he incorporated the Hi-Bred Corn Company. The name was changed ten years later to Pioneer Hi-Bred, and it grew into the world’s largest seed company.27
Not everyone working on hybrid corn was a scientist. Lester Pfister had left high school at age fourteen to help his widowed mother with the family farm in El Paso, Illinois.28 He began his own experiments in 1925, stating, “I’ll be happy if I can contribute just a little that will take some more of the gamble out of farming.” When the Great Depression struck, the farm was in danger of being foreclosed, but Pfister knew that he had the corn that would change his life. He went to a bank for a loan and unwrapped a beautiful ten-inch-long ear of corn, as fine as any grown in the area. He then unwrapped a second ear, grown from his hybrid seed. It was fourteen inches long and just as beautiful as the first ear. He got the loan. The seed he’d developed, known as the 187 Hybrid, became one of the most important of the decade. The USDA released it to agricultural stations and private firms (including the major seed companies, Funk, Pioneer, and DeKalb) for testing, and it proved to be outstanding. Pfister was an overnight success. The Museum of Science and Industry conferred on him the title “Outstanding Corn Breeder of the World.” Pfister Seed joined the ranks of the giants in producing hybrid corn seed.29
Hundreds of inbred lines had been created, and researchers evaluated thousands of crosses. By the 1930s, reliable hybrids had been developed. Demonstration plantings were convincing many farmers to try hybrid corn, and in 1935, demand for hybrids in the Corn Belt exceeded production. The hybrid seed industry was on its way. However, to grow this controlled corn on a commercial level, fields had to be isolated and the tassels had to be removed from every cornstalk in the seed-bearing rows before they could release their pollen.30 (And because seed has to be produced every year, this job has never gone away. Detasseling corn is a tough job. Days are long, since pollen won’t wait, and the work is hot—especially because detasselers are usually covered from head to toe, to keep from being slashed by sharp-edged corn leaves. The job is demanding, because at least 99.7 percent of the corn must be detasseled, to keep seed from being contaminated. Detasseling, which compensates fairly well for the discomforts, has now become a classic summer job for thousands of high school and college students in corn-growing areas across the Midwest.)31
By the late 1930s, the USDA was promoting hybrid corn. Every state in the Midwest had experimental stations or universities doing research, and many farmers were eager to try the new wonder crop. In 1942, Iowa, always a top corn-producing state, became the first state to plant its entire corn crop with the new hybrid seeds.32
Richard Crabb, farm editor for the Moline Dispatch during the first half of the 1900s, relates in the foreword of his book The Hybrid-Corn Makers that when he asked farm adviser Frank Shuman what he thought of the then-new hybrid corn, Shuman responded, “Greatest food plant development in 500 years, greatest plant discovery since Columbus found corn itself.”33
What did all this inbreeding accomplish? Hybrid corn produced ears that were both longer and fatter than regular ears. The plants were vigorous; the kernels were well filled out. And every ear was very much like every other ear. When planting hybrid corn, one knew exactly what one would get. (Neither farmers nor researchers knew it at the time, but there was another huge advantage to the hybrids: the similarity of ear and kernel size would eventually make mechanical harvesting possible.) In addition to consistency, hybridizing provided a lot more corn. In Iowa, in 1900, yield was around thirty-eight bushels of corn per acre—and that was impressive by the standards of the day. By 1940, it was fifty-one bushels per acre.34 People were excited. Some considered it the country’s most important plant-breeding innovation.35
There was a downside, of course. Because of all the work that went into them, hybrids could be patented, so farmers had to buy hybrid corn, if they didn’t want to do their own inbreeding. Trying to save seed from hybrid corn didn’t work, because hybrid seed doesn’t breed true—the corn that grows will not be like the kernel that was planted. However, that had always been true of corn to a certain extent, and most people accepted the trade-off—paying for seed in exchange for large, handsome, reliable ears of corn that dramatically increased the number of bushels per acre that were harvested. Today, approximately 95 percent of the corn acreage in the United States is planted with hybrid corn.36
Research continued, and continues today. As Dr. Moose noted, there has been one improvement after another. Those 51 bushels per acre that were so impressive in 1940 had become 100 bushels by 1970 and 145 by the year 2000.37 Many farms are approaching 200 bushels per acre today. So the promise of increased yield has proven to be true.
However, some have their reservations about the benefits of this research. Might something have been lost in the course of these developments?
Dr. Walter Goldstein, agronomist, corn researcher, and founder of the Mandaamin Institute in Wisconsin, has focused his work on breeding corn for enhanced nutrition. He notes that corn has been bred to be a higher-yielding crop that will take more stress. “We’ve adapted corn to high population density,” Goldstein explains. “Companies have improved varieties so plants can be packed together. They can withstand the stress of crowding. They have also bred for synchronicity of flowering and silking, which improves seed set and limits cross-pollination. The inbred corn varieties are getting stronger and stronger, but the tassels are getting smaller and shedding less pollen. This may not sound like a problem, but it can be, and it’s especially bad in a drought year, because the stress of high heat or drought can interfere with the pollination process.
“Yields are increasing, but the quality is decreasing,” Goldstein continues.
There has been a constant decrease in protein content. If we compare our corn to what Native Americans were eating, theirs had a lot more protein, especially lysine and methionine, which are essential amino acids. Also, under conventional conditions, the root systems are not healthy. I’ve been studying root health across several states, comparing roots of conventionally raised corn to those of organically raised corn. There is about double the amount of root disease in conventionally raised corn. The corn compensates by growing more roots, which was not what we expected. Older races would have just fallen over. But new varieties adapt. However, with more disease, corn needs more fertilizer and is more sensitive to a lack of water—and needing more water is a big problem in drought years. I use organic growing methods, and my corn did well even during the recent drought. Manure helps reduce root disease, plus my corn didn’t need as much water.
The focus of Goldstein’s work has been on old-fashioned breeding techniques. He says he respects the productivity and stress resistance of the hybrids, but he is concerned about outcomes. “Older corn varieties taste better,” he states. “I want to get the quality back in corn. I believe it can be productive and reliable, but can also be more nutritious and tastier. We won’t be able to produce as much corn per acre, but we could produce more protein per acre. We can also produce more carotenoids, which, when fed to chickens, produce very orange/yellow egg yolks—which means more nutrition for the consumer. Carotenoids are more easily assimilated from eggs than they are from carrots.”
Goldstein is concerned primarily with corn consumed by humans and animals. High-starch corn is easily converted to ethanol, so corn raised for nonfood purposes would not need to reflect his methodologies—though ethanol production does contribute to shaping the debate. However, because such a tremendously large percentage of the corn the United States raises is consumed by humans and animals, Goldstein is determined to educate as many people as possible about what can be gained from breeding for taste and nutrition, and not just for high yield and stress resistance.
Minnesota-based organic-farming pioneer and educator Martin Diffley is involved in breeding sweet corn. Sweet corn is not raised in the same high-stress situation as field corn, but it can still benefit from an alternative approach. “People think organic sweet corn can only be raised for the local farmers’ market, but sweet corn for commercial use is being grown organically here in Minnesota.” Diffley’s corn is open-pollinated, and he encourages other farmers to cross-breed their corn with his. “Through five years of cooperative breeding with the Organic Seed Alliance, we’ve developed open-pollinated, early-maturing, cold-soil, vigorous corn. The pericarp is tender. The flavor is outstanding. My focus is on helping the public get better food, so I really want that public to continue breeding, using what we’ve developed.” He notes that he prefers the hands-on aspect of farming over a laboratory approach. “The best part of an organic farming system is observing and working with it, and living with a holistic system.”
In the early 1800s, concepts such as crop rotation and fertilizing, though they’d been in practice for millennia on farms in the Old World, were not widely known or practiced in recently settled territories. On the frontier, settlers were often new to farming. They made an effort based on things they’d heard or read in almanacs, but with the wilderness still stretching out ahead of them, following the tradition of the Native Americans and simply moving on to new land sometimes seemed like the most viable option when the land they were farming became exhausted. For those who did fertilize, there were compost, ashes, and manure—with manure as often as not spread simply by releasing livestock into the cornfields to graze after harvest. By the mid-1800s, the first commercial fertilizers appeared, often in the form of bird guano brought in from South America or phosphate rock mined in South Carolina. (Of course, the growth of towns was a key element in the increase in commercial fertilizer use, as there had to be someplace selling it before people could buy it.) Then, with the rise of interest in scientific approaches to agriculture in the late 1800s, more options arose as more experiments were done and more attention was focused (by new universities and research centers) on how to keep the land fertile.38
Soil science was well enough developed by 1880 that soil testing, to evaluate nutrients, was becoming part of planning fertilizer use. Soil scientists worked to develop standards for determining nutrients in synthetic fertilizers. Farmers were buying fertilizers that contained primarily phosphates and potash but had little nitrogen. This was actually an economic decision, rather than a lack of understanding—nitrates, the form in which fertilizer then supplied nitrogen, are fairly rare in nature, so nitrogen was the most expensive of the three key fertilizer ingredients. (Bird guano contained nitrates, but mining it in Peru and shipping it to the United States did not make it a cheap alternative.) Not until World War I was a process developed that could produce abundant nitrogen fertilizer, in the form of ammonia, using hydrogen derived from methane, a major component of natural gas. (Plants can absorb nitrogen directly from ammonia.) It was, in fact, the same process used to produce the nitrates needed for explosives during the war.39
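For readers curious about the chemistry behind that process, here is a minimal sketch, simplified and omitting the catalysts, high pressures, and intermediate steps of the industrial versions: methane supplies hydrogen through steam reforming, that hydrogen is combined with nitrogen from the air to make ammonia (the Haber-Bosch synthesis), and the ammonia can in turn be oxidized toward the nitrates needed for explosives (the first step of the Ostwald process):

\[ \mathrm{CH_4 + 2\,H_2O \longrightarrow CO_2 + 4\,H_2} \qquad \text{(hydrogen from methane)} \]
\[ \mathrm{N_2 + 3\,H_2 \longrightarrow 2\,NH_3} \qquad \text{(ammonia synthesis)} \]
\[ \mathrm{4\,NH_3 + 5\,O_2 \longrightarrow 4\,NO + 6\,H_2O} \qquad \text{(first step toward nitrates)} \]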
By the 1930s, more people were using commercial fertilizer, but use was still not widespread. The farmers who did add nitrogen, phosphorus, and potassium to the soil found that it boosted their yields. However, corn prices were down, so there seemed to be little point in spending money on fertilizer to get more corn, when that additional corn wouldn’t sell for enough to cover the cost of the fertilizer.40
During the Great Depression, the government stepped in. The Tennessee Valley Authority (TVA) was established by Congress in 1933. The heart of the TVA’s mission was to build dams along the Tennessee River to help control flooding and to produce low-cost electricity.41 Electricity was one of the things needed to produce nitrates. The National Fertilizer Development Center was established at Muscle Shoals, Alabama. When World War II started, the Muscle Shoals facility began producing nitrates for munitions, but it also began to sell nitrates to farmers in 1943. When the war ended, the TVA became the country’s primary research-and-development center for fertilizer. It began selling ammonium nitrate, the first nitrate fertilizer widely available to farmers. Scientists at the TVA facility continued to do research, often sharing their findings with private fertilizer companies.42
Older farmers were less likely than younger ones to jump into using the new fertilizers. However, by the 1950s, the evidence that commercial fertilizer improved yields was so great that the number of those accepting it began to increase. After all, switching to hybrid corn had been effective, so why not try this new product of science? Another reason to try the new fertilizers was that people were running out of alternatives. Farmers used manure (and most still do, if they have access to it), but not enough manure was being produced by livestock to supply all that was needed.43 As more farmers tried the commercial fertilizers, word began to spread. In 1953, about 69 percent of the farmers in Iowa were using the new fertilizer, and of those, 87 percent said they thought the fertilizer had benefited their crops. Reports like these increased interest in fertilizer.44
As more farmers accepted commercial fertilizer, more of the key components were produced. The United States was for many years the world’s leading producer of nitrates. However, as the price of natural gas increased, making the methane needed for producing nitrates less affordable, U.S. production of nitrates decreased, and the country now imports most of its nitrate fertilizer. It also imports much of the needed potash. However, phosphates remain abundant, and the United States still exports those.
Research and experimentation continued, with both scientists and farmers trying to figure out what worked best. In the 1950s, overuse of nitrates, or using nitrates at the wrong time, created pollution problems. Two changes helped reverse this. First, it was demonstrated that “side dressing,” applying the nitrates after the corn was growing, meant nitrates were absorbed by the plants rather than leaching away into the water supply. Second, the increased precision of measurements of soil nutrients—not just how much, but exact location in the field—allowed farmers to reduce the amount of fertilizer used.
Use of commercial fertilizer continued to increase until about 1980, and then it leveled off. It had created the unprecedented ability to grow crops on the same fields year after year. Most corn farmers still rotate corn and soybeans, but they rely on fertilizer to enrich the soil. However, though continuous use of fields offers the promise of providing enough food for the world’s increasingly large (and increasingly urban/nonfarming) population, the practice has led to problems, such as an increase in plant disease.45 Once again, the choices are not easy.
Martin Diffley, who has forty years’ experience in organic farming, describes how feeding takes place in an ideal organic system:
If an organic farmer employs a biological approach, he or she can feed the soil naturally, reestablishing its health. You need three years of cover crops such as hay, grass, and legumes before you plant “heavy feeders”—plants that take nutrition out of the soil, such as corn, broccoli, potatoes, or cabbage. You incorporate the cover crops into the soil and let them decompose, and that helps feed the soil. Add compost or manure, and your soil is ready to grow crops for two or three years. Then you need to rest the land and start again with the cover crops. The nice thing about cover crops is that animals can forage on them. So you feed the livestock and the soil at the same time. This is how farming was done before World War II. Conventional farms are now largely dependent on synthetic nitrogen sources. Rotating with soybeans helps, but not enough.
Researcher and organic farmer Carmen Fernholz encourages an even more robust crop rotation. “For cover crops, we use small grains, such as wheat or rye, tillage radish, and hairy vetch. In North Dakota, they’re using clovers, turnips, cowpeas, and sunflowers as cover crops. These will all pretty much just get plowed under. If we can improve the soil, we can improve everything else.”
While people had been using substances to feed plants or kill bugs for millennia, only after World War II was there an alternative to pulling up or plowing under weeds.46 For the nonfarmer, it might be hard to appreciate the issue of weeds, but it was, and is, a big one. Weeds rob the ground of moisture and nutrients. They choke out plants one wants to grow or create shade so no sun gets through. They are often toxic to livestock. For all these reasons, weeds have to be eliminated, or at least reduced, but that traditionally involved a huge amount of labor. The lost crops, sickened livestock, and additional labor meant huge expenses for farmers. Toss in the fact that the draw of the big city and the two world wars made it increasingly difficult to get the needed labor, and farmers were ready for an alternative to physically removing weeds. Fortunately for farmers, the University of Chicago had been working on potential chemical weapons during World War II, and while these chemicals never got used during the war, they turned out to be a magic bullet for harried farmers. And it was not a minute too soon. With farmers fighting overseas, farms that had previously been painstakingly weeded and absolutely clean had become completely awash in noxious weeds. As Wallace’s Farmer magazine had put it, “Weeds Won in War Years,” but now the tide was about to turn against the weeds.47
The United States had gone from the hardships of the Great Depression straight into the rationing of World War II, so everyone, including farmers, was excited about the possibility of a return to abundance. A solution to the devastating problem of weeds was welcome. At least, most welcomed the idea. Not everyone jumped on herbicides right away, but something had to be done. In Iowa, a law passed in 1947 stated that farmers were required to control weeds, not only in their fields, but along fencerows and in roadside ditches, to reduce the amount of weeds going to seed. However, herbicides were not required, just weed control. Most experts actually promoted a balanced approach of using both cultural weed control (that is, crop rotation and cultivating fields to physically rip up or plow under the weeds) and chemical control.48
The promise of herbicides was not an empty one. Iowa farmer Tom Decker states, “While a cultivator was doing well to rip out half of the weeds, with herbicides, if applied correctly, we have the potential of killing 100 percent of the weeds.” With more weeds gone, production exploded. The increase in yield per acre was even greater than that offered by hybrid corn. By 1982, herbicides were being used on about 95 percent of American cornfields.
As people became more aware of the problems of soil compaction, and, in the 1980s, as the EPA began to recommend no-till farming on erodible land, herbicides became even more important. If one wasn’t allowed to till the land before planting, there was no way to get rid of the weeds that sprang up in the spring. Herbicides solved that problem. A farmer could walk through the field to spray the weeds a few weeks before planting. Then, when the planters rolled through, a field of dead weeds provided cover for the soil and, as time progressed, added nutrients.49 No-till farming reduces the amount of chemical runoff. It also protects against erosion. So making no-till possible was a noteworthy benefit of herbicides. That said, there are drawbacks. Herbicides kill plants, so one must be careful where they are sprayed, to keep crops from being accidentally damaged. Also, they can cause damage to wildlife habitats if sprayed indiscriminately—which is why training classes on safe application and licensing are required for some herbicides. The objective of most people in agriculture is to protect both the crops and the environment.50
Scientists continue to work to make herbicides safer for humans, wildlife, and the environment. The most popular one in use today is Roundup. University of Wisconsin agronomy professor William Tracy, an organic-farming advocate, notes, “The herbicide Roundup replaced other herbicides, some of which were more toxic, and if chemicals are going to be used, Roundup is not a bad choice.”
Tom Decker adds that pain was another, though smaller, consideration for moving to herbicides:
Driving a cultivator through the field, you have to be careful not to tear up the crops as you’re tearing up the weeds. Farmers had to hang over the side of the tractor, watching the rows as they towed the cultivators. You do that for a few hours, bouncing up and down at the same time, and you ruin your back. Some farmers actually enjoyed cultivating, but it was rough work. Of course, we also appreciate getting rid of more weeds. Plus, we’re concerned about compacting the soil, which happens when you keep driving back over the field—compaction is a big issue, as is erosion. The way we farm today is much better for the soil. Still, pain and loss of mobility were definitely part of what made herbicides attractive.
Martin Diffley notes, “Cover crops help with weed control, too. Up until the 1960s, most farmers used hay and alfalfa to manage weeds. If you grow hay and cut it, and then grow corn, you have fewer weeds. Some still do this, but more and more conventional farmers rely on sprays.
“Unfortunately, in terms of weeds and insects, nature adapts,” Diffley adds. “Nature soon gets around what you’re doing to stop it. That’s not as much of an issue with organic farming, because you keep changing things. Weeds don’t have time to adapt. Also, as soil becomes healthier, you have fewer problems. The conventional approach is similar to the way we approach medicine these days. We treat symptoms rather than creating health. And just as overuse of antibiotics has created resistant germs, so the overuse of chemicals on farms is creating resistant weeds. Again, that problem doesn’t occur with organic farming practices.”
There are more than seven hundred kinds of insects that can do serious damage to crops. While farmers face a number of potential issues, in some locations the biggest threat to corn farming is insect infestation.51 Farmers had faced plagues of insects from the first days of farming in the Midwest, but in the twentieth century the European corn borer became the corn farmer’s worst enemy. The borer arrived in Massachusetts in the early 1900s, probably having hitched a ride on broomcorn imported from Hungary or Italy. The borers are the larvae of a type of moth, and once the adult moths were on the wing, they could easily spread on their own. Within a few decades, they had swept across most of the breadth of corn country, crossing the Mississippi by 1943.52
A wide range of weapons had been deployed against insects, including sulfur, tobacco juice, soapy water, vinegar, lye, and turpentine. However, these worked only for some insects. During the Great Depression, arsenic and a couple of plant-derived insecticides (rotenone and pyrethrum) became popular. Unfortunately, arsenic, though a naturally occurring substance in the environment, can, in large amounts, kill animals (including humans), and rotenone, while also natural, kills fish if introduced into lakes and streams. So people kept looking for alternatives. As was true for fertilizer and herbicides, it was World War II that led to the next big developments in insecticide.53
In 1939, in Switzerland, the Geigy Company developed dichlorodiphenyltrichloroethane, more commonly known as DDT. It was used widely during World War II, especially to kill lice. It stopped a deadly typhus epidemic in its tracks in 1943. Iowa State extension staff began introducing it in 1945 as an ideal way to battle agricultural pests. It was miraculously successful, and farmers were hooked. Tests showed that DDT destroyed 85 to 100 percent of the borers in treated fields, resulting in yield increases of up to 25 percent. Farmers began to rely on insecticide almost to the exclusion of traditional methods, such as crop rotation. However, by 1950, farmers were beginning to see resistant pests developing, ones not killed by insecticides. Then, government researchers began to find insecticide residue in meat and dairy, due to livestock being fed silage made from plants sprayed with DDT. While no problems had been witnessed, these findings obviously raised concerns about the potential for harming consumers.54 DDT also proved to be deadly for some species of birds. It is a pity, really, because nothing else has been so remarkably successful. It was inexpensive, killed a wide range of destructive insects, and had no apparent adverse effects on humans.55 In fact, even swallowing DDT had little effect. People who actually swallowed large amounts of DDT became excited, with tremors and seizures, but those effects vanished as soon as exposure to DDT stopped. Research in which people took small doses of DDT by capsule every day for eighteen months showed there were no effects at all.56
DDT is still widely used in Africa, for malaria control, because it is also splendid at killing mosquitoes.57 But harm to birds and concerns over residue led to its being banned in most places—which should be encouraging, as it demonstrates the concern for health that balances the enthusiasm for killing pests. (There are still those who point out that DDT was used with wild abandon by everyone, not just farmers—in neighborhoods, on playgrounds, anywhere there were lice, mosquitoes, or other pests—and sprayed in huge amounts. There was even a DDT product [Flit] for use in the home. One wonders whether today’s care in applying insecticides might have given DDT a longer life expectancy. But by the time it began appearing in groundwater, public concern was such that eliminating it seemed the only option.)
With DDT out of the way, new chemicals came along. Other chlorinated hydrocarbons, besides DDT, were developed. They became popular in the battle against rootworms. Extension professionals from universities encouraged crop rotation as the preferred method of controlling rootworms, but in the 1950s, everyone (and not just farmers) believed chemicals were the wave of the future. So farmers turned to chemicals, instead of crop rotation, with the result that they soon saw resistance in rootworms.58 Organophosphates, close relatives of nerve gas, were up next. Of these, only Malathion has not been banned, and Malathion is used only in small-scale applications, such as home gardens.59 In the mid-1960s, there were too many incidents of farmers and livestock being poisoned by organophosphates, even with the special training and special handling of these more toxic chemicals, to let this category of insecticide remain in widespread use.60
Insecticide safety became a matter of serious and widespread concern. Government agencies at both the state and national level began getting involved in regulating the production and use of insecticides. In 1964, the Iowa Pesticide Act required specialized training and licensing for commercial insecticide application. In 1970, the USDA cancelled the registration of DDT, which made it illegal to use on most crops.61 Careful oversight of insecticides continued, with new regulations created with each new development. One of the main reasons corn-borer-resistant corn was welcomed so warmly onto the scene is that it reduces the need for insecticides.
Martin Diffley explains that crop rotation can contribute to controlling insects, as well as to building soil quality. “Conventional farmers rotate between corn and soybeans. That’s not enough. Insects have developed now that can wait out the single season of corn versus soybeans. In an organic system, you need to plan a long-term rotation. We still have insects, but as the soil gets healthier, we have fewer problems.”
While some organic farmers choose to avoid all chemicals, some available insecticides are considered acceptable for organic farming—the insecticides just have to be made from substances that occur naturally. The USDA keeps a list of which substances can or cannot be used if someone wishes to label crops as organic. Synthetic pesticides made with naturally occurring ingredients can also be used. Because even acceptable organic pesticides include toxic, albeit natural, substances, the amount used can make a difference, as far as safety goes. There is a debate as to whether it is safer to use a small amount of synthetic insecticide versus a huge amount of natural insecticide—but that question sometimes is answered by practicality: one can’t call crops organic unless approved insecticides (or no insecticides) were used.62 (Of course, this also means that, even if you buy organic vegetables, unless you know the farmer and know how he avoids pests, you should wash the produce well before eating it.)
There is one fairly effective natural insecticide that the organic world is divided on: tobacco. It has been used since colonial times to kill pests. It is natural and viewed by some scientists as offering the potential of mass production, for use in place of synthetic pesticides. Nicotine is toxic and therefore makes tobacco an effective insecticide. It is being called “green” and “eco-friendly,” but not by everyone. Fortunately, even with nicotine removed, the oil that remains once tobacco is processed is still fairly useful at killing pests—both insects and fungi.63 One hopes that the nicotine-free oil is sufficiently effective, because while nicotine is being called “organic” by some, many organic farmers don’t think nicotine (which kills beneficial insects, too) is such a great idea.
Among insecticides, one “ingredient” has a dual role: Bacillus thuringiensis, aka Bt. It is one of a number of bacterial insecticides. A pricey option, Bt is not toxic to humans, birds, or other animals.64 Sold as a powder or a spray, it can be used to control mosquitoes, black flies, and the larval stages of moths—such as corn borers—without harming beneficial insects.65 That’s one role. The second role is in GM (genetically modified) corn. This is the same Bt that offers the protection built into Bt corn. In the first role, it’s considered organic; in the second, it’s not.
Other nonchemical and nonbacterial insect-defense systems include oil sprays, insecticidal soaps, and pheromones, which lure insects into traps. While many organic farmers make use of these approaches to pest control, they are generally too costly to use on large fields. Synthetic insecticides are cost-effective and protect large areas. They make abundant food available at affordable prices, but the potential long-term effects are unknown and therefore remain a concern.66 Then there is the robust rotation cycle recommended by Martin Diffley and Carmen Fernholz. These are the options currently available, but researchers, both organic and conventional, continue to look for better ways to protect crops and consumers alike.
Genetically modified (GM), also known as genetically engineered (GE), plants have created another revolution. They have increased crop yields, reduced the amount of pesticide used, and enabled the reduction of erosion—clearly astonishing benefits for both farmers and the environment. Benefits to the general population include lower prices for food and drugs and improved nutrient composition of foods.67 Yet GM/GE crops have also created a firestorm of debate. If it’s created by humans and not nature, can it be good? Is it safe for human consumption? Of course, people regularly put in their bodies things created by humans, from medicine to processed foods. But somehow, messing with genetic code takes it to a new level, and people worry. Much of the discussion on GM/GE corn is in Chapter 15, “Questions, Issues, and Hopes for the Future,” but the process was sufficiently revolutionary to warrant mention here, as well.
University of Illinois corn researcher Dr. Stephen Moose points out that genetic manipulation of corn is not new—it has been going on since the days of the early Native Americans in Mexico who created maize out of teosinte. In a way, corn only exists because that “jumping gene” in teosinte genetically modified itself, inserting DNA code where it didn’t belong to create a mutant that eventually became maize.68 However, the nature of manipulation has changed in recent years, as Moose relates:
Beginning in the 1990s, corn was among the first major crops used in the emerging field of biotechnology. Rice, being vital in so much of the world, has also been studied, but here in the Midwest, we were, of course, going to focus on corn. Today, with genome/DNA science, we understand corn better.
For thousands of years, people manipulated corn without really understanding it. The last hundred years, we’ve been working from knowledge. We now have the ability to improve corn, to make sure we continue to have corn, and to continue to increase the diversity. We don’t want to end up with just one kind of corn. Depending on what we need, we can find a corn that is well suited for the purpose. We even have corn for cold-weather locations. Gaspé Flint, which grows in Canada, matures in five weeks, so it’s ideally suited for places with shorter summers. This corn derives from the old flint corns of New England. We have now identified the major gene that lets this variety mature so quickly. We can use that knowledge to feed people in colder climates.
Despite his enthusiasm, Moose acknowledges that not everyone is comfortable with the new technology.
The reactions to newer, genetically modified foods are largely based on fear, rather than science. Five major genes were altered to turn teosinte into corn. We haven’t done anything that dramatic. That said, there are issues, simply because we depend so much on corn. Still, I think people understand that biotech has improved corn—in fact, all plant science has benefitted from corn science. It is corn’s importance to the United States that led to the government funding corn-genome research. There was, in fact, a little bit of an international race to sequence the DNA of key plants worldwide. We helped others with their sequencing, but corn was ours.
Martin Diffley’s reaction to GM corn is that it’s counter to the natural process, and it has limitations. “It also doesn’t really solve any problems, long term. Plus, the cornstalks don’t break down. They’re so strong, they puncture the tires on our tractors. When we try to convert a field of GM corn into an organic field, we have to use moldboard69 plows, because the corn residue is not breaking down like it should. It also doesn’t sop up water. We used to use cornstalk trash (pieces of cornstalk left behind after harvest) to soak up water in wet areas. We can’t do that with GM corn.”
Because so many industries could benefit from GMO research, it is unlikely it will go away. It is being studied as a way of producing clean fuels, and it shows promise as a way to stop the spread of disease in underdeveloped countries. However, there are risks, the primary one being that nature adjusts to changes. Kill most germs, insects, and weeds, and the ones that survive breed into resistant strains. Changing some aspects of a plant might have unintended consequences, such as changes in growth rate (though that is sometimes viewed as a positive) or in its response to the environment. A potential risk to human health is the possibility of exposure to unexpected allergens (if a gene from an allergen is transferred into a previously “safe” food plant). Although statistically unlikely, the evolutionary mechanism by which some bacteria transfer traits to other bacteria (horizontal gene transfer, rather than the vertical gene transfer from parent to offspring) could become an issue if it increased resistance in bacteria. However, a 2013 article in Scientific American points out that even naturally occurring changes can (and do) present these same possible outcomes and suggests that safety is more likely to lie in the direction of continued testing, rather than in the avoidance of new technologies.70
One study, which has since been challenged, showed under laboratory conditions a potentially higher mortality rate among the larvae of monarch butterflies if they fed on milkweed heavily covered in pollen from corn-borer-resistant corn—more heavily covered than would occur in nature, other scientists have stated. (One might ask if the larvae would have been healthier if the milkweed had been sprayed with the insecticide that is no longer needed thanks to the corn-borer-resistant corn.)71 The vast majority of testing, however, has failed to turn up problems, and it is rigorous testing. As plant molecular geneticist Alan McHughen notes, it is unlikely that most conventionally bred crops would make it to market if they had to pass the tests GM foods do.72
Another problem, though one that is primarily a concern for organic farmers, is that corn is wind-pollinated. Normally, a corn plant would pollinate only the plants in the immediate area, but on a windy day, a field of GM corn can potentially pollinate corn in adjacent fields. Fortunately, hybrid corn produces less pollen than open-pollinated corn, which reduces the danger somewhat. In the Midwest, most hybrid corn is detasseled, which comes close to eliminating the likelihood of cross-pollination. Still, it is an issue for organic farmers, both because they want to protect the purity of their plants and because the government has tightened regulations on what can be called organic and tests corn more often, to see if any GMO genes have strayed into the organic population. (Fortunately, for those who are concerned about genetically engineered foods, this careful regulation of organic produce offers an easy and effective way to avoid GMOs.)
Because of the issues and fear of possible risks, researchers watch carefully for signs of emerging problems. However, so far, the benefits have been sufficiently impressive that most people have been willing to accept risks that remain theoretical and as yet unproven. Only time will tell if they continue to feel that way.
This will by no means be an exhaustive look at legislation affecting farming in general and corn in particular. There has simply been too much to cover it all, dating back to the earliest days of our country. From taxes on corn whiskey in 1791 to laws concerning everything from overproduction to the use of ethanol today, the centrality and absolute necessity of agriculture have always kept it in the legislative line of fire. So this is simply a brief review of a few turning points and key bits of legislation that underscore the importance of farming and of corn in the United States.
Of course, the government’s making land available had a foundational impact. While people had begun wandering westward on their own, the widespread settlement of the Midwest was facilitated by everything from land grants given as payment to soldiers who served in the Revolutionary War and the War of 1812 to the various settlement and homesteading laws. The land grants that created many of the nation’s top agricultural schools were another key piece of legislation.
The explosive growth of railways, grain elevators, and the city of Chicago triggered legislative activity following the Civil War. Railroad companies recognized that farmers across the Midwest needed them, but some needed them more than others. The presence or absence of competition in a town could make a dramatic difference in the prices railroads charged. Add to that the fact that using a specific railway line tied a farmer to that line’s grain elevators, and not all grain-elevator operators were fair in charging for their services. The backlash from these abuses gave rise to a famous case, Munn v. Illinois (Ira Munn being Chicago’s leading grain-elevator operator in 1877), in which the U.S. Supreme Court ruled that facilities such as grain elevators were “clothed with a public interest”—that is, the government had a right to regulate transactions that had an impact on the public at large. Around the same time, regulations were also passed that required railways to create fare structures that offered the same prices, regardless of whether there was competition in an area. These rulings had a tremendous impact, with far wider ramifications than just helping farmers get corn to Chicago in the 1800s.73
During wartime, food came even more under government supervision. “Food will win the war” was the cry from Washington, D.C., during World War I, as the government faced the problem of feeding Europe as well as the United States. People were urged to grow as much food as they could in backyards and on vacant lots, but the real focus was on trying to increase agricultural yields in the states of the Midwest. However, the war created terrible labor shortages.74 The United States Food Administration worked with the USDA to distribute information and materials that would help people make the most of limited resources.75
Heading up the Food Administration was Iowa-born Herbert Hoover, a widely traveled civil engineer whose astonishing life of hardship, adventure, and danger—including having a few brushes with war and operating relief efforts in both China and Belgium—made him the perfect choice to encourage Americans to tighten their belts to help with the war effort.76 Hoover convinced President Woodrow Wilson that food policy would be critical, as was the voluntary involvement of Americans in whatever was planned. Wanting to be a volunteer among volunteers, Hoover refused to receive any salary for this position. There were paid clerks, but the vast majority of the Food Administration was run by volunteers, eight thousand of whom worked full-time. An additional three-quarters of a million people, primarily women, volunteered in committees across the country. Among the several objectives of the administration, the primary one was saving food to send overseas. In order to accomplish this, Hoover encouraged people to observe “Meatless Mondays” and “Wheatless Wednesdays.” Those Wheatless Wednesdays caused a surge in demand for corn. Suddenly, all the nearly forgotten cornmeal recipes that had been cherished just a generation or two earlier became the key to putting food on the table, and people for whom cornbread had never been part of their culture found themselves relying on it for the duration of the war.77
The Great Depression led to the next big wave of farm-related legislative activity. Farms were at the leading edge of the economic disaster, and it looked as though half of America’s farmers would face bankruptcy; new farm policy helped reduce that number to “only” a quarter of all farmers. Herbert Hoover, who became president in 1929 and inherited the already collapsing agricultural scene, worked hard to pass legislation that would help farmers. Because Congress was against him, he lost more battles than he won, but Hoover still had a significant impact on farm policy, including the concept of cooperative marketing—the foundation of the agricultural cooperatives that are so key to many corn farmers today. That concept was embodied in the Agricultural Marketing Act of 1929, which created the Farm Board. The board had a different director for each major crop, including one for corn. Minimizing the impact of the Depression on the country’s farmers was the Farm Board’s key goal. It made money available so farmers could buy seed to grow crops. It established marketing cooperatives. And it tried to rein in the tendency of desperate farmers to grow more grain than could possibly be sold. (Overproduction was to become a recurring problem for most agricultural products, not just corn.)
Recovery wasn’t moving quickly enough for struggling voters, and in the 1932 election Hoover was replaced by Franklin Delano Roosevelt. Roosevelt understood the importance of farmers to the country and continued much of the work of the Hoover administration. However, his efforts sometimes had mixed results. The Agricultural Adjustment Act of 1933 created both confusion and controversy. It introduced a new monetary policy, inflating domestic prices and permitting silver, not just gold, as backing for currency. More controversial still was the plowing under of crops and the slaughter of millions of pigs. The hope was that slaughtering pigs would raise pork prices, indirectly helping the farmers who grew corn for pig feed. The outrage over the waste led in 1935 to the creation of the Federal Surplus Commodities Corporation, which distributed surplus farm produce to those in need, so that prices for crops would remain higher without crops and animals being destroyed. However, it was the heat and drought of the mid-1930s, not government policy, that ended the problem of overproduction, at least for the time being.78
Farmers and the government continued to look for ways to make things work, and some options did more good than others. Many of the farmers I talked with who were alive during the 1930s cited one government program as having had the most positive effect on their lives: the Rural Electrification Act. This piece of legislation, signed on May 20, 1936, was designed to help rural Americans close the gap between their lives, which remained stuck in the pre-electric age, and the lives of those in towns and cities. It permitted the federal government to make low-cost loans to farmers who would work together to bring electricity to rural America.79 The farmers formed nonprofit cooperatives that purchased the poles, wires, and generators and strung networks across the broad landscape.80 Light bulbs replaced gas lamps as the descendants of homesteaders were happily catapulted into the twentieth century.
Price supports, which began in the 1930s, have continued, as has the struggle to balance overproduction against shortfalls. The major goals of government assistance for farmers have generally been to protect the family farm, provide a reasonable income for farmers, and ensure an uninterrupted and affordable food supply for consumers.81 However, policies, markets, and production have remained something of a roller coaster.
In 1956, the Soil Bank took 25 million acres of land out of corn production. This was reversed in 1973, when a hungry USSR and China, among other nations, developed a greater need for grain. Suddenly, “overproduction” of corn was an economic necessity. In 1976, Secretary of Agriculture Earl Butz, a graduate of Purdue University with a doctorate in agricultural economics,82 wrote an article titled “Corn Holds the Key to Affluence,” in which he stated that “corn, a golden grain, is as important to our wealth as gold itself.” The price of corn dropped, and more countries began importing American corn. But exports peaked in 1979 and began to fall, and grain prices slid again. This led in 1983 to the Payment In Kind program, or PIK, which took 78 million acres of land out of crop production.83 Corn production dropped by 38 percent, but the government saved money on storage costs, and, because prices for corn rose, farmers actually saw an increase in income.84 Of course, an increase in corn prices means an increase in the cost of things dependent on corn.
The government instituted an Export Enhancement Program and a Conservation Reserve Program (for land where erosion was an issue). For a while, growing less corn seemed to be working. (Worth noting is that even with less corn being grown, the Corn Belt was still producing stunning amounts of it.) Then, in 1988–1989, the Soviet Union and Japan made huge purchases of corn.85 And so it has gone, boom and bust. An increase in alternative uses for corn (plastics, fuel, etc.) increases demand, and then a drought, such as the one in 2012, reduces the amount of corn grown. Up and down it goes, as the government works to maintain balance and react to changes. Farm policy changes frequently, but at its heart its goal remains helping farmers stay on their farms and making sure the rest of the country has abundant, safe, and affordable food.86
On the whole, despite adjustments and fine-tuning of thinking and legislating about agriculture, the country still lives with the paradigm Earl Butz suggested: more people will buy corn if it is cheaper, so one makes more money by growing more corn.87 Even in years when drought or government policy cuts back on what is produced, the United States grows a lot of corn.
As Butz notes in the movie King Corn, “[Corn] is the basis of our affluence now, the fact that we spend less on food. It’s America’s best-kept secret. We feed ourselves with approximately 16 or 17 percent of our take-home pay.” Butz’s statistic actually included meals eaten in restaurants; for those who do not eat out regularly, the percentage is considerably lower. It has also continued to decline since Butz made that statement. Depending on the sources consulted, Americans now spend between 6 and 9 percent of their income on food, the least of any country in the world. (In a few countries, such as Canada or the United Kingdom, expenditures are only slightly higher, but many countries of the world spend 25 to 40 percent or more of their income to feed themselves.) This, more than anything else, is responsible for Americans having so much disposable income. Americans are rich because of corn.88