For hundreds of miles the tall corn springs in a jungle of undeviating rows, and the stranger who sweatily trudges the corn-walled roads is lost and nervous with the sense of merciless growth.
—from Arrowsmith, by Sinclair Lewis
Hogs were long considered a great way to “transform” corn into a marketable commodity. However, feeding livestock was not the only transformation the grain was destined to undergo, and many of the others have had a surprisingly great impact. Most people have heard that transformed corn is everywhere today, but some transformations date back to early colonial days. Alcohol, corn oil, cornstarch, corn syrup, snack foods, breakfast cereals, and an array of other corn-based products have emerged over the course of the country’s history. The ease with which corn is altered is one more reason corn shaped our lives and foodways.
Other than feeding pigs, the most widespread way to transform corn for the first three hundred years of the country’s history was making alcohol. Whether it was whiskey, white lightning, moonshine, or bourbon, a substantial amount of the booze produced in North America was liquid corn—an easy way to preserve and ship a grain that weighed a lot, in a form that weighed less, sold for more, and had a ready market. (Of course, this particular transformation is still with us, just not at the stunning level it attained earlier in our history.)
Whiskey appeared in Virginia as early as 1620—just a bit more than a decade after John Smith struggled to save Jamestown by convincing the first settlers there to grow corn. A colonist named George Thorpe wrote to a cousin back in England, “Wee have found a waie to make soe good drink of Indian corne I have divers times refused to drinke good stronge English beare and chose to drinke that.”1 Of course, distilling was not a new technology, so the only experimental aspect was seeing whether corn worked in place of rye or barley. It did—splendidly.
That said, early colonists spent considerable time testing the alcohol potential of just about everything they could grow: peaches, apples, pears, apricots, the local grapes—anything that would ferment.2 Then, around 1630, something happened that would put corn liquor on the back burner for a few years: rum emerged from the sugarcane plantations of the Caribbean. Trade carried food (usually salt cod or pickled herring) to the Caribbean (where sugar growers put so much effort into producing sugar that they overlooked producing food) and brought molasses back to New England. While baked beans and Boston brown bread required some of that abundant molasses, making rum was a more popular option, and distilleries sprouted rapidly. Soon rum was the drink of choice throughout the colonies.3 But two things occurred that would bring corn liquor back to front and center: the American Revolution would make it a lot harder to get molasses from foreign sugar growers in the Caribbean, and folks were moving westward, where they were growing vastly more corn and were at a fair distance from New England’s rum distilleries. Suddenly, corn liquor seemed not merely practical, but positively patriotic. Even George Washington made corn whiskey, after he retired from politics.
It didn’t take much effort to convince colonists to drink whiskey (also spelled whisky, fairly interchangeably, early in American history). A huge influx of Scotch-Irish prior to the American Revolution had made them the country’s largest immigrant group of the period. They brought with them not only their passion for independence, but also a taste for whiskey.4 (In fact, the word whiskey came with them, too. It comes from the Gaelic uisge beatha, or “water of life”—which rather underscores their high regard for the drink.)5
The frontier is where corn whiskey really came into its own. The abundant corn being grown on the great, sprawling prairies was more than farmers or their livestock could consume, and before larger markets, easier transportation, and other applications developed, whiskey was the easiest and most popular way to store or sell the excess grain. By the early 1800s, thousands of gallons of whiskey were pouring through St. Louis and down the Mississippi. All across the opening Midwest, pioneer families made their own home brew or, if they didn’t own a still, would send a family member with a sack of corn to trade for whiskey. (The going rate was usually a gallon of liquor for a bushel of corn.) Having a jug of corn whiskey on hand became urgent if a husking bee was planned.6
In all fairness, water generally wasn’t reliably safe or clean, unless one lived by a mountain stream. And whiskey was cheap—far cheaper than tea or coffee, even when it wasn’t homemade. More importantly, alcohol, while it certainly had its recreational uses, was seen as a preventive and a cure-all, prescribed by doctors for just about every ailment imaginable—even for children.7 However, even though the idea of alcohol as medicine was fairly international, foreign visitors were startled by the tremendous amount of booze consumed by Americans. When Thomas Jefferson tried to get Americans interested in wine, he was actually partly motivated by his concern about “the poison of whiskey,” which he felt could potentially destroy people.8 (Before 1820, when the alcohol content of wine was first measured, people believed it was free of alcohol and therefore a healthful alternative to brewed or distilled beverages.)9
Despite Jefferson’s concerns, whiskey consumption kept growing—and grew dramatically as the Midwest began producing a superabundance of corn. By the mid-1800s, there were distilleries producing corn liquor all across the Midwest. However, Peoria, Illinois, was to become known as “the Whiskey Capital of the World.” The city’s first distillery opened in 1843, but the number would grow to seventy-three as the century progressed. During its seventy-six-year distilling history, Peoria produced more whiskey than any other city in history. It became the biggest consumer of corn in the world.10 Peoria also had a substantial dairy industry and a large meatpacking industry, so not all the corn went to making whiskey—but a tremendous amount of it did go to the distilleries. It is estimated that Peoria’s distilleries were turning out around 18.6 million gallons of alcohol per year.11 Peoria turned so much corn into whiskey that the alcohol taxes paid by this one city represented nearly half the income of the federal government. However, this all ended in 1919, when Prohibition went into effect.12
Progressives had long been trying to bring an end to the overconsumption of alcohol that characterized the early 1800s. The Temperance Movement had had some success, lowering per-capita consumption of hard liquor from 7.1 gallons per year in 1830 (a near-lethal amount of alcohol, and roughly triple the average consumption today) to 3.1 gallons in 1840—though consumption began to climb again after 1900.13 Progressives thought that eliminating alcohol would help eliminate a huge number of society’s ills.14 Because men did the majority of the drinking, and women had nowhere to go when husbands became drunk and abusive, women joined temperance groups in large numbers, and many of the names most familiarly associated with the suffragist movement also appeared on the rosters of the Progressives, joining in their drive to limit alcohol consumption.15
Prohibition was finally repealed in 1933, and the country went back to producing whiskey and other forms of alcohol, though never again on the scale or in the wildly unregulated manner of pre-Prohibition years. It also helped that doctors had stopped prescribing whiskey. Dr. Benjamin Rush had demonstrated that alcoholism was a disease, not a simple lack of self-control, and modern medicine disposed of the long-held beliefs that alcohol cured colds, fevers, headaches, depression, and snakebites.16
One surprising benefit of distilling is that the corn residue that remains at the end of the process (known as dried distillers grains) is great livestock feed.17 Since it seems unlikely that the country will ever again outlaw producing whiskey, it’s good to know that there is a useful way to dispose of the by-products.
Wet milling is the technique behind the most widespread and varied transformations of corn. Unlike dry milling, which yields flours, cornmeal, and corn oil, wet milling made it possible to really work magic with corn.
In 1840, an Englishman named Orlando Jones patented a process that greatly improved and sped up starch making. The following year, Jones’s process was patented in the United States. In 1842, Thomas Kingsford, then an employee at Colgate & Company, a wheat-starch manufacturer, successfully used Jones’s technique to isolate starch from corn, creating a high-quality laundry starch. Kingsford started his own cornstarch company in 1846 in Bergen, New Jersey,18 and by 1880 he was turning out thirty-five tons of starch per day.19 Others noticed his success and started their own cornstarch companies. By 1896, there were sixteen cornstarch-processing plants in the United States, producing nearly two million pounds of cornstarch per year.20
The process of creating cornstarch involved soaking the kernels in an alkali solution, much as Native Americans once boiled corn kernels with lime or lye to remove the hulls.21 The soak made it easier to separate the starch, in a remarkably pure form, from the other constituents of the grain. Because the grain was soaked before grinding, the process was called wet milling, and Thomas Kingsford became known as “the father of the corn wet-milling industry.”22
All green plants produce starch, but only a few plants are useful for commercial starch production—corn, potatoes, wheat, and tapioca being the chief sources. Among these useful plants, corn is the top source of starch worldwide.23 The corn plant is unique in its ability to convert large amounts of sunlight into concentrated, usable energy, largely in the form of starch. A dry kernel of corn is about 73 percent starch. The balance of that kernel is protein (about 19 percent), and fiber and oil (about 4 percent each).24 Of course, there is some variation among varieties of corn, but this is roughly average for the type of field corn used in wet milling.
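To put those percentages in concrete terms, here is a quick back-of-the-envelope calculation. It is only a sketch: the 56-pound bushel is the standard U.S. weight for shelled corn, and the moisture a “dry” kernel still carries is ignored; neither detail comes from the text.

```python
# A rough sketch: component weights in one bushel of field corn,
# using the approximate dry-basis percentages cited in the text.
# Assumption (not from the source): a U.S. bushel of shelled corn
# is standardized at 56 lb; residual kernel moisture is ignored.

BUSHEL_LB = 56  # standard U.S. bushel weight for shelled corn

composition = {
    "starch": 0.73,
    "protein": 0.19,
    "fiber": 0.04,
    "oil": 0.04,
}

for component, fraction in composition.items():
    print(f"{component:>8}: {BUSHEL_LB * fraction:4.1f} lb per bushel")
# starch alone comes to roughly 41 of the 56 pounds
```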
Initially, cornstarch was used for starching laundry. Its food potential wasn’t recognized until after 1850, but within a decade, cookbooks were filled with recipes using cornstarch. It was, of course, used to thicken sauces and gravies, but it was particularly popular in desserts, with cornstarch pies, cornstarch puddings, and cornstarch cakes appearing in substantial numbers. (And judging by the silken texture of cornstarch pudding, these recipes were well worth the space they took up in the cookbooks.) Of course, cornstarch still had its laundry applications (and does today, as well—it strengthens thread during weaving), and it went on to help in paper manufacturing.
Producing cornstarch became big business. Factories sprang up across the Midwest. Companies joined together and grew. Argo Corn Starch was introduced in Nebraska in 1892. Seven years later, Kingsford, the original cornstarch company and initially Argo’s biggest competitor, merged with Argo, becoming the United Starch Company.25 By the early 1900s, this company was acquired by Illinois-based Corn Products Refining Company. In 1909, Augustus Eugene Staley, who may be better remembered for starting the Chicago Bears football team than for making cornstarch, bought a cornstarch plant in Decatur, Illinois. Within a decade, he built the A. E. Staley Manufacturing Company into one of the largest corn-processing businesses in the United States.26 However, by this time the wet-milling process was creating a lot more than just cornstarch.
Wet milling had unlocked the door to a wide range of products. It created an industry that would, by the end of the twentieth century, consume between 5 and 10 percent of the world corn harvest and produce hundreds of products. Many of these products made the modern world possible, aiding in food processing, making packaged foods shelf-stable, streamlining preparation—in essence, creating what urban populations expect from food: always accessible, tasty, easy to prepare. Nearly one-fourth of the products in the modern grocery store contain something derived from wet milling of corn—for better or for worse.27
Next up after cornstarch was, essentially, broken-down starch, because starch is really just a lot of glucose molecules joined together.28 Add a bit of heat and an enzyme to starch, and, just as happens in the human body,29 starch is broken down into simple sugars.30 The first dextrose was produced from cornstarch in 1866.31 Glucose and dextrose from corn are identical to the glucose and dextrose that our bodies (and the bodies of all other mammals) use for energy.32
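In chemical shorthand, what the heat and enzyme accomplish is a hydrolysis of the starch polymer into its glucose units (the standard textbook equation, not one given in the source):

$$(\mathrm{C_6H_{10}O_5})_n + n\,\mathrm{H_2O} \;\xrightarrow{\text{heat, enzymes}}\; n\,\mathrm{C_6H_{12}O_6}$$

Each linkage in the starch chain takes up a molecule of water as it breaks, leaving free glucose (dextrose).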
Depending on how far those glucose chains are broken down, a variety of primarily glucose and dextrose sweeteners can be obtained, from corn sugar to corn syrup. However, corn sugar and corn syrup are not as sweet as the sucrose that comes from sugarcane, sugar beets, or maple sap. To make corn sweeteners sweeter, another enzyme is added, converting part of the glucose into fructose, which is about 30 percent sweeter than sucrose. When “high-fructose corn syrup” (HFCS) appears on a label, this is what has been used. However, all sweeteners have subtly different tastes, and many people prefer “real” sugar, or sucrose. So why use HFCS? Americans like sugar a lot (too much, many healthcare professionals and nutritionists suggest). The United States simply cannot grow enough sugarcane and sugar beets to meet the demand. The country imports huge amounts of sugar, but imported sugar is costlier, especially when a hurricane wipes out the crop in one of the tropical locations where sugarcane grows. Corn, on the other hand, is cheap and reliably available.33
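The conversion step works because glucose and fructose are isomers: the same atoms, arranged differently. The enzyme used industrially is glucose isomerase (a standard detail of the process, though the text does not name it):

$$\mathrm{C_6H_{12}O_6}\ (\text{glucose}) \;\xrightarrow{\text{glucose isomerase}}\; \mathrm{C_6H_{12}O_6}\ (\text{fructose})$$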
HFCS was first introduced in the late 1960s, and the majority of it went immediately into sweetening beverages, though it appears in many other sweet products.34 Lately, a debate has raged over whether HFCS is worse for you than other forms of sugar. Research at Princeton University has shown that rats gain weight faster with HFCS than they do with regular sugar.35 Other nutritional scientists have stated that HFCS does not have substantially different metabolic effects than sucrose (cane or beet sugar).36 Whichever research is proven correct in the long run, the real problem is that consuming too much of any sweetener has the potential to create health problems. Because HFCS makes soft drinks and other sweetened beverages so inexpensive, Americans are consuming a lot of them. The Center for Science in the Public Interest reports that Americans consume an average of more than 52 gallons of soda pop a year for every man, woman, and child in the country.37 Kick in all the other sweetened beverages available, and that represents a considerable amount of sweeteners—and in those quantities, it increases the risk of everything from heart disease to diabetes to obesity. Most researchers now urge a reduction in the intake of sugar calories, regardless of the form of sugar.38 (And they mean any sugar. A big glass of orange or apple juice, which is high in natural fructose, will spike one’s insulin as badly as a big glass of soda pop. Whole fruit isn’t a problem, because of the fiber, but as far as the body is concerned, a big sugar hit is a big sugar hit, no matter what type of sugar.)39 So HFCS may or may not be an issue in and of itself, but the quantity of sugar being consumed, in all its forms, is definitely a concern.
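To see what 52 gallons of soda a year means in sugar terms, a rough calculation helps. The 39 grams of sugar per 12-ounce serving is a typical figure for a non-diet cola, not a number from the source:

```python
# Rough arithmetic behind the soda statistic cited above.
# Assumption (typical, not from the source): a 12-oz non-diet soda
# contains about 39 g of sugar, and sugar supplies ~4 kcal per gram.

GALLONS_PER_YEAR = 52
FL_OZ_PER_GALLON = 128
SUGAR_G_PER_SERVING = 39   # per 12-oz serving
KCAL_PER_G_SUGAR = 4

servings = GALLONS_PER_YEAR * FL_OZ_PER_GALLON / 12      # ~555 servings
sugar_kg = servings * SUGAR_G_PER_SERVING / 1000         # ~21.6 kg of sugar
kcal_per_day = servings * SUGAR_G_PER_SERVING * KCAL_PER_G_SUGAR / 365

print(f"{servings:.0f} servings/yr, {sugar_kg:.1f} kg sugar/yr, "
      f"{kcal_per_day:.0f} kcal/day from soda alone")
```

On those assumptions, soda alone supplies well over 200 sugar calories a day, before any other sweetened food or drink is counted.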
In the late 1800s, after a few decades of making cornstarch and corn sugar, it occurred to processors that they might be able to do something with all the stuff they were throwing out—the fiber, protein, and germ from the corn. The protein and fiber in particular might be good for animals, and processors began turning them into feed in 1882. Next, processors began extracting oil from the germ.40 The germ contains all the oil produced by the corn plant.41 This was actually a rediscovery, as corn oil had been produced as a by-product of distilling for some time. In fact, the New England Farmer reported in 1829 that corn oil could be used as a medicine, in place of castor oil.42 But the wet-milling process did make corn oil easier to produce, because it made it easier to separate out the germ from the other constituents.
Theodore Hudnut of Terre Haute, Indiana, was known as “America’s Hominy King” in the late 1800s, and he was deeply involved in the advances in wet milling witnessed during this period. Hudnut and his son Benjamin patented a machine for extracting cooking oil from grain, and Hudnut’s production of corn oil (which he called mazoil) earned Terre Haute the nickname “Home of Corn Oil.”43
By the end of the 1800s, corn oil had found multiple uses in addition to cooking. It was used to make soap, paint, linoleum, creams, salves, synthetic rubber, and varnish.44 Almost all those applications still exist, and the list has only grown longer since then. Corn oil is still very popular for frying, because of its high smoke point. However, because it is relatively low in monounsaturated fat, it has lost market share to oils such as olive and canola, which offer greater health benefits. Fortunately for processors, there are still plenty of applications for which corn oil is ideal.
When reading statistics about how many foods contain corn products, one should remember that there may be nothing identifiably corny in the majority of those products—but the ingredient names will likely be familiar to anyone who reads food labels. Ascorbic acid (vitamin C) is one of the most common food additives derived from corn. Because vitamin C is an antioxidant, it’s used as a preservative in a tremendous number of products. (There are many other sources of vitamin C, but corn is the source of most commercial vitamin C—something those sensitive to corn should keep in mind.)45
Maltodextrins are added to a tremendous number of foods but are also popular with everyone from bodybuilders to chefs who practice molecular gastronomy (it’s how they make crazy things like powdered duck fat). Maltodextrin is a fine, soft white powder often used as filler in artificial sweeteners and pharmaceuticals. Maltodextrins help things dissolve quickly and are used in many instant, “just add water” foods.46
It’s worth remembering that any starch can be broken down the same way cornstarch is. It’s just that in the United States, cornstarch is the most abundant starch. (Also worth noting is that, while all maltodextrin in the United States is, by law, produced from corn, that is not the case in other countries. Since it can also be made from wheat or barley, anyone with allergies should use caution when eating foods containing maltodextrin in other parts of the world.)47
The list goes on. Unfortunately for anyone allergic to corn, corn derivatives created by the wet-milling process are in a lot of processed foods. For those not allergic, the many derivatives make life a bit easier and considerably cheaper.
The Victorian era saw a rise to prominence of a few people whose names would be recognized today, though folks might not automatically associate those names with important health-food movements. Reverend Sylvester Graham, Dr. John Harvey Kellogg, and Charles W. Post were, in their day, cutting-edge health gurus who would have a huge impact on the way Americans eat. In the 1830s, Graham, a vegetarian, called for a return to what he called natural living. He felt that diet and morality were interconnected and that consumption of highly spiced foods, rich pastries, and meat would lead to sexual improprieties. He may not have gotten that quite right, but he also promoted the idea that more whole grains would help reduce indigestion, and that has proven to be true (though he got the reason wrong; he thought white bread was too nutritious for the body to process, and a simpler, rougher bread would therefore be easier to digest). He created coarse, high-fiber “Graham bread” from the whole-wheat Graham flour he promoted. (Graham crackers, though named for Reverend Graham, do not actually mirror what Graham was trying to get people to eat.)48
Kellogg, who received his M.D. in 1875, liked Graham’s ideas. Kellogg was also a vegetarian, as well as a Seventh-day Adventist. In 1876, he became superintendent of a health facility in Michigan that would become known as the Battle Creek Sanitarium.49 The whole-grain focus led Kellogg to look for new and different ways to feed grains to the people who flocked to the sanitarium. (The imposing sanitarium was quite famous, and quite fashionable; the client list included Amelia Earhart, John D. Rockefeller, and Teddy Roosevelt.) Shortly after creating a breakfast cereal out of wheat, Kellogg, along with his younger brother, Will Keith Kellogg, developed corn flakes and rice flakes. These were fed to patients at Battle Creek, and the story might have ended there, except that a former patient of the sanitarium, and soon-to-be rival of Kellogg’s, Charles W. Post, began marketing his own cereal.50

Post, who grew up in Illinois, started his food business in Battle Creek, Michigan, in 1895. In 1897, he developed Grape-Nuts, and Post Toasties came out in 1904. The idea of dry, ready-to-eat cereal for breakfast was a new concept, and it was Post’s genius for advertising (along with his fervently held belief in the ability of whole grains to cure everything) that turned breakfast cereal into an industry.51

In response, John and Will Keith Kellogg started their own cereal company in 1900, calling it the Sanitas Food Company. They began marketing the grain flakes they had developed together and that had been so popular at the sanitarium. In 1906, Will bought out his brother’s share in the company and started the Kellogg Toasted Corn Flake Company. In addition to using corn, the company promoted the improvement of corn crops. In 1909, the “W. K. Kellogg National Corn Trophy” was introduced, to be given to the farmer who submitted the best ear of corn at the annual National Corn Show. By 1922, because the company had begun making cereals from other grains in addition to corn, the name was changed to the Kellogg Company.52 When the Great Depression hit, Will Kellogg pulled out all the stops for the company’s advertising. With money tight and ads convincing, Americans abandoned the hot cooked breakfast in favor of cold cereal.53
While other grains are flaked, most flaked breakfast cereals are made from corn. The process is sufficiently complex that it makes one wonder how early proponents viewed these as natural foods. The corn is cooked under pressure until it becomes a large lump. The lump is broken down and sent to the driers, which reduce the moisture to about 20 percent. This substance is then tempered for twenty-four hours, to make certain moisture is evenly distributed. The product is then passed between large steel cylinders, which break it into flakes. The soft flakes are then transferred to toasting ovens, where the flakes are heated at 550°F for two or three minutes. This completes the dehydration of the corn, toasts it, and slightly blisters it. This, at last, is what Americans would recognize as cornflakes.54
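For readers who want the sequence at a glance, here is the flaking process from the paragraph above laid out as a simple ordered checklist in code. It is purely schematic: the temperatures, times, and moisture targets are the ones the text cites, while the step names and structure are illustrative.

```python
# The cornflake process described above, as an ordered checklist.
# Parameters come from the text; the structure is illustrative only.

FLAKING_STEPS = [
    ("pressure-cook", "cook the corn under pressure until it forms a large lump"),
    ("break and dry", "break down the lump and dry it to about 20% moisture"),
    ("temper", "rest for 24 hours so the moisture distributes evenly"),
    ("flake", "pass between large steel cylinders to break into flakes"),
    ("toast", "heat at 550°F for 2-3 minutes to dehydrate, toast, and blister"),
]

for number, (name, description) in enumerate(FLAKING_STEPS, start=1):
    print(f"{number}. {name}: {description}")
```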
Most of the other corn-based foods in the crunchy category are found in the snack aisle. Of course, there’s popcorn, but that’s not really transformed, just heated till it explodes. Corn nuts hearken back to the parched corn that Native Americans and settlers ate before popcorn was introduced. Just about everything else is a new spin on a process that is almost as old as corn itself: pound it into paste, form it into the desired shape, and cook it. The difference is largely in the tools used to accomplish these tasks.
The simplest thing to do with the paste is to flatten and bake it. That’s what Native Americans did for thousands of years before Europeans arrived, and it’s what Mexicans do today, at home, in restaurants, and in burgeoning tortilla businesses.
In the 1930s, a gentleman named Charles Elmer Doolin, inspired by a Mexican he met at a gas station who was frying bits of flattened masa,55 started a business that today pretty much rules the snack category in the United States.56 Knowing that the Spanish word for “fried” is frito makes it relatively easy to guess what Mr. Doolin created. Doolin paid the man $100 for the recipe (which was a considerable sum during the Great Depression)57 and went home and began cooking, though it was his mom who perfected the recipe. Not satisfied with simply having a good recipe, Doolin hybridized his own corn, to make sure the flavor and quality would be consistent. Interestingly, Doolin was, like the men responsible for breakfast cereal, a vegetarian following a strict diet regimen, though his was imposed by a healer of the day, rather than being an approach he devised. He always thought Fritos would be a healthy whole-grain side dish to be served as part of a meal and consumed in small amounts, rather than a snack food.58
In an innovative marketing move, Doolin opened Casa de Fritos at Disneyland, shortly after the theme park opened in 1955. Casa de Fritos served a range of Mexican dishes, plus their own innovation, the “tacup”—a cup made of the fried masa used in Fritos, packed with classic taco fillings. Of course, every meal ordered came with a bag of Fritos. It was a brilliant maneuver on Doolin’s part, because over the nearly three decades that Casa de Fritos was at Disneyland, millions of people were introduced to Fritos. Soon, Fritos were selling nationwide. (And, apparently, Doritos were also born at Casa de Fritos.)59
Granted, Doolin was from Texas, not the Midwest, and that’s where Frito-Lay is still based, but his part in kick-starting the market for corn snacks (other than popcorn) would have an impact on the Midwest. The other key corn-snack innovation came out of Beloit, Wisconsin. Clair B. Matthews, an agronomist; Harry W. Adams, an attorney; and E. E. Berry, an engineer, had formed the Flakall Corporation in 1932 to market a machine that cooked animal feed to make it more digestible. During the Great Depression and World War II, they sold flaked rabbit feed. But one day, a machine operator named Ed Wilson was using cracked corn to clean the machine, and he noticed that the machine was spitting out sticks of puffed, cooked corn. In one of those moments of inspiration that most might miss, Wilson scooped up the sticks and took them home to his wife, who fried them. Add a little salt, and a snack food is born. Since it was Wisconsin, adding a bit of cheese seemed like a natural move. And so were born Korn Kurls™. World War II kept things on hold for a while, but in 1946, Harry Adams and his two sons formed the Adams Corporation and began producing and marketing the new puffy, fried corn snack.60
The process accidentally discovered in Wisconsin is known technically as extrusion—and it is not limited to food. Extrusion is simply the shaping of a doughlike material by forcing it through a restricted space or a die. The technical name for a puffy curl of corn produced by this process is “collet.” There are two basic types of collets: baked and fried. If a collet will be baked, maximum puffiness is achieved at a moisture level of about 13 percent; collets to be fried need about 20 percent moisture before they go into the oil.61
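As a tiny illustration, the two moisture targets mentioned above could be encoded like this. The 13 and 20 percent figures come from the text; the function itself is hypothetical, not an industry API:

```python
# Approximate pre-finish moisture targets for extruded corn collets,
# per the figures cited above. The function is illustrative only.

def target_moisture(finish: str) -> float:
    """Return the approximate moisture fraction for a collet finish."""
    targets = {"baked": 0.13, "fried": 0.20}
    if finish not in targets:
        raise ValueError(f"unknown finish: {finish!r}")
    return targets[finish]

print(target_moisture("baked"))  # 0.13 -> maximum puffiness when baked
print(target_moisture("fried"))  # 0.20 -> needed before frying in oil
```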
That was hardly the end of inventiveness. Minnesota-based General Mills, which had been the first food processor to use an extruder in making ready-to-eat cereal,62 introduced Bugles in 1966.63 More flavors and shapes of corn snacks emerged. Increased interest in ethnic dining in the late twentieth century led to a surge in the number of manufacturers selling tortilla chips. Crunchy, salty, transformed corn had become a key force in the snack-food marketplace.
Only food-related corn products have been discussed up to this point, but it is worth noting, if only briefly, that there are, and always have been, myriad nonfood uses for corn. Corncob pipes are still around. Cornhusks are still used in folk art and for decorative items, even if few people rely on them to fabricate dolls for children (unless they’re at one of the lovely living-history farms that bring back earlier days). However, though people no longer stuff mattresses with cornhusks, insulate houses with cornstalks, or fuel stoves with corncobs, nonfood uses of corn are actually increasing in number, as industries have scrambled to find raw materials that are not only less expensive, but also more ecologically sensible.
Cornstalks have been made into paper and wallboard. Husks can be used as filling material. Corn resins can be made into almost anything, from cutlery to wristwatches—all of it biodegradable.64 Skateboards are being made out of a new material known as CornBoard™.65 And who hasn’t encountered the poufy packing material that is called popcorn and actually is made from corn? Charcoal, cosmetics, adhesives, and vastly more can be made from corn. The range of possible “green” products based on corn seems to be limited only by the imagination (and probably the funding) of those working to find new ways to both use corn and protect the environment.
And, of course, there are ethanol and biodiesel. These are not recent ideas. Henry Ford built his Model T, released in 1908, to run on ethanol (though Ford’s world-changing auto could also run on gasoline, or a blend of ethanol and gasoline). Poised at the very beginning of the country’s love affair with cars, Ford was already focusing on the future, and he felt certain that ethanol was the automobile’s destiny—though he hoped there would be a wide range of options for creating it, noting that fruit, weeds, and sawdust, as well as corn, were all capable of being fermented into alcohol.66 University of Illinois corn researcher Dr. Stephen Moose notes that using yeast and starch to create alcohol long predates Ford’s vision of future fuels; people have been drinking the results of that technology for thousands of years. However, whether using corn for fuel production is the wisest option is still under discussion—and some have begun to say that government-mandated corn-based ethanol production is actually bad environmental policy.67 Consumers, environmentalists, corn growers, and politicians all have a stake in the outcome of this debate.