Butter in the Interwar Period
In the winter of 1911, a group of New York women launched a protest movement. The issue that motivated them was not suffrage, prohibition, or child labor. Rather, they were concerned about access to and quality of a household essential: butter. When commission houses hiked prices of the savory staple, the women refused to purchase even a pat. Within days, butter marketers dropped their prices. In the wake of the successful boycott, thousands of women nationwide joined the group, which called itself the Housewives’ League.1
The price of butter may appear a peculiar instigator of a protest. But Americans in the 1910s and 1920s demanded butter, and lots of it. Whether they spread it on toast, used it to fry pancakes, or baked it in pies, between 1910 and 1940, Americans regularly ate more than eighteen pounds of butter per capita, well over four times the recent consumption rates.2 In 1918, one magazine recommended that a household consume as many pounds of butter per week as there were members of the family.3 For women who wanted to provision their tables with the best food and to rein in the grocery bill at a time of escalating prices, butter was of weighty importance.
A staple for American eaters, butter was also central to the farm economy. Many more rural families churned butter on the farm or supplied cream to butter factories than sold milk for drinking, because butter factories did not require farmers to meet sanitary requirements as exacting, or transit to market as quick, as the fluid milk trade demanded. By the 1930s, more milk was channeled into butter than to any other dairy food.4 Hence, butter prices inspired farm people, as well as consumers, to organize.5 High butter prices, the target of consumer cost-of-living protests in 1911, were the goal of farm families ten years later.
At first glance, the politics of butter pricing seem to confirm historians’ assessment of the interwar period as a time of tense relations between country and city. During the 1920s, immigration, prohibition, and religion pitted cosmopolitan urbanites against rural fundamentalists. Improved roads, telephones, and new media—like advertising and radio—brought urban values and cultural forms to rural audiences, challenging local institutions and sensibilities.6 But even as rural people voiced objections to urban mores, they sought out ways to reconcile urban industrialism with rural life. By 1920, more Americans were urban than rural, and rural people, especially youth, shunned the country and sought opportunities elsewhere. The nation’s economy, once grounded in agriculture and raw materials, shifted to become capital-intensive and consumer-centered. Fearful of population drain, farm experts urged producers to change the nature of the farm to fit the demands of the standardized market. By the 1930s, farm producers increasingly believed that they needed to accommodate farming practices to meet the dictates of the consumer-driven age.
That even humble butter could be part of this push toward mass-marketing and standardization might have been difficult for early twentieth-century Americans to fathom. Before the 1920s, butter was manufactured largely on farms or by local creameries. The weight, color, quality, and volume of butter sent to market varied widely. Seasonal fluctuations in cows’ diets, the ways individual farmers balanced their farm work, and the whims of the buttermaker meant that the body and flavor of butter differed in nearly every tub. Wholesale commission agents graded butter in Philadelphia and New York before selling it to retailers, but only a few buttermakers paid keen attention to the standards on which butter was judged. Many creameries even lacked scales to weigh their product before sending it to market.7 Butter marketing was personal and localized. Farm kids delivered butter churned by their mothers to town residents on their way to school, and women peddled it directly to consumers at curbside farmers’ markets.8 For most of the farm families supplying the cream trade, milking cows was a seasonal sideline, not a main income stream. Cream sellers raised beef cattle, hogs, horses, and chickens. Many also sold fruits, potatoes, sugar beets, hops, or tobacco. Often, cream-selling farmers valued dairy cows as much for the manure, calves, and skim milk they produced as for their cream. Whether sold from the farm directly or alongside other farm products, cream and butter offered farm families a measure of economic security in a volatile agricultural economy dominated by crops like cotton, tobacco, corn, and wheat.
During the interwar period, farm people and buttermakers changed the ways they made and marketed butter. By 1920, buttermaking largely shifted from farms to factories, called creameries. The largest of these factories, centralizer creameries, operated at the intersection of railroad lines.9 Smaller, crossroads creameries flourished in the amply populated upper Midwest, where dairy farming thrived but farmers were too distant from cities to supply urban fluid milk markets.10 By sending cream to the factory to be churned, farm people could lighten the labor of dairy work and benefit from the expertise of a skilled buttermaker.
The rise of the chain store also altered the butter trade. By 1923, nearly a quarter of the nation’s food retailers were chain stores.11 Driven by centralized management directives, chain stores purchased goods directly from manufacturers and rewarded those who could deliver a large volume of high-quality butter. Chain store managers wanted consumers to be able to enter a store in San Francisco, Spartanburg, or Schenectady and obtain the same kind of product. Centralizer creameries, because of the large volume of butter they produced, were well positioned to take advantage of the new retail environment. Local creameries could compete with centralizers if they adhered to a new standard of purity: one that promised a better flavor, longer-keeping quality, uniform color, and homogeneous texture. Chain stores also encouraged butter purchasers to use new ways to judge butter quality. Once attentive to the tub with the most appealing shade of yellow or the endorsement of the grocer, consumers came to choose butter based on labels carrying the imprint of state inspection and nutritional claims.
The most profound shift in the butter trade was resetting the priorities of the farm people who supplied it. Dairying experts asked small farmers to stop viewing their cows largely as generators of products to be used on the farm—like manure to fertilize soil or calves to increase the herd—and instead to value cows for the cream they produced. Local creameries expected farmers to be more fastidious in caring for cream and feeding animals to maximize production. Increasingly, success in farming required both a careful understanding of the potential of one’s farm and market savvy about the national and regional demands for farm products. Rather than viewing interactions with regional and national markets as undermining autonomy, farm people saw the vitality of the local farm economy as dependent on these interchanges.
Contemporary rhetoric neatly divides local, small, subsistence farmers from nationalized, industrial ones. But a closer look at the historical record of buttermaking in the interwar period reveals ways that local farming and industrialized mass production often overlapped and coexisted. The experiences of Elihu and Lena Gifford, who farmed near Camden, in Oneida County, New York, were in many ways typical of those engaged in the cream trade. Throughout the 1910s and 1920s, the Giffords kept a herd of twenty to forty Jersey cows, a breed that yielded milk with high fat content and thus was perfect for a farm selling cream. Compared to other cream-producing farms, the Giffords’ herd was large, but cream still constituted only a fraction of the farm income. Most winters, the Gifford family produced a hundred pounds of sausage from hogs they killed on the farm, slaughtered beef cattle for their own use and to sell to neighbors, and dressed scores of chickens for local grocers. The Gifford family also raised other livestock, not least of which were horses that aided in much of the farm work. In spring and summer, they delivered sweet corn to a nearby cannery, kept a garden, and raised oats, hay, corn, and beets for livestock fodder. Offering a variety of products for market helped balance the volatility of agriculture and steadied the income stream on the farm.12
Local food enthusiasts might be surprised by how many national ties kept farms like the Giffords’ afloat. When Elihu and Lena Gifford ventured to the local store with farm-raised sausage, hams, eggs, and beef, they returned to the farm with pineapples, bananas, oranges, and lemons. The Giffords fed livestock farm-raised alfalfa hay, oats, corn silage, and beets, but also supplied them with oil meal and cottonseed meal shipped from St. Louis and points farther south. As Mr. Gifford sold cream and butter to nearby restaurants and local residents, he also traveled to New York City, Utica, and Syracuse to establish cream contracts with greater geographic reach. Supplementing local feed rations with cottonseed meal boosted production while keeping feed costs low. Securing a more lucrative cream contract in New York City guaranteed a higher return for one’s labor than one might receive locally. The ledgers of farms like the Giffords’ illustrate a careful balance between markets driven by neighbors and those driven by national trends, and between local resources and inputs purchased from afar.13
The rise of centralizer creameries provides one of the most vivid examples of the coexistence of small-scale farming with the tenets of the industrial economy in the 1910s and 1920s. A new machine—the hand-cranked cream separator—fueled the shift in butter production from farm to factory. Before the twentieth century, most buttermakers relied on gravity to carry out the first step in making butter, which is separating whole milk into cream and skim milk. They poured milk into shallow pans, placed the pans in cool water, and waited for cream to rise. Then, they used a round disk to remove the cream layer from the skim milk.14 In the 1870s, Danish and Swedish inventors devised implements to separate whole milk into cream and skim milk more efficiently. As whole milk whirled in a bowl, centrifugal force separated skim milk from the lighter, insoluble cream. The spinning motion pushed the heaviest parts of the milk to the edges of the bowl, while the lightest cream remained in the center. Once separated, skim milk and cream flowed from separate outlets. Butter factory operators adopted centrifugal separators because they could operate at any temperature and could control the richness of the cream they yielded.15
In the 1890s, a new kind of centrifugal cream separator came on the market: a hand-cranked centrifugal separator for farm use. Soon, cream separator companies, such as De Laval, Sharples, and Burrell, aggressively marketed them to rural audiences. The separator manufacturers peddled their wares at county, state, and world’s fairs, canvassed the countryside, and advertised in all the farm magazines.16 Dairying families responded quickly. When the De Laval separator salesmen visited the Gifford farm in 1910, for instance, they found the family already had a separator.17 By 1930, creamery expert Otto Hunziker explains, “nearly every family in the dairy belt had a farm separator.”18
The adoption of hand-cranked cream separators in the 1910s and 1920s made it easier than ever for farm families to supply butter factories. Historically, farms that engaged in the butter trade were more geographically distant from urban markets than those that specialized in the fluid milk trade because fluid milk producers needed milk to reach the city before it perished. Even creameries were reluctant to operate in sparsely populated areas, for the potential volume of raw material was too small to justify the operating costs. The cost of hauling whole milk to the factory exceeded potential returns from cream sales. The farm separator changed the economic calculus. It enabled farmers with just a few cows to begin marketing their surplus butterfat. With its adoption, factory butter production extended into the most rural regions of the nation, including the Great Plains and the South. Large creameries took advantage of these small cream sellers who sent separated cream by rail to the factory.19 By collecting hand-separated cream from far-flung places, centralizer creameries incorporated small-scale farmers into the national economy. Specialized modern processing supported small-scale mixed farming.
Despite the general trend toward factory production, in some rural districts, home-produced butter still commanded an important sector of the rural market. One 1915 study, for instance, found that nearly 90 percent of farmers in northern Minnesota made their own butter, including those who also sold cream to local butter factories. Locally made butter was consumed in the home but was also traded to storekeepers for credit. Most grocers and general store owners acquired more butter in these small transactions than could be used locally. Storekeepers sent surplus butter via merchants to centralizer creameries, where it would be remade and sold to urban consumers as “renovated” butter. To renovate, centralizers melted the products of hundreds of small country stores, combining butters of varying quality and age. After removing the whey and other impurities from the butter oil that remained, manufacturers blended the oil with skim milk or buttermilk so that the final product would have a similar appearance, consistency, and flavor to genuine butter.20 Renovating butter allowed a poor-quality product—homemade country butter—to find a market, relieving the pressure on country storekeepers who accepted such butter in trade. In transactions like this one, the line between local and industrial, homemade and corporate became especially difficult to draw.
Figure 2.1 The cream separator enabled farm people to deliver cream to market and use skim milk on the farm for animal feeding. Keeping the separator clean was essential to the production of wholesome and sanitary butter but was a toilsome task. Wisconsin Historical Society, WHi-9964.
The farm separator helped farm families enter the butter trade, but it provided no guarantee that the cream they sent to be churned into butter would be wholesome and high-quality. Despite manufacturers’ promises, mechanically separated cream often reached the factory dirty and spoiled. Interwar Americans seeking to improve cream attributed its poor quality to both the factories making butter and the farmers who supplied them. Some, foreshadowing present-day critiques of factory farming, blamed poor cream on the large size of centralizer creameries and the anonymity they afforded their farmer suppliers. To these critics, the scale of modern processing imperiled food quality. Others saw the seasonal and unspecialized nature of the farms delivering cream as the primary obstacle to purity and called for greater focus on dairy work and professionalization of buttermaking.
The factors to which observers traced cream’s impurities and the solutions they suggested carried political implications. Efforts to purify cream took place at a time when many rural Americans sought to counter the growing power of agricultural processors by forming their own organizations to store or process agricultural goods. Many local creameries were cooperatives, owned jointly by the farmers who supplied them (called patrons).21 Promoters of agricultural cooperatives believed that by selling the product of their labor in common, farm people could revive the economic vitality of the countryside and maintain a measure of autonomy. When cooperative proponents linked impure cream with the centralizer system, they cast aspersions on their competitor and critiqued the broader economic system of which it was a part. Similarly, when big butter factories derided “country butter,” they characterized rural creameries as ill-run, backward-looking facilities.
Although the rhetoric of local creameries and centralizers pitted them against one another, all butter factories faced many of the same challenges in obtaining good cream. It mattered little whether a centralizer or local cooperative churned the cream into butter if a heat wave melted the ice used to keep cream cool, or farm families skimped on the time devoted to sanitizing dairy equipment. Since many farmers sometimes sold cream to centralizers and sometimes to cooperatives, depending on the price they could obtain, causes of impure cream rooted on the farm affected centralizers and cooperatives alike. Poor cream resulted from a mix of natural, technological, and social variables. Some natural factors, like bacterial spoilage, could imperil cream, but other environmental variables, such as cool spring waters, could be harnessed to maintain its purity. Similarly, modern scientific and technological interventions could overcome the natural variability of milk but could also accelerate problems of spoilage by intensifying competition between creameries. To get pure butter, buttermakers had to balance natural processes and technological interventions.
One source of unsanitary cream was unwashed or poorly sanitized cream separators. The same intricate separator parts that channeled cream to the center of the bowl collected slimy milk residue. Without careful sanitation, vestiges of milk, and the germs it carried, remained. The task of separating milk, then, was not complete until farm people ran lukewarm water through the machine, disassembled the separator, washed each individual part, and sanitized them by scalding with hot water. Tiresome enough by itself, cleaning the separator also often required pumping and heating water. This task was almost always women’s work. Farm women remembered years later how much they loathed cleaning the separator. Isabel Baumann, who grew up on a farm near Stoughton, Wisconsin, recalled, “My one job that I hated so much was to wash the separator, especially in the summertime.… I vowed that one day I’d never have to wash the separator and by gosh, I get married, and move up here there’s the separator.” Clara Scott, from Grant County, Wisconsin, also explained, “separators … were a terrible thing to wash. All them discs and everything.”22 If even a few women shirked the meticulous sanitation process, they imperiled the safety of the cream sent to market.
The site where cream separation took place could also affect the purity of the product. Many who marketed cream practiced mixed farming, and such farmers valued the skim milk coming from the separator as much as the cream. Skim milk could be fed to chickens, hogs, and calves, and it enlarged the profitability of the farm’s livestock because it could replace some of the high-cost protein-rich grains or supplements purchased for animal feed. Some farmers were so concerned with the quality of skim milk for animal feeding that they separated milk in the hog barn. This practice made feeding hogs convenient but offended milk inspectors’ notions of cleanliness. In 1913 and 1914, milk inspectors in Wisconsin issued citations to farmers who kept their milk separators and cream in unclean parts of the barn, for the cream arrived at the creamery warm, full of particles of dirt, hay, and manure, and tainted with a barny flavor.23
Cream purity also suffered because health officials prioritized farm inspection of sites producing milk for drinking over those supplying cream to the butter trade. Food-borne illnesses from butter were less frequent than those traced to spoiled or contaminated milk. Furthermore, public health experts could identify and monitor suppliers of fluid milk more easily than cream producers. The sheer numbers of farmers that sent surplus cream to creameries, combined with the geographic isolation of these farms, made regulation difficult. Some farmers sold cream for butter only seasonally or sporadically, presenting an even greater challenge to health inspectors. Because dairying was merely a sideline operation on many cream-producing farms, such farmers were reluctant to invest in improved facilities for conducting dairy work. Most cream-producing farms were only visited by inspectors once a year, if at all.24 In combination, lax regulation, mixed farming, and lazy separator sanitation could imperil the cream supply of cooperative and centralizer creameries alike.
Cream destined for centralizer creameries did travel longer distances and with less frequency than cream carried to neighborhood cooperative creameries by wagon. Since many of the farm families supplying centralizer creameries had just a cow or two, it sometimes took them a week before they gathered enough cream to merit a trip to the cream station. As the cream sat waiting to be delivered, it lacked proper cooling and sanitation, especially in the summer months. Failing to cool cream completely before adding it to the can of already-collected cream could result in a musty, smothered flavor. Uncovered cream cans also attracted flies. A 1935 FDA investigation found maggots in eight of the fifty cans of cream delivered to one southern creamery.25 Transportation delays degraded cream quality even further. By the time cream arrived at centralizer creameries by rail, it had traveled long distances. Sometimes cream sat unrefrigerated at the railroad station before being loaded and delivered to the butter factory. Aaron Ihde, who worked at one of the largest centralizer creamery companies from 1931 to 1938, described the cream that arrived at Blue Valley Creamery in Chicago:
Cream that was very sour sometimes had developed off flavors, sometimes had even accumulated insects and rodents. In those cases Blue Valley dumped the stuff and sent the farmer a note and usually lost them as a customer.… But every so often the rat didn’t appear until it was dumped into a big vat.26
Blue Valley Creamery was not an isolated case. Floyd Lucia, who worked as a buttermaker for the Beatrice Creamery Company in Indiana during the early 1920s, noted that cream sometimes arrived at its factories nearly rancid.27
Critics of the large centralizer creameries blamed the long path from farm to market for discouraging rural people from being careful with their cream. In the centralizer system, the staff at the local cream station, not trained buttermakers, had the most contact with farmers. Drop-off stations for centralizer creameries were located at railroad stations, garages, gas stations, blacksmith shops, and private houses. The mechanics or railroad operators who staffed the stations were rarely trained in dairy sanitation. Cream station operators sought to increase the volume of cream sent, not to ensure its quality.28
At local creameries, buttermakers had regular contact with those who delivered milk, could advise them on problems with their cream, and warn them when cream did not meet expectations.29 And yet even local buttermakers who had been trained in the principles of buttermaking and who enjoyed direct contact with farm patrons could lapse in their insistence on high-quality cream. While local buttermakers could instill sanitary practices and enforce a high purity standard, patrons exerted influence on buttermakers’ actions, sometimes discouraging the buttermaker from doing so.30 As G. L. McKay and C. Larsen noted in their 1922 textbook Principles and Practice of Butter-making:
The question as to where the line should be drawn between the good, medium, and very bad milk or cream, must depend upon the judgment of the receiver, and in a great measure upon the local conditions.31
Some local patrons resisted buttermakers’ requests for pure cream. M. P. Mortenson, a field manager for Land O’Lakes, noted that at the Rice Creamery, “the Operator has always said plainly that the patrons would not stand for grading very closely.”32 In such cases, buttermakers worried that if they insisted on higher-quality cream, their patrons would take their cream someplace else and convince their neighbors to do the same. Farmers disenchanted with buttermakers’ actions had ample opportunity to air their complaints and gain like-minded allies as farmers congregated outside the local creamery at delivery time. Patrons’ pressure had especially strong influence in the late 1910s and early 1920s, when competition among creameries was at its peak.
By the mid-1910s, in specialized dairy districts, a growing number of creameries and cream stations competed to obtain farm families’ cream.33 Facing stiff competition from centralizers, buttermakers were especially reluctant to hold farm people to high sanitary standards. W. W. Clark, the agricultural extension agent in Houston County, Minnesota, noted that only one of the twelve creameries operating in the county did not directly compete with the territory of a centralized creamery.34 As creameries struggled to get enough cream to keep their factories operating, buttermakers worried as much about getting enough cream as they did about obtaining high-quality cream. Some creameries even based buttermakers’ salaries on the number of patrons they attracted, so it is not surprising that some buttermakers accepted cream of lesser quality. The success of small cooperative creameries rested on amicable relationships between the buttermaker and the farm people who supplied them.35 Thus, although hand-cranked separators quickened the pace of cream separation and enlarged the scale of butter manufacture, they intensified problems with crafting a sanitary, standard product.36
Creameries’ efforts to get good cream became even more difficult by the 1920s and 1930s, as motor trucks and the improvement of country roads shortened travel times in the countryside. In theory, such technological developments facilitated quick cream delivery.37 But like the cream separator, the automobile had mixed effects on butter quality. As automobiles quickened the pace of transport, crossroads creameries that once collected cream only from nearby farmers extended their reach. After the arrival of the automobile, crossroads creameries faced pressure not just from centralizer creameries, but from other small factories once thought too distant to be a convenient market.38 Even as farm separators and motor trucks sped the trip from farm to factory, socially agreed-upon standards of purity commanded great power.
If buttermakers were to produce butter of consistently high quality and sanitation, they needed to improve the quality of cream they received. Buttermakers did not seek to eliminate all bacteria from milk; in fact, most used bacterial starter cultures to “ripen” cream before churning. The problem was not with sour milk per se but with cream that had soured from abnormal ferments instead of beneficial bacteria. Cream that arrived at the creamery clean and uncontaminated could ripen and produce good-flavored butter, but poor cream carried bacteria that marred its flavor. The type and amount of acidity in cream varied with the age of the cream, the kind of feed cows ate, the phase of a cow’s lactation, and the season of the year.39
To overcome this natural variability, factory managers proposed two kinds of solutions. The first, called neutralization, allowed creamery operators to control sour cream at the factory. Buttermakers tested the acidity of the cream they received and then added a lime-water mixture in proportion to that acidity. The more sour the cream had become, the more lime water had to be incorporated to bring the acidity back in check. Buttermakers aimed for a target acidity of 0.25% before churning the cream into butter.40 Often buttermakers used cream neutralization and cream ripening in tandem to direct the souring process. Cream might arrive at the creamery sour, be neutralized to reduce its acidity, pasteurized, and then soured again with bacterial cultures before being churned into butter.41
A second technique, taken up by increasing numbers of buttermakers in the 1920s, was to make butter from sweet cream—cream with a lactic acid content of 0.25% or less. Sweet cream butter often received a higher score from butter testers and commanded a higher price.42 Some consumers preferred its milder taste.43 Sweet cream butter’s most important attribute was its keeping quality. Butter made from ripened cream deteriorated more quickly than that made from sweet cream and developed fishy, oily, and metallic flavors over time. As nearly 25 percent of the nation’s butter was stored to be sold during the winter months, a long shelf life was important.44
The debate over how to overcome problems with spoiled cream paralleled earlier discussions about how best to purify milk. Like pasteurization, neutralization was a technical fix to be carried out by the dairy manufacturer. Like tuberculin testing, using sweet cream for butter production placed the onus of cream improvement on dairy farm families. Proponents of neutralization, like supporters of pasteurization, argued that it was cost-efficient and eliminated waste.45 Those who opposed neutralization, like those who opposed pasteurization, worried that the process would discourage improvements to the sanitary conditions in which cream was produced. If neutralization enabled soured cream to be made into salable butter, what incentive would farm families have to keep cream cool and clean on the farm?46
The method a creamery utilized to overcome spoilage and define cream’s purity—whether it neutralized cream or obtained it sweet—indicated how it balanced the interests of farm families supplying the butter trade and the imperatives of an industrializing economy. Neutralized butter stood as a symbol of big business, because centralizer creameries pioneered the technique.47 Small creameries, with their more direct contact with farm patrons, led the way in making sweet cream butter. Small creameries hoped that selling sweet cream butter would distinguish their product from that of centralizers on the national market and enhance farmers’ profits.48
Those that produced sweet cream butter worked to tilt marketplace regulations in their favor, proposing legislation that distinguished sweet cream butter from competitors’ neutralized variety. One reform was to require that butter made from neutralized cream be labeled. The hope was that consumers would come to view neutralized butter as unhealthy or inferior to butter made from sweet cream.49 Sweet cream butter promoters also argued that neutralized butter was adulterated. Drawing on a 1902 federal law that established a tax on adulterated butter, they contended that neutralized butter should be taxed.50 In 1920, the Commissioner of Internal Revenue declared that butter made from cream high in acid (including cream that was neutralized) was adulterated butter and therefore subject to the tax.51 The tax, at the rate of ten cents per pound, provoked a quick response from creamery manufacturers that practiced neutralization and was reversed in September 1921.52
As federal regulations waned, state food and dairy inspectors aided sweet cream butter proponents’ efforts. At a hearing on the treasury ruling, Minnesota’s State Dairy and Food Commissioner defended the measure that classified neutralized butter as adulterated.53 In Michigan, creameries that demonstrated that they could meet high sanitary standards carried a special state label and sold butter at a higher price. The Dairy and Food department provided interested consumers with a list of the names and addresses of creameries using the state brand.
Despite winning state sanction, the efforts of small cooperatives to distinguish sweet cream butter from its counterparts had limited impact. James Helme, the former commissioner of the Michigan State Food and Dairy Department who devised the state brand for butter, acknowledged that “it may take some time before Michigan State brand butter is known in the markets of the great cities.”54 In neighboring Wisconsin, when 306 housewives were asked “what does sweet-cream butter mean to you?” in 1932, half of the respondents had no answer. Some women identified sweet cream butter as high-quality butter, but a few commented that sweet cream butter had less flavor, a flat taste, and too pale a color. Two women even contended that “you can’t make butter from sweet cream.”55 As centralizer and local creameries fought over cream neutralization, consumers—even in a dairying state—lacked a clear understanding of what the terms sweet cream and neutralized cream butter even meant.
At first glance, neutralization seems to offer the most dramatic example of dairy industrialization during this period. Endorsed by large centralizer creameries, neutralization accommodated the goal of cream purity to the economy of scale. But manufacturing sweet cream butter was just as revolutionary and far-reaching a transformation of the butter trade. Implementing it required changes to established rhythms of farm work, new networks of product distribution, and different ways of thinking about the nature of the dairy farm. Like neutralization, the shift to sweet cream butter was a modern phenomenon.
Creameries that sought to make sweet cream butter had to convince three audiences of its benefits: farmers supplying cream, state agencies that monitored the boundaries of purity, and consumers who purchased butter. The most profound change was convincing farmers to devote to dairy work the time that selling sweet cream required. Most farm families conceived of cream as merely one element of a multifaceted farm income stream, not as a central economic engine. Many farmers spent more keeping a cow than they received for the butterfat it produced each year. They valued cows not simply by how much the animals earned in cash receipts from cream sales, but also for the skim milk, manure, and dairy calves they produced. Such farmers did not operate without regard to cash revenues; rather, they valued the cash garnered from selling corn fertilized with manure, or hogs that had been fed skim milk, both raised with the by-products of dairying, as much as the cash from cream itself.
To reformers, this method of dairying had limitations. As A. J. McGuire, the general manager of Land O’Lakes Creameries, noted in a 1919 speech to Minnesota farmers, “Dairying in this way [for manure and skim milk] pays because it keeps up the fertility of the land, but the average young farmer will not become very enthusiastic over his dairy work unless he can be shown that there is something more than skim milk and manure for his labor in dairying.”56
The “more” to which McGuire alluded was farm profit; if farmers reorganized their farms to maximize cream production, McGuire believed, they could rely on dairy cows’ cream, not just their by-products, as a profit-builder. He emphasized the need for cash profits to keep the young farmer satisfied. McGuire delivered his speech at a time when rural people and policymakers were especially concerned about keeping young women and men from leaving the farm. Only by making the farm more akin to the world of industry, with regular forty-hour weeks, modern machinery, and a steady paycheck, he argued, could rural areas keep the youth eager to convert their labor into the goods of consumer culture.57 Farmers could partake fully in the industrial system, McGuire believed, but only if all of the goods they produced garnered high prices in the national market.
In the 1910s and early 1920s, only a few farmers conceived of the dairy work of their farms in the same way as McGuire. Many farmers were willing to feed cows at an economic loss for their manure and skim milk. Manure restored nitrogen, phosphorus, and moisture to soils without requiring outlays for commercial fertilizer. Thus, even after synthetic fertilizers became available, farmers looked upon manure favorably as a home-grown soil improver.58 Collecting dairy cows’ manure frequently generated higher yields of farm crops and improved barnyard sanitation.59 Indeed, hauling manure was one of the farm tasks that Elihu Gifford recorded most frequently in his diary, second only to cutting wood. (Ordinary daily tasks, such as feeding livestock or milking cows, did not even merit mention in the diary’s pages.) Gifford hauled manure most frequently in fall, winter, and spring, when cows were being fed in the barnyard and when caring for crops didn’t consume most of his time.60 Similarly, skim milk boosted farmers’ return from hogs and calves. Gifford valued skim milk so much that in 1911 he purchased eight cans of skim milk at ten cents each.61 Wisconsin farmers Margaret and Rangnar Segerstrom likewise remembered raising “really nice pigs and calves” on skim milk from the cream separator.62 Whereas farm people like Gifford and the Segerstroms saw dairy work as tightly intertwined with other farm operations, experts sought to loosen its connections to hog- and crop-raising and elevate the place of dairy work on the farm.
A second element of dairy experts’ market-driven reform effort was to encourage farm families to even out the seasonality of milk production with a balanced ration so they could take advantage of higher off-season milk and cream prices. Generally, cows gave the most milk during the late spring and early summer, when they grazed on pasture grasses and calved. Cows thrived on well-tended pasture grass, for it was high in nutrients, succulent, and tasty. But the seasonal increase in milk production that grazing encouraged, rooted in the ecology of the seasons, had an acute economic impact on milk prices. During the flush summer months, milk and cream prices fell with the abundance of milk produced, while milk prices increased in the winter due to the paucity of supply. The seasonality of milk production also proved frustrating for many farmers, making the burden of dairy work most intense in the summer, when crop work also demanded their attention.
Farm experts wanted farmers to provide cows a ration to encourage high milk production year-round. Ideally, greater attention to animal feeding would also more equally distribute the burden of dairy work throughout the year. Dairying experts’ feeding recommendations, rooted in market demands for a reliable and consistent quantity of cream for manufacture, introduced different uses of pastures and mixes of field crops to local farms. The recommendations provide a key example of the way that agricultural industrialization reordered ecological, as well as economic, relationships.
Dairy rations were composed of two parts: roughage (hays, straw, silage, or corn stover) and concentrates (grains or meals). Concentrates were usually the most expensive part of the ration. Farmers could rarely grow enough grain to feed cattle and had to purchase it from feed dealers. The more nutrients cows received from hay or silage, the fewer high-priced concentrates farmers would need to supply. Thus, farmers sought feed rations that minimized the need to buy expensive concentrates. Franklin Pope Wilson, who operated a farm in Loudoun County, Virginia, for instance, wrote to the United States Department of Agriculture’s dairy division in 1923 with hopes of devising a ration with a greater proportion of home-raised feeds. “With the present high prices of concentrates … bran, cottonseed meal, and mixed feeds, and the low price of wheat, which we raise and of which we have a supply on hand, I would like to know whether we can use to advantage some of this low priced wheat to mix with crushed corn, which we also raise, and with a more limited amount of the high priced bran and cottonseed meal which we have to buy.”63 Dairying experts’ most frequent advice to farmers was to improve the quality of the roughage they fed, especially by furnishing cows with corn silage and high-protein hays.64
Corn silage was one roughage that dairying experts promoted. Cows relished silage, made from the cut-up stems and leaves of corn plants, because of its succulence. Dairying experts liked it because it extended the milking season and boosted milk production.65 Fed alongside hay, silage whetted cows’ appetite for other foods and provided them with extra nutrients for milk production.66 Storing silage enabled farmers to have an ample reserve of fodder for winter months and also for midsummer, when pasture grasses dried up and failed to provide a good ration.67 A filled silo reduced the risk of price fluctuations or intemperate weather by ensuring farmers a ready part of cows’ rations.
Corn had a long history as a livestock feed, but its use as silage was still relatively new to dairying in the early twentieth century. Some feared its effects on animal health, while others worried about silage flavors in milk.68 In 1917, only approximately 11 percent of dairy farms had silos.69 To take advantage of the risk-reducing and nutrient-improving potential that silage provided, farm families had to carry out a number of additional tasks: to construct a silo, to cut silage, and to fill the silo each September, in addition to raising a corn crop. When Elihu Gifford constructed a new silo in 1916, its assembly consumed much of the spring. The parts of the silo arrived by train in February, and he drew them in multiple sleigh-loads between the train car and the farm. Then, in early June, he dug a foundation and hewed scaffold poles in preparation to build the silo. With the help of three neighbors, Gifford and his brother began construction. Not until a month later was the thirty-three-foot silo fully tiled, roofed, and shingled.70 Gifford, who enjoyed community members’ assistance in building his silo, helped his neighbors to fill their silos each fall in return. Gifford or one of his sons brought their engine to power a neighbor’s silage cutter, which sped the process. A silo on a farm signaled the farm’s orientation to the tenets of industrial farming; farms equipped with silos yielded, more efficiently, greater quantities of milk that could be channeled to a national market. But the neighborhood labor that constructed and filled silos each year is a reminder of the continued local character of industrializing dairy farms in the 1910s and 1920s.
Alfalfa and clover hay were other feedstuffs that buttermakers thought would make for a more profitable cream trade. Whereas corn provided cows with digestible carbohydrates, clover hay and alfalfa hay were both high in protein. Together, the crops balanced out the ration and minimized the need for protein-rich concentrates.71 Red clover had additional virtues. It added nitrogen to the soil and was hardy enough to thrive in all kinds of weather. As Arthur McGuire told a gathering in 1929, “We would like to have every Land O’Lakes Creamery patron know and get the value of sweet clover pasture. It is without question twice the value of any other pasture. It comes on earlier in the spring, grows faster, withstands dry weather, and grows until the ground freezes in the fall.”72
Finally, dairy reformers urged farm people to take better care of cream. Carrying out a higher standard of cleanliness on the farm, they promised, would reap economic rewards through higher prices for sweet cream butter. They asked farmers to keep cream cool and to deliver it frequently. By the late 1920s, electric refrigerators came on the market, but not until the late 1930s would many rural districts acquire electric lines to power cream coolers.73 Until then, farmers harnessed the powers of wind, ice, and cool streams to chill cream. The most common method was to submerge cream cans in a tank of cold water. In Minnesota, Land O’Lakes encouraged farmers to construct windmills to pump water directly into cream-cooling tanks. North Dakota dairy experts experimented with ice well refrigeration.74 North Carolina farm families submerged cream cans in a barrel and set them in nearby springs to keep the cream cool.75
Modernizing creamery managers also asked farmers to deliver cream frequently or early in the day, especially in the summer, since the heat accelerated spoilage.76 The Hillpoint Creamery Association, for instance, stated in its bylaws that cream be delivered to the creamery by 8 a.m. between May and October, and by 9 a.m. the rest of the year.77 Most Wisconsin creameries surveyed in 1932 received cream three times a week in the summer and twice a week in the winter, but some required more frequent deliveries. Members of the Land O’Lakes creamery cooperative had to deliver cream four times a week during the summer and three in winter, and some creameries even called for daily deliveries during hot weather.78 Earlier summertime deliveries cut spoilage by minimizing time from cow to creamery and also gave buttermakers more time to handle the greater volume of cream produced during the flush season of May, June, and July.
Creameries had a particularly tough job convincing farmers to take on the additional work to keep cream sweet. For farm people, frequent cream delivery meant spending more time on dairy work during the busy season of planting and cultivating corn, mowing hay, and cutting oats. To encourage farmers’ efforts, buttermakers graded cream at the factory and rejected cream that they deemed unfit for making butter, while paying a premium for the highest-quality product.79 In 1921, for instance, the Minnesota Cooperative Creameries Association began paying patrons an extra five cents for each pound of sweet cream they delivered.80 But only a minority of creameries practiced cream grading. Only 28.6 percent of Wisconsin creameries surveyed in 1932, for instance, graded cream.81 Of those who did, fewer still paid a differential rate that rewarded the best cream. More plants cut the price paid for the poorest cream received, thereby eliminating the worst cream and improving quality.82
When looking for evidence of industrialization, historians generally home in on the adoption of new technologies or the expanding scale of farm operations. But the signs of farmers’ incorporation into a modern, industrialized butter market were often more subtle: a cement cream-cooling tank filled with well-water or a field seeded in alfalfa hay. Such small-scale farm improvements indicated the power of a new national standard of purity to reform farm practice and a newfound desire to see dairy cows not just as producers of by-products to be utilized on the farm but as creators of cream profits. The added labor such improvements entailed, farmers hoped, would be compensated by earnings in an expanded market. Even as they aimed to purify the cream they churned in different ways than centralizers did, operators of small local creameries were driven by the same aim of achieving a consistent, uniform product. Whether produced by local factories or highly capitalized centralizers, mixed from neutralized cream or sweet, butter was a product of nature, human technologies, ideas, and labor.
As dairying experts worked to alter the ways that farm people carried out their work, new practices of retailing and manufacturing prompted changes in the way people purchased butter. Until the 1910s and 1920s, many consumers bought butter in bulk, with the assistance of a shopkeeper. Grocers carried different kinds of butter in wooden tubs in refrigerated cabinets behind the store counter. The tubs usually bore some description of the place from which the butter came, though rarely the specific creamery. When a consumer came to purchase butter, the retailer scooped out a specified quantity from a butter tub, weighed it, and wrapped it for the customer to take home. Consumers relied on price, color, and storekeepers’ recommendations to decide which butter to buy; once home, when they tasted its flavor, shoppers decided whether the grocer’s word could be trusted in future purchases.83
By the interwar period, consumers increasingly bought packaged butter from the refrigerated case of a chain store. Suggestions of state-backed nutritionists informed purchasing decisions alongside grocers’ words and product labels. As in the past, consumers knew butter by tasting and smelling it, but their perceptions of the food were also mediated through the labels it carried, the sites where they consumed it, and state policies that governed its price.84 Changing practices of butter buying, like new forms of butter manufacture, carried implications for how people learned and drew the boundaries of pure and impure food.
Advertisements and food labels told consumers that butter’s purity originated in one of two places: the pastoral landscape or the modern factory. Butter appeared either as an industrial product or a product of nature, but not one of an industrialized nature. Claims about butter’s purity simultaneously obscured the processes of change transforming the countryside and the environmental connections inherent in the mass-produced food. In political debates about how to distinguish butter from oleomargarine, however, a different vision of butter surfaced, one that betrayed that its composition had both technological and natural origins.
As cooperatives and centralizer creameries battled over issues like neutralization, the primary concern of most consumers was butter’s price. In the 1910s, food prices spiked. The average retail price of butter, just under forty cents in 1913, rose to nearly seventy cents per pound by 1920.85 Many consumers blamed inefficiencies in the commodity chain between farm and market.86 They felt that wholesalers and retailers unfairly raised the prices of household staples and that more efficient methods of distribution might eliminate these costs. Grocers received special condemnation. In the years before self-service, consumers depended on grocers to monitor the quality and measure the quantity of the butter they received. If a shopper found that the butter provided was overly salty or tasted moldy, she held the grocer partly responsible for insufficiently screening the quality of the goods he sold. Bargain-conscious shoppers also worried that grocers weighed butter improperly, so that purchasers received less than their money’s worth.87 Consumers’ dissatisfaction with full-service grocers in a time of rapid inflation created the context for the introduction of branded, packaged butter.
Like consumers, farm-based cream sellers and small creameries expressed frustrations about the wholesalers and distributors to whom they sold their cream. Each transaction between farm and market cut into the profits they received for the good. They resented paying transportation charges and worried that wholesalers marketing and storing their butter might not care for it properly. In some cases, consumers’ and farmers’ dissatisfaction with existing modes of food distribution yielded direct farm-to-market sales.88 But the geography of butter production limited the utility of farmers’ markets. Many creameries and cream sellers were located hundreds or even thousands of miles from the people who purchased their goods.
Rather than turning to direct marketing, butter buyers and cream sellers embraced the newest model of food marketing: the self-service chain store. The hope was that self-service grocery chains, by buying in quantity and cutting costs on service, would deliver goods more efficiently, allowing consumers to pay a lower price for a quality good and limiting the intermediate charges that diminished farmers’ earnings. Chain store purchasing appeared to make butter buyers more independent and self-sufficient and less dependent on grocers’ advice. In fact, it replaced the grocer’s role with that of advertisers and nutritional experts, enlarging the power and influence of retailers in guiding butter’s path from farm to market.
Self-service food chains minimized the role of wholesalers. Instead, representatives of the chain obtained butter directly from the manufacturer. Creameries, including small cooperatives, signed up for chain store contracts because they believed that direct contracts would bring cost savings. As Philadelphia’s American Stores Company promised Wisconsin’s Milltown Co-op Creamery in 1920, “We deduct no commission, no cartage, nor any other incidental charges except freight.” Working with a chain meant access to a steady, year-round market for small-scale creamery patrons. To meet the quantity demanded by their contract with American Stores and to save in shipping rates, managers of the Milltown creamery teamed up with creameries in Luck, Frederick, Centuria, and Dresser Junction, Wisconsin, before shipping carloads of butter en masse to Philadelphia and New York.89
As it circumvented wholesalers, the self-service chain also minimized the role of the grocer as mediator of each butter-purchasing transaction. Instead, consumers were to determine the merits of a product by relying on so-called silent salesmen: packages, advertisements, and product displays.90 Proponents of the new sales method told business leaders that packaging made a sales pitch more effectively than people did. In the words of one 1928 packaging guide, “The manufacturer cannot expect the average retail clerk … to put up a very strong selling talk.” A well-designed package, by contrast, provided manufacturers with control over the messages delivered about their product and reduced labor costs.91
Proponents of packaging also claimed that labels enabled butter manufacturers to differentiate their product from others offered in the market. As the Paterson Parchment Paper Company explained to buttermakers in 1915, by stamping one’s name on the butter package, buttermakers could “create a demand with the housewife, so that when she comes to purchase butter she will look for, ask for, and insist upon having his particular brand.” The company continued, “It is worth a great deal to you to have a characteristic package that cannot be forgotten.”92 In time, the Paterson Company’s prediction rang true. One 1931 survey found respondents could name forty-eight different brands of butter.93
Before brand recognition could take hold, advertisers had to convince purchasers to select packaged rather than bulk butter. Butter advertisers tapped into consumers’ dissatisfaction with full-service grocers to promote the packaged good. An advertisement for Wedgewood Creamery butter alluded to the potential for grocers to cheat the scales by stressing that its package carried a “full sixteen ounces.”94 Packaged butter ads also suggested that bulk goods posed sanitary hazards. A 1911 Meadow Gold advertisement, for instance, noted that because its butter was wrapped in extra layers of packaging, it remained free from the grocery-store odors and woody flavors that tub butter absorbed.95 In 1930, Land O’Lakes published a similar advertisement, contending that bulk butter was a “menace to health.”96 Criticisms of bulk butter faulted retailers, not manufacturers, for impure or unclean butter.
By contrast, butter labels and advertisements praised the sites where butter originated: pasturelands and butter factories. Multicolored labels depicted scenic vistas and grazing cows. Creameries created brand names with natural connotations. In 1924, the Minnesota Cooperative Creameries Association adopted the name Land O’Lakes, linking its butter to the streams and lakes of the place that produced it.97 Even butter factories that constituted a part of urban meat-packing operations adopted natural-sounding names that belied their modernity; Armour called its butter “Cloverbloom” and Swift used the name “Brookfield.”98
Figure 2.2 The first step to branding butter was to discredit bulk butter as an unsanitary food. N. W. Ayer Advertising Agency Records, Archives Center, National Museum of American History, Smithsonian Institution.
The nature to which butter labels appealed was a timeless one. Advertisements depicted clear streams and grazing cows, not newly constructed cooling tanks or silos. The labels acknowledged that butter derived from herds of Jerseys and Holsteins, but made no mention of the farm people who milked the cows, separated the cream, or delivered it to the creamery. The Land O’Lakes label made an Indian maiden its central figure. By associating butter with these images, creameries assured consumers that even as the world around them changed, their butter remained authentic and simple.
At the same time as they praised pastoral landscapes, butter labels highlighted creameries as modern manufacturers. Meadow Gold told potential purchasers that each churning was “analyzed by skilled chemists.” Advertisements emphasized that machinery, not human hands, prepared it for sale. Fairmont Creamery promised that each employee donned a “freshly laundered uniform each morning” and guaranteed that at each stage “expert specialists” oversaw creamery processes, using scientific techniques to ensure “uniform body, texture and quality.”99 Emphasizing butter’s industrial origins, these ads credited human expertise and technology for ensuring butter’s quality and minimized the seasonal variability or generative powers of nature on which cream production depended.
Figure 2.3 The vision of butter’s purity expressed in the 1920s underscored the food’s natural origins and the expertise of those who churned it. Note the emphasis to consumers on getting the full weight in a packaged good. N. W. Ayer Advertising Agency Records, Archives Center, National Museum of American History, Smithsonian Institution.
In truth, butter was not wholly natural or technological, nor was its purity rooted simply in its pastoral origins or industrial safeguards. At every stage, butter was a hybrid of natural and cultural influences, from the moment that farm people led a cow to a bull for breeding to the time that consumers’ bodies converted the dairy fat into energy. Advertisements juxtaposed representations of the natural pasture and the technological creamery without comment. But political debates about butter’s place in the national diet brought the tensions such representations suggested to the fore.
Even as the rise of chain store retailers encouraged creameries to begin to brand and differentiate their butter from that of other manufacturers, what concerned many in the butter industry was not competition from other buttermakers, but competition posed to their product by oleomargarine. Dairy farm families and butter industry officials had lobbied against margarine since the 1880s, and their efforts had successfully created a federal margarine tax. But by the interwar period, competition between butter and margarine intensified as butter prices increased and new technologies altered the way the fats were made. Dairy interests consistently appealed to the nature of butter and disparaged margarine as an artificial food. Despite these claims, both butter and margarine were mixed products of environmental forces, technological interventions, and social mores.
One of the most important new developments in the competition between butter and margarine in the interwar period was the discovery of vitamins in foods. Vitamin research transformed the way in which Americans evaluated foods’ health value. Once feared for its role in spreading communicable diseases, food was newly lauded for its beneficial properties. Further, foods like fruits and vegetables, discounted in a model of nutrition that emphasized high calories and low costs, came to be appreciated as “protective foods” that delivered essential nutrients.100 As late as 1905, nutritionists with the United States Department of Agriculture advised consumers to purchase foods that provided the greatest amount of calories, fat, and protein for the lowest cost.101 But by the late 1910s and 1920s, research indicated that certain fats and proteins carried beneficial trace nutrients, called vitamins, that warded off diseases like rickets, scurvy, and pellagra. The proper ratio of carbohydrates, proteins, and fats alone would not guarantee good health. Rather, a mixed diet of vitamin-rich foods was important. Getting enough vitamins required careful food selection, preparation, and consumption.
Butter manufacturers followed vitamin research closely and benefited from the publicity it generated.102 When animal feeding experiments found that butter, unlike other fats, contained vitamin A, researchers suggested that butter was nutritionally superior to other fats.103 Home economists, whose profession grew and matured in the 1910s and 1920s, disseminated nutritional advice based on vitamin research to a wide audience. Public health officers also publicized findings about vitamins, enlarging their role as food inspectors and encouraging children to eat vitamin-rich foods like butter. Aided by public health officers and nutritionists, butter manufacturers advised consumers that the fat in butter was healthful by its nature.
Vitamin research articulated the relationship between nature and health more narrowly than the pastoral imagery used to evoke purity. It established the specific pathways by which nature’s nutrients entered human bodies and measured the material composition of foods. By the mid-1930s, for instance, researchers knew that cows fed roughage high in carotene—such as fresh alfalfa, carrots, or fresh green Kentucky bluegrass—converted the carotene pigments to vitamin A and secreted it in larger quantities in their milk.105 The more directly a cow’s diet came from nature, the higher the vitamin content of the butter produced from its milk. Rather than replace aesthetic appeals to nature with nutrition-based claims entirely, advertisers clothed old notions of the health of the country with the new legitimacy afforded by vitamin research. Fairmont Creamery Company, for instance, reminded consumers that “Fairmont’s Better Butter Brings the Vital Food Element Vitamins from Fields of Clover and Alfalfa to your Table.”106 By tying its butter to clover and alfalfa fields, Fairmont was able to evoke the pastoral landscape and draw on scientific studies in one fell swoop. Even with new nutritional knowledge, the most basic advertising claim employed by buttermakers remained: butter was healthy by its nature.
In 1938, when scientists fortified margarine with vitamin A, they undercut the inherent nutritional advantage butter held. As vitamin fortification made margarine nutritionally equivalent to butter, dairy industry officials turned to well-worn rhetoric about nature to bolster butter’s standing. Whereas fortified margarine’s vitamin richness was counterfeit and artificial, they claimed, butter’s nutritional content was “natural.”107 To the surprise and dismay of the dairy industry, nutritionists expressed few qualms about recommending vitamin-fortified margarine as a healthful alternative to butter.108 A 1941 study by the American Medical Association’s Council on Foods and Nutrition, for instance, concluded that “there is no scientific evidence to show that the use of fortified oleomargarine in an average adult diet would lead to nutritional difficulties.”109 The Committee on Food and Nutrition of the National Research Council similarly determined that “present available scientific evidence indicates that when fortified margarine is used in place of butter as a source of fat in a mixed diet no nutritional differences can be observed.”110 By 1942, 99 percent of all margarine was fortified with vitamin A.111 The language of vitamin research, once used to describe butter’s natural properties in modern terms, ultimately encouraged scientists and consumers to recommend a replacement.
Dairy industry officials invoked the connections between nature and health in a second element of the butter-margarine debate: disputes over butter’s color. State and federal legislation regulating the sale of oleomargarine turned on whether margarine manufacturers should be allowed to color their product in the same golden shade as butter. In the 1880s, numerous states passed laws prohibiting the coloring of margarine. New Hampshire, Vermont, and South Dakota even required that oleomargarine be colored pink so it would not be confused with butter. Color became the basis for federal restrictions on oleomargarine sales in 1902. In that year, a congressional act known as the Grout Bill increased the tax on “artificially colored” oleomargarine to ten cents per pound but lowered the tax on uncolored oleomargarine to one-fourth of one cent.112
Congressional laws taxing artificially colored oleomargarine and the dairy lobbyists who encouraged their passage drew a clear line between nature and artifice. According to butter proponents, butter’s golden color was authentically natural—derived from the sun and the pasture grasses on which cows grazed. Butter’s golden hue indicated its freshness and richness, a reminder of butter’s pastoral origins.113 As George McKay, a spokesman for the American Creamery Butter Manufacturers, declared in 1918, “In selling butter we might say we are selling air, sunshine, and rain.”114 Meanwhile, dairy lobbyists stated that the addition of yellow dyes made margarine artificial. They poked fun at margarine by claiming its origins were unnatural; while butter was the product of pastoral places, margarine came from an aberrant creature—a coconut cow.115
The line between nature and artifice was trickier, however, than the rhetorical stance of dairy lobbyists suggested. Butter’s color was not, in fact, only derived from nature. During the winter months, when cows were fed on silage and dried hay instead of fresh grasses, their milk paled. Hence, butter manufacturers—like margarine makers—added yellow dyes to make butter’s color richer.116 Moreover, margarine’s color was not as artificial as rhetoric implied. In fact, by the late 1920s and 1930s, margarine manufacturers relied not on dyes, but on tropical oils to lend margarine a yellow tint. Combined with coconut oil, palm oil gave margarine a golden hue. Palm and coconut trees were, of course, natural species. Despite the natural origins of tropical oils, dairy lobbyists contended that margarine manufactured from them was artificial. According to lobbyists, the conscious selection of naturally colored yellow fats to make margarine was a manipulation of nature—a process “tantamount to using the product commonly known as artificial color.”117
In November 1930, however, U.S. Commissioner of Internal Revenue David Burnet redrew the line between artifice and nature. Burnet ruled that margarine containing unbleached palm oil would no longer be classified as artificially colored, since its color came from a natural substance. This reclassification meant that margarine derived from palm oil would be taxed at only one-quarter cent per pound, not ten cents.118 The dairy industry responded quickly. In January 1931, Vermont Congressman Elbert S. Brigham introduced a bill calling for any colored margarine—whether colored with dyes or with naturally occurring yellow fats—to be taxed at ten cents per pound. Brigham’s proposal left the tax on uncolored margarine low but restored the provisions previously established under the Grout Bill to all colored margarine, rendering Commissioner Burnet’s ruling moot. Brigham’s bill became law on March 4, 1931.119 Still, by recognizing palm oil as a natural source of color, Burnet’s 1930 ruling had forced dairy interests to reconsider their descriptions of butter as a natural food and oleomargarine as an artificial one.
Despite these changes, imagery of nature continued to figure into the butter versus oleomargarine battle. Dairy interests played up the idyllic nature of the dairy farm and cast suspicion on palm and coconut oil by emphasizing the foreign labor and primitive processes used to produce it. On one side was the pastoral imagery of a herd of cows near a babbling brook in a springtime pasture. On the other, a tropical forest in a faraway place, where foreigners harvested coconuts and palm fruits. Tennessee Congressman Ewin Lamar Davis argued, “This is a contest between wholesome butter produced from American cows by American citizens on the one hand, and oleomargarines composed of the palm oil of Java and Sumatra and the coconut of the Pacific and South Sea Islands.” Pennsylvania’s Congressman Franklin Menges told of palm fruit chaff being “tramped by bare-footed natives” to extract the oil from which margarine was made.120 By contrasting tropical natives in the palm oil trade to American dairy farmers, a group whose racial makeup was predominantly white, Menges and Davis tied butter’s purity to racial ideologies.121 Their critique also cast butter as a more technologically modern product than margarine, for barefoot palm fruit-trampers were a far cry from the expert, white-coated creamery workers highlighted in butter ads. For Menges and Davis, as for creamery manufacturers who labeled butter with images of verdant pastures and up-to-date equipment, butter’s purity rested in both its natural origins and its modern transformation.
Support from southerners, who had previously aligned themselves with the oleomargarine industry, was critical to the passage of the 1931 Brigham Bill. Southern support for the oleomargarine industry had rested on the defense of staple crops important to the South: cotton and peanuts. Peanut and cottonseed oils constituted some of margarine’s key ingredients. As dairying developed in southern states, representatives from Mississippi, Tennessee, and Texas became more inclined to back their states’ burgeoning dairy industries. But the bill’s success also rested on the changing blend of oils used to manufacture oleomargarine. The Brigham Bill targeted palm and coconut oils for regulation, not the cottonseed, soy, or peanut oils grown domestically. The growth of the southern dairy industry, combined with the rising share of tropical oils used for oleomargarine, brought southern agricultural interests in line with dairy lobbyists to protect butter in the market.
National congressional support for butter was short-lived. In 1933, scientists mastered the process of hydrogenating fats, making it possible to use domestic products—such as cottonseed and soybean oils—as the main ingredients for margarine. Thus, dairy interests could no longer cast margarine as a foreign product. Congressional representatives from southern and even midwestern states who had aligned with dairy interests to pass the Brigham Bill became loyal once more to the soybean, livestock, and cotton producers in their own districts.122 Once again, a new means of blending the nature and technology of food altered the position of butter vis-à-vis margarine.
Ultimately, World War II, not new technologies or taxation policies, changed American food habits regarding butter. At first, the war seemed a boon to dairy producers. Military contracts and lend-lease exports ensured creameries a market. Americans employed in wartime industries had a greater appetite for dairy foods and the ability to purchase them. By 1942, however, policymakers worried that the country lacked enough fats and oils to supply wartime needs. The fight in the Pacific cut off access to imported oils. Such fats fueled the stomachs of soldiers and civilians and were used to make paint, soap, and propellant powders.123 As wartime planning dominated the policy agenda, even pats of butter atop a stack of shortcakes seemed essential to the nation’s defense.
In the early years of the war, as butter became more scarce, consumers went out of their way to obtain it. In February 1943, Portsmouth, New Hampshire, residents lined up in the pouring rain to await the arrival of a large shipment of butter at the local First National Store. Wartime diaries note butter purchases jubilantly, making clear that such purchases were not commonplace. As one resident wrote, “Ma got a whole lb of butter! First whole lb for a long time!”124 Even as politicians and nutritionists cast margarine in more positive terms due to its inclusion of domestically produced ingredients and its enhanced nutritional content, consumers still preferred butter.
Facing scarcities of fats and oils, on March 23, 1943, the Office of Price Administration began rationing butter and margarine to prevent inflationary pricing. Rationing tightened the competition between butter and margarine, because consumers had to redeem more ration points to purchase a pound of butter than a pound of margarine. Yet, at first, consumers held to their preference for butter over margarine. Some citizens even cared enough about butter to complain to government officials when they learned of shipments of butter being sent for lend-lease purposes, voicing resentment that foreigners would receive American-made butter, when they themselves had not eaten butter for months.125
As butter became scarce, many consumers tried margarine for the first time. Portsmouth, New Hampshire, resident Louise Grant recorded her first use of margarine in an August 1943 diary entry, writing “Colored & salted & sugared some oleo & it was alright. Butter is so scarce & takes 10 points to get a pound that you can’t afford to use it on corn, etc.” By the end of September 1943, butter had become a rare treat, rather than a household staple, at the Grant household. “Butter is going up to 16 points and that would be a whole week’s points for one person. We haven’t bot [sic] any for a long while.” By October 7, 1943, she wrote, “No butter of course but then we don’t use it now.”126 This family’s transformation in food practices from butter to margarine was paralleled around the country. If consumers purchased butter, they might have to forgo meat, whereas they could purchase margarine and keep putting meat on their tables. Hence, consumers who had once spurned margarine as a low-class food came to accept it, finding the cheaper alternative to butter satisfying enough to spread on their toast or to coat their vegetables. Consumers’ shift from butter to margarine during the wartime period illustrates the ways that state policy, as well as advertising rhetoric, mediated consumers’ purchasing practices.
Wartime milk shortages, not simply those of fats, made federal policy on butter subject to policymakers’ scrutiny. By 1943, New York, Washington, D.C., and San Francisco faced temporary fluid milk shortages. Margarine supporters argued that by replacing butter with margarine, dairies could channel their products to fluid milk, rather than separating milk to make butter. In a House Agricultural Committee hearing in 1943, Texas U.S. Representative W. R. Poage remarked, “Is not the sensible thing for us to do during a period of time when we have a physical shortage of the products of the cow to replace that one product that can be replaced with a vegetable oil rather than attempt to convert the fluid milk into a product for which we have a palatable and suitable substitute?”127 Policymakers considered milk irreplaceable, but they increasingly defined butter as a manufactured food for which there were good alternatives. By the mid-1940s, many consumers, retailers, and nutritionists, drawing on the notion of equivalent food values, saw butter and margarine as interchangeable.
The logic of nutritional substitution, however, obscured the fact that butter and margarine originated from very different rural environments. Poage’s suggestion that milk be channeled toward drinking rather than butter production overlooked the distinctions between farms selling cream and those marketing fluid milk. Most farm families who supplied butter factories could not easily convert to fluid milk production. They were too far from cities to market whole milk and usually lacked the cooling equipment and licenses to do so. A dairy could shift to supplying milk for drinking only insofar as the farms supplying the milk company were equipped with the labor and technology to meet stringent sanitary standards. These tight connections between dairy manufacturers and their rural suppliers made a seemingly simple substitution of one raw material for another much more complex than policymakers suggested. As butter’s exclusive claims to nutritional richness and natural purity melted away, creameries and the producers who supplied them faced new challenges.
By the 1940s and 1950s, the dairy industry mounted a vociferous defense of butter. To modern eyes, the documents from this period can seem laughable. Kitschy pro-butter advertisements glorified the 1950s kitchen. Some observers even decried oleomargarine as a communist plot or a tool of Satan.128 Viewed in the context of changes to dairy manufacturing in the 1920s and 1930s, however, these zany pro-butter documents take on a new light. The farm people and creamery manufacturers who doggedly defended butter did so because, in the 1920s and 1930s, they had become devoted to cream production for profit rather than treating it simply as a sideline of general farming. The very same pro-butter proponents who appear in the 1950s and 1960s to be agricultural fundamentalists, committed to an industry and food of the past, had been at the forefront of modernizing the dairy industry just decades earlier. The margarine-butter battles of the 1950s and 1960s indicate more than simple economic self-interest. They also reveal growing disillusionment among those who had hoped that tighter links between the goods of the country and the urban marketplace would bring a brighter agrarian future.
It is ironic that butter producers sought to distinguish their product as an authentically natural food and to characterize margarine as an artificial, factory-produced one. By the interwar period, butter producers were just as reliant as margarine makers on industrial techniques and mass-marketing strategies to sell their food. Specialist buttermakers standardized butter quality by practicing neutralization or by being exacting about the kind of cream their plants accepted, and they employed artificial dyes to give butter a uniform golden hue throughout the year. Whether farmers sent cream to centralizer creameries or to neighborhood cooperative creameries, the butter trade operated at a greater scale and geographic reach than ever before. Processes of industrialization reached back to the farm itself, where farm people altered their work patterns to deliver sweet cream, erected new technologies (like silos), and planted different forage crops to prolong milk production. Modernization of the butter trade brought business-minded efficiency to the barnyard.
Farm people’s and buttermakers’ increasing reliance on technologies hardly indicated that environmental factors no longer affected butter production. Rather, the widespread use of technological processes came about precisely because environmental factors posed perpetual challenges to the food’s commodification. Neutralization overcame the natural variability of cream. Stored silage moderated the seasonality of milk production. Added butter color imbued the dairy fat with a deeper hue in the winter months. Neither wholly natural nor wholly artificial, butter was a mix of human technologies and environmental forces.
As the butter trade blended nature and technology, it incorporated small-scale dairy farm laborers into networks of national and global commerce. Whereas present-day rhetoric stresses the existence of two agricultural systems—one local and self-sustaining and the other industrial and large-scale—cream-supplying farmers of the interwar period operated on the local and national levels simultaneously. The centralized factory system of butter manufacture provided a market to the smallest, least efficient cream sellers. Locally rooted social and ecological networks sustained and fostered engagement in a national economy. Neighbors joined together to haul scientifically balanced feed supplements shipped from afar as well as to harvest locally produced hay. Rather than seeing the rise of national manufacturing and a mass market as forces that eroded local rural economic vitality, many farm families viewed engagement with national markets as a path toward (not away from) economic independence in the 1910s and 1920s.
As farm people and creamery manufacturers relied on new technologies to transform the material nature of butter, they articulated an ideal of food purity that drew upon both industrial and environmental values. Advertisers stressed either the food’s origins on the pasture or its expert completion in the factory, even though achieving pure and flavorful butter depended just as much on intermediary processes such as cream storage on the farm or sufficient sanitation of cream separators. Definitions of butter purity that centered on middle steps on the commodity chain—such as “sweet cream butter” and “neutralized butter”—failed to resonate with consumers. Even though food purity remained important for those manufacturing butter, the emergence of new ways of thinking about butter, such as nutritionists’ emphasis on vitamins and state policy that established ration points, encouraged consumers to consider butter’s nutritional contents and its price, not just its purity.
In 1911, butter was considered such an essential, irreplaceable, nutritious staple food that a hike in the butter price inspired a protest movement. By the late 1940s, butter was increasingly thought of as one of a few comparable alternatives for enriching the table. The same kind of women who had once defended access to butter as a citizens’ right—consumer advocates and home economists—testified in the 1943 and 1949 congressional hearings to encourage an end to the federal tax on oleomargarine. Despite butter’s well-established place in American cookery and rhetoric that consistently trumpeted butter’s natural advantages, margarine benefited from the development of vitamin fortification, new consumer practices during World War II, and the elimination of federal taxes on nondairy fats. State policy, particularly on the federal level, facilitated consumers’ changing attitudes about the dairy food.
Mass-marketed, packaged, standardized butter was just one product of the industrialization of the dairy business. The same factories that made dairy foods like butter and cheese generated a stream of cast-off effluent in the form of skim milk and whey. These by-products of dairying, which had once been incorporated back into farm processes as animal feeds, accumulated in ever greater concentrations as the scale of industrialization increased. Dairy plant managers hoped to recapture some value from these by-products by remaking them, too, for a modern marketplace.