When Curt Ellis graduated from college, he gave researchers a single strand of hair.1 After examining its molecular makeup, they told him the composition of his diet. Like that of most Americans, it was based overwhelmingly on corn. Ellis was surprised; he didn’t think he ate much corn.
Walk down any supermarket aisle, and you will find corn processed into every imaginable form. Bread, juice, granola bars, yogurt—read the labels, and there it is. Your salad at lunchtime? Drenched in salad dressing sweetened with high-fructose corn syrup. The beef you eat? Fattened on corn. If we are what we eat, it’s a wonder we haven’t all turned yellow.
Shocked by that discovery, Ellis and a classmate decided to keep digging. After college, they moved from Connecticut to Iowa, where nearly a fifth of the nation’s corn is grown. With a population of 3 million people, Iowa grows nearly 2.5 billion bushels of corn annually.2 The two bought an acre of land to grow corn of their own. Their plan was to trace its journey through the US food system.
The result was King Corn, a documentary that serves as a scathing indictment of our food industrial complex. Like Upton Sinclair’s The Jungle, which revealed the horrors of Chicago slaughterhouses in the early 1900s, King Corn pulled back the curtain on the modern American diet. It’s a story of how government subsidies, corporate greed, and ambivalence about nutrition have conspired to make Americans among the least healthy people in the developed world.3
For parents with limited time and small budgets, the easiest thing to do is feed kids empty, corn-based calories. Our food system enables our least nutritious and most caloric cravings, engineered and reengineered to stuff shareholders’ pocketbooks and consumers’ expanding waistlines. As a result, one in five children today is obese—a rate that has tripled since the 1970s.4 For children of color, it’s one in two, as is the probability of developing type 2 diabetes at some point in their lives. Forty percent of American adults are obese, resulting in conditions ranging from diabetes to stroke, cancer, and heart disease.5 Poor diet is now the leading cause of death around the globe.6
No farmers are trying to sicken their consumers, just as no financial advisers are trying to impoverish their clients, no educators are trying to saddle students with debt but no degree, and no doctors are trying to leave patients addicted to opioids. No one intends it. But it happens.
The field of design thinking has a saying: “Every system is perfectly designed to get the results it gets.” This chapter is about understanding how our system has given us the corporations we’ve got. We will begin with the story of Kellogg’s, a company founded to make breakfast healthier that eventually succumbed to selling sugary cereal instead. We will see how its trajectory reflects a broader disease in our corporations: fiduciary absolutism, a myopic focus on short-term profit above all else.
We will explore the theoretical underpinnings of fiduciary absolutism. It began as an answer to a question that is long out of date: How can we make corporate managers more focused on serving shareholders? We will see how fiduciary absolutism relies on economic theories—the efficient market hypothesis and the invisible hand—that apply better to the eighteenth-century Scottish economist Adam Smith’s age than to our own.
But where did fiduciary absolutism come from? In this chapter, we will also trace a brief history of capitalism in America. Ultimately, we will see how the economic malaise of the 1970s combined with structural changes to our economy to create fertile ground for fiduciary absolutism to take root.
That does not mean our corporations were perfect in the past, nor that they are damned to chase short-term profits for all time. We’ll see how some new companies are challenging fiduciary absolutism and how larger corporations—including Kellogg’s itself—are now trying to adapt to a changing world.
Food companies are being pushed along by people such as Curt Ellis. Not content with just making a documentary, Ellis began to fight. In 2010, he cofounded FoodCorps, which embeds recent college grads in high-poverty schools to teach healthy eating. By the time many students get to school in the morning, they have already consumed 15 or more grams of sugar in a bowl of cereal such as Frosted Flakes. That’s 60 percent of the recommended daily limit even before first period has begun.7
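The arithmetic behind that figure, spelled out (the daily limit itself is implied by the text rather than stated in it): if 15 grams is 60 percent of the limit, the limit works out to about 25 grams of added sugar a day.

\[
\frac{15\ \text{g}}{0.60} = 25\ \text{g of added sugar per day}, \qquad \frac{15\ \text{g}}{25\ \text{g}} = 60\%
\]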
Ellis is fighting for the health of our children, but he’s up against the multitrillion-dollar food industry. How can he combat the 225 million hamburgers McDonald’s sells worldwide each year? Or the equivalent of 2 billion cans of carbonated sugar water Coca-Cola sells each day? Ellis could spend his life preaching the virtues of fresh fruits and vegetables, but when Taco Bell makes a taco shell out of Doritos, it sells a million tacos in twenty-four hours.8,9
Five of the largest food manufacturers in America—Kellogg’s, General Mills, Tyson Foods, PepsiCo, and Kraft Heinz—sell over $150 billion of product each year. Reformers like Ellis aren’t sticking a finger into a dike; they’re caught in a tidal wave.
Despite Big Food’s cynical marketing to our worst biological impulses, the tale is not all about unadulterated corporate greed—at least not at the beginning. In fact, some of the worst corporate actors today were founded with noble ambitions. And it all started with breakfast.
Kellogg’s Corn Flakes was meant to be a superfood. It was developed by a doctor for his patients as an easy, healthy substitute for what passed for breakfast at the time. During the mid–nineteenth century, the word “dyspepsia” entered the American lexicon. Howard Markel, the author of The Kelloggs: The Battling Brothers of Battle Creek, described dyspepsia as “a nineteenth-century catchall term for a medley of flatulence, constipation, diarrhea, heartburn, and ‘upset stomach.’”10 He went on to show that breakfast was one of the main contributors to the condition:
Early morning repasts included filling, starchy potatoes, fried in the congealed fat from last night’s dinner. As a source of protein, cured and heavily salted meats, such as ham or bacon, were fried up as well. … the staggeringly high salt content made one quite thirsty and eager for a drink—a situation not lost on the saloonkeepers of every town in America who routinely opened for business in the morning.
Enter Dr. John Harvey Kellogg of Battle Creek, Michigan, and his brother Will. Like Curt Ellis, they were men on a mission to improve the American diet. At the Battle Creek Sanitarium, the Kellogg brothers experimented with new ways to feed patients that focused on whole grains and fresh vegetables and fruits.11 As one popular myth has it, Kellogg’s Corn Flakes came to John in a dream involving a new way of rolling and cooking a batch of wheat-berry dough. The ultimate result was a simple and—for its time—healthy dish that revolutionized the food industry.
Few companies can claim to have changed the way Americans eat, but Kellogg’s and its early competitors did just that with breakfast cereals. Unfortunately, Kellogg’s Corn Flakes and the other better-for-you brands that launched this industry have been overtaken by the countless sugary cereals that line today’s supermarket aisles. Roughly 90 percent of Americans indulge in packaged cereal of some kind, and nearly all of these cereals rely heavily on added sugar.12
The cereal Cookie Crisp was marketed to kids during cartoon commercial breaks with the tagline “You can’t have cookies for breakfast. But you can have Cookie Crisp!” Similar cereals are often more than one-third sugar by weight. That’s like eating a spoonful of sugar for every two spoonfuls of Kellogg’s original cereal. Cookie Crisp may wear its disregard for health on its sleeve, but Kellogg’s Honey Smacks touts misleading nutrition claims such as “Good source of Vitamin D” despite providing only a tenth of the recommended daily value. Raisin Bran sounds healthy, but it has more sugar per serving than Count Chocula.
How did Kellogg’s go from manufacturing the first breakfast superfood to becoming a supervillain of the American diet? Dr. John Harvey Kellogg wanted Kellogg’s to help sick people get healthy. What would he make of a world in which his name has become synonymous with making healthy people sick? Though Kellogg’s was founded with a deeper purpose, it is representative of many of our large corporations today: it appears to care more about earning profits for shareholders than it does about the long-term well-being of its customers—not to mention its employees or its communities. This ideology—which we’ll call fiduciary absolutism—has become so pervasive that we accept it as just the way the world works. As we’ll see, the forces behind Kellogg’s transformation are pushing on every corporation in our economy. But when we look at the company’s history, we see that the world hasn’t always worked this way, and it need not in the future.
In 1976, the bicentennial of our nation’s independence and the 200th anniversary of the publication of Adam Smith’s The Wealth of Nations, the economists Michael C. Jensen and William H. Meckling published a paper that has since become the most widely cited study in the history of business literature.13 It’s done as much to influence the way our economy works as any other academic paper in history. Its subject? The principal-agent problem.
Imagine this: You hire a babysitter to watch your children. You are the principal, and the babysitter is your agent. The problem is that your interests diverge. You care much more about your children than your babysitter does. Your babysitter cares, sure, but he also cares about having an easy afternoon. When your children act up, it’s easier for the babysitter to put on the TV than provide a lesson on sharing. Since the agent has the day-to-day control, the principal’s interests tend to suffer. This is the principal-agent problem.
Jensen and Meckling argued that this was the central problem of corporate governance. Corporate managers (the agents) were not working in the best interest of shareholders (the principals).
To solve this problem, they suggested clarifying what a corporation is for. “Because it is logically impossible to maximize in more than one dimension,” Jensen argued, “purposeful behavior requires a single-valued objective function.”14 Under the principal-agent problem’s framework, the single-valued objective function should be for the corporation’s managers to maximize financial value for shareholders.
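A minimal way to picture Jensen’s point, in notation of our own rather than the paper’s: with a single objective, any two courses of action can be ranked; with two or more objectives, they can be ranked only after someone chooses how to weight them, which collapses the problem back into a single objective anyway.

\[
\text{One objective:}\quad \max_{a \in A} V(a)
\qquad\qquad
\text{Two objectives:}\quad \max_{a \in A} \bigl(V_1(a),\, V_2(a)\bigr)
\]

The second problem is well defined only once weights are chosen, say \( \max_{a \in A} \, w_1 V_1(a) + w_2 V_2(a) \). Under the framework Jensen and Meckling proposed, the weighting is settled in advance: \(V(a)\) is financial value for shareholders, and everything else is treated, at best, as a constraint.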
This theory built on the existing law of fiduciary duty. Fiduciary duty is as old as the Code of Hammurabi. Some version of it is law everywhere that corporations exist. Fiduciary duty mandates that managers not be self-serving or negligent in running corporations owned by others. It provides some guardrails on what managers can do. For example, managers cannot sell corporate assets to themselves at discounted prices.
Fiduciary duty is a vital part of corporate law. But in their paper, Jensen and Meckling proposed something radical: they took the guardrails of fiduciary duty and transformed them into the sole purpose of the corporation. In other words, fiduciary duty became fiduciary absolutism, and a constraint became a demand. With support from popular intellectuals such as Milton Friedman, fiduciary absolutism became the guiding principle of the modern US economy.15 Forget balancing different stakeholder demands or conflicting obligations to society; just maximize shareholder value. As the legal scholar Lynn Stout wrote, this “offered an easy-to-explain, sound-bite description of what corporations are and what they are supposed to do.”16
The idea caught on. As one contemporary remembered it, Jensen’s arrival at Harvard Business School, in 1985, represented a singular shift. Before his arrival, “No one was talking about ‘shareholder value.’”17 But by 1986, after Jensen was ensconced as a tenured professor at the nation’s most prestigious business school, everyone was. Business research since the 1970s has been focused almost entirely on financial outcomes, rather than corporations’ many other roles in society.18 At that point, Stout wrote, “Shareholder primacy had become dogma, a belief system that was seldom questioned, rarely justified, and so commonplace most of its followers could not even recall where they had first learned of it.”19
We can trace the spread of this ideology through the statements of the Business Roundtable, which is made up of CEOs speaking for 30 percent of all equity value in the United States.20 As late as 1981, the members of the Roundtable said that although shareholders must earn a fair return, “the legitimate concerns of other constituencies also must have appropriate attention.”21 They believed that “the owners have an interest in balancing short-range and long-term profitability, in considering political and social viability of the enterprise over time.”
By the mid-1990s, that sort of fair-minded balance was gone, replaced by fiduciary absolutism. “The principal objective of a business enterprise is to generate economic returns to its owners,” the Roundtable now wrote.22 “The notion that the board must somehow balance the interests of stockholders against the interests of other stakeholders fundamentally misconstrues the role of directors.”23
Long-term sustainability be damned. Stakeholders be damned. Purpose be damned. Fiduciary absolutism has driven our corporations to focus on shareholders above all else. If sugary cereal serves shareholders better, Kellogg’s has a sacred obligation to fill the bowl of every last child.
But even within the simplicity of this shareholder-focused framework, managers still have a problem: they have to serve shareholders, but which ones? Some shareholders want growth, others want dividends. Some seek risk, others seek stability. Some hold their stock for days, others for decades. Enter the efficient market hypothesis.
In its strongest form, the efficient market hypothesis says that a corporation’s current stock price represents its true value. The stock price includes all relevant information and considerations, and therefore increasing the current stock price benefits all shareholders uniformly. Rather than worrying about balancing the interests of different time periods or risk tolerances, managers could now focus on a single number.
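As a rough sketch in our own shorthand, not a formula quoted from the hypothesis’s proponents, the strong-form claim is that today’s price already equals the expected value of everything the company will ever pay out to shareholders, given all available information:

\[
P_t \;=\; \mathbb{E}\!\left[\,\sum_{k=1}^{\infty} \frac{D_{t+k}}{(1+r)^k} \;\middle|\; \mathcal{I}_t \right]
\]

Here \(P_t\) is today’s share price, \(D_{t+k}\) are the future cash flows to shareholders, \(r\) is the discount rate, and \(\mathcal{I}_t\) is all information available at time \(t\). If that equation held exactly, raising today’s price would be the same as creating long-run value for every shareholder at once, which is precisely the assumption this chapter goes on to question.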
The corporate world is much simpler under fiduciary absolutism and the efficient market hypothesis. Fiduciary absolutism tells managers that they exist to serve shareholders. The efficient market hypothesis says that they do so by maximizing today’s share price. This is the single objective function—the one and only thing that matters. Where else in life does this kind of unbalanced focus lead to better outcomes?
The result is the pernicious short-termism that plagues corporations today. In one recent survey, more than two-thirds of CEOs and CFOs of large public corporations said they faced pressure to maximize short-term returns at the expense of long-term growth.24 Consider the quarterly earnings that corporations report. The number of firms that provide forecasts for this short-term metric grew from 92 in 1994 to more than 1,200 by 2001.25 Two-thirds of CFOs said they had been pressured by other executives to misrepresent corporate results in order to meet earnings expectations.26 Four in five business leaders said they would decrease spending on research and development, advertising, maintenance, or hiring in order to hit those benchmarks.27 Over half of the managers surveyed said they would forgo a profitable project to do the same.28
There’s just one problem with the efficient market hypothesis: it’s wrong.
It’s what the economist John Quiggin calls a “zombie idea”—a concept that is intellectually dead but exercises broad influence nonetheless.29 The economist Luigi Zingales agrees. Whereas in the late 1980s, no other proposition in economics was as widely supported, today “it is hard to find any financial economist under forty with such a sanguine position.”30 A company’s stock price at any moment reflects only the price at which the last buyer and seller traded a share. That’s it. There are things companies can do to increase the price today at the expense of long-term value or to benefit some shareholders more than others. Corporations have many different stakeholders even within their shareholder base and many different ways to serve them.
Yet the allure of a simple theory is strong. As the economist John Maynard Keynes once wrote, “Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.”31
Here’s the result of fiduciary absolutism: corporations see no purpose higher than profit, no duty greater than maximizing shareholder value. We have an economy that maximizes the one thing we shouldn’t care about—short-term stock price—at the expense of everything we should. The real genius of fiduciary absolutism, however, is that it rests on a deeper moral justification. There is little in the principal-agent problem or the efficient market hypothesis to elicit much enthusiasm. So why has it become such a dominant way of doing business? Because, proponents argue, maximizing profit for shareholders is what benefits society most overall.
This is the invisible hand. “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner,” Adam Smith wrote in The Wealth of Nations in 1776, “but from their regard to their own interest.” Our self-interest ends up benefiting others as if guided by an invisible hand. The modern rendition is that private vice makes for public virtue. As the Keynes protégé E. F. Schumacher put it, we’ve come to think “the road to heaven is paved with bad intentions.”32
Maybe that’s sometimes true. But as we’ll soon see, the economy Adam Smith wrote about was drastically different from our own. A theory based on cottage industries and local owners works differently when applied to global corporations with institutional shareholders. Using the invisible hand to justify capitalism today is like using Jefferson’s vision of a nation of small yeoman farmers to justify Monsanto and the Dole Food Company. We should be wary of sacred originalism in social science. Smith may have been right for his time, but, on this point at least, he is wrong for ours.
So maybe private vice makes for public virtue. Or maybe it makes for public vice. Maybe the invisible hand can cure all ills. Or maybe it causes them. As the prominent legal scholar Leo E. Strine, Jr., former chief justice of the Delaware Supreme Court, has written, “If empowering short-term investors turns out to be optimal for our society and its human citizens, that seems like a very improbable and unsustainable triumph of the law of unintended consequences.”33
Let economists have their invisible hand. We’d rather have companies that feed our children a healthy breakfast, the way Dr. John Harvey Kellogg intended.
We’ve seen the ideological underpinnings of fiduciary absolutism, but why did they take root when they did? Why doesn’t the invisible hand metaphor apply as well today as it did in 1776? For this, we have to take a quick sprint through the history of capitalism in America. We’ll then see how closely Kellogg’s followed this path and how a few corporations today are trying to create a different future.
There have been five eras of ownership in America, with the transitions between them forced by war, ideology, and innovation—both technical innovations such as the steam engine and financial innovations such as junk bonds.
As we’ll see, fiduciary absolutism came out of the transition from public ownership to the institutional ownership era.
When The Wealth of Nations was published, almost all economic activity occurred on small, locally owned farms and in cottage workshops.34 More than 90 percent of Americans lived in the countryside.35 In an 1812 letter to John Adams, Thomas Jefferson wrote, “every family in the country is a manufactory within itself, and is very generally able to make within itself all the stouter and midling stuffs for it’s own cloathing & houshold use.”36 As late as 1850, retailing was still completely dominated by locally owned stores.37
At the turn of the nineteenth century, there were only 335 business corporations in all of the United States, most of which had been established for transportation projects such as canals and toll bridges. There were no middle managers in the economy as late as 1840.38 In fact, Adam Smith was worried that the few corporations that did exist in his time might put at risk the sort of economy that his theory was meant to describe. He believed that “the corporation undermined an ideal market economy composed of small, owner-managed companies,” according to one modern scholar.39
Finally, consumers knew what they were buying. There were no cars, no Cokes, no utilities—almost nothing that individuals couldn’t make themselves. As late as 1894, the fabled Sears catalog still had only 322 pages of wares—more variety than most people were used to, yet quaint compared to Walmart’s 150,000 items now on offer.40 Or Amazon’s 500 million.
In the era of local ownership, an individual intuitively balanced his interest in making a profit with a local sense of accountability to his neighbors and a long-term interest in staying in his community’s good graces. So it would remain through the eve of the Civil War. And then things changed. Fast.
US industrialization came on in a rush: railroads, electricity, telephones, cars, branded packaged goods, and department stores. All of them required large-scale production to be efficient. This, in turn, required capital to invest in heavy industrial equipment. The first of these large corporations were those that needed to pool large amounts of capital for big up-front expenses such as train locomotives and factory machinery.
Toward the end of that era, large industrialists such as John D. Rockefeller, Andrew Carnegie, Andrew W. Mellon, and Cornelius Vanderbilt had put together massive conglomerates, consolidating most of the country’s industrial base in only a few dozen companies.41 Though those corporations became massive, they were still run by their founders, called robber barons by some due to their monopolistic practices. Few companies in those days were public apart from the railways.
Those large corporations came to dominate everyday life—both as producers and as employers. By 1860, factory workers were replacing independent skilled craftsmen.42 The number of white-collar workers grew from less than 10 percent of the workforce in 1870 to nearly 20 percent in 1910.43 Markets became so integrated that by the outbreak of World War I, the price of a bushel of wheat was identical in Chicago and Liverpool.44 The era was characterized by laissez-faire economics, a hands-off policy of minimal government intervention. For the most part, government steered a course of benign neglect.
But the era’s scale of production and distance between consumers and producers created room for abuse, as the meat industry shows. Before the Civil War, meat had been slaughtered on family farms and sold through the local butcher.45 But the industry was revolutionized by urbanization and refrigerated railcars. A local, small, and simple industry became the distant, massive, and complex meat industry of the Chicago stockyards, where millions of hogs and cows were slaughtered each year.46 This complexity led to abuses such as the packaging for sale of diseased and rotten meat documented in Sinclair’s The Jungle and, after public outcry, to the Meat Inspection Act, the Pure Food and Drug Act, and the Bureau of Chemistry (later renamed the Food and Drug Administration).
As calls for regulation grew stronger and founders themselves sought to sell their massive corporations, we entered a new era.
President Theodore Roosevelt, who took office in 1901, thundered against “the malefactors of great wealth.” He sought to contain the monopolistic practices that had emerged as capital had become concentrated in the hands of the few. Though Roosevelt broke up many of the trusts, big corporations remained. Increasingly, however, they were owned not by their founders but by widely distributed shareholders.47 The number of Americans who owned stock grew from 2 million in 1920 to 10 million in 1930 to 25 million in 1965.48
The rising power of corporations was met by the rising power of government and labor. Four million workers staged 3,600 strikes in 1919 alone.49 By 1945, union membership had peaked at 35 percent of private sector employment.50 This had a direct payoff for union members through higher wages and better benefits. It also helped the rest of the American workforce, who saw their wages and benefits improve.51 Union membership has since declined to less than 7 percent today.52 The era also saw the birth of strong federal regulation, beginning with the 1887 Interstate Commerce Act, the 1890 Sherman Antitrust Act, and the 1914 Federal Trade Commission Act.
The wide dispersion of stock ownership in our economy is nearly unique in the world. The United States and the United Kingdom are the only two countries whose largest corporations are held by widely distributed shareholders. Whereas roughly two-thirds of public companies in Germany have a single shareholder who controls over half the company, in the United Kingdom it’s only 2.4 percent.53 On the New York Stock Exchange, it’s 1.7 percent.54
The era was punctuated by the Great Depression, which marked the end of laissez-faire economics. Increasingly, citizens looked to government for leadership as referee, infrastructure builder, and distributor of the spoils of growth. By the 1940s, the Progressive, antimonopoly era of the first Roosevelt had become the New Deal, pro-stakeholder era of the second.
With the presidency of Franklin Delano Roosevelt, government assumed a larger role in protecting the people and public goods left behind by capitalism. The Food and Drug Administration was followed by Social Security and the other welfare programs and regulations of the New Deal era, and eventually by Medicaid, Medicare, and the Clean Air and Water Quality Acts.
Unions, still at their strongest, fought for benefits that had previously been the preserve of the rich. The number of workers covered by private pensions grew from 4 million in 1940 to 15 million in 1956, eventually covering 50 percent of workers by 1980.55 The same thing happened with what was then called hospital insurance: the number of Americans covered grew from 6 million in 1939 to 91 million in 1952. Those plans were often created under the premise that workers would stay at one firm until they retired—a model in which long-term reciprocal obligations between employers and employees, both formal and informal, were the norm.
The economy also soared, thanks in part to the Allies’ victory in World War II. Not only did the war create a national sense of shared sacrifice and identity, it also left the United States the last industrial economy standing. After the war, the United States held 7 percent of the global population but produced 42 percent of its manufactured goods, 57 percent of its steel, 62 percent of its oil, and 80 percent of its cars.56 The economy grew by nearly 4 percent annually from 1946 to 1973 (compared with 0 to 3 percent over the last ten years). Real household income increased by 74 percent over the same period.57 Economic prosperity was more widely shared than at any time before or since.
As ownership continued to spread to millions of small, passive individual shareholders, corporate managers took control. Scholars call this the period of “managerialism,” in which managers saw their role as balancing the varied and conflicting demands of all their stakeholders. Fiduciary absolutism had yet to take hold. A 1961 survey revealed that 83 percent of executives believed it was unethical to work only in the interest of shareholders.58
The United Kingdom followed a similar transition over this period. “The position of shareholders,” wrote one British economist in 1965, “which is sometimes presented by the ideologues of business in the image of a parliament telling ministers what to do, is in fact much closer to that of a highly disciplined army, which is permitted by law to riot against its generals if, but only if, rations should happen to run out.”59
This sense of balance in an economy of shared prosperity—shared at least among white males—marks an era that many capitalist reformers wish we could return to. But the era would soon give way as fiduciary absolutism began its reign.
The transition from managerialism to today’s fiduciary absolutism was the result of several factors: increased competition from abroad concurrent with a bear market in the 1970s that undermined Americans’ confidence in business, the advent of hostile takeovers in the public markets in the 1980s, and, perhaps most important, a shift in ownership toward institutional shareholders.
Through World War II, the average US worker was five times as productive per hour as the average Japanese worker; by 1980, Japanese car workers were 17 percent more productive than American car workers.60 The same was true of steel:61 from 1956 to 1976, Japanese steelmakers went from 19 percent less productive than their US counterparts to as much as 17 percent more productive. With free trade and global competition growing simultaneously, US corporations were forced to compete with companies held to far lower standards of worker protection and environmental regulation.
Meanwhile, in the 1970s, an Arab oil embargo quadrupled oil prices, triggering a period of stagflation in the United States: the economy stagnated while inflation ballooned to as high as 11 percent annually.62 From 1966 through 1982, the real return on the S&P 500 was 0 percent.63 Productivity growth slowed to roughly half the rate of previous decades. By 1971, the United States had an unfavorable balance of trade, the first time that had happened since 1893. Whereas in 1966, 55 percent of Americans had voiced “a great deal of confidence” in corporate leaders, by 1975, only 15 percent did so.64
Around that time, the creation of high-risk “junk bonds” suddenly gave investment funds the ability to buy out entire companies on the public market, even against the wishes of their existing managers. In the 1980s, nearly a quarter of major corporations received a hostile or unwanted bid, and nearly three in five received a takeover offer of some kind.65,66 By 2000, half of the largest hundred industrial firms had either been taken over or gone bankrupt.67
Finally, structural changes in how Americans saved for retirement led to a greater focus on short-term share price.
Until the 1970s, most pensions were structured as a defined benefit to be paid by the company. In this arrangement, workers’ interests were in keeping the company strong, stable, and low risk so it could meet its pension commitments. Unfortunately, when companies went bankrupt, retirees could be left without the benefits they’d been promised. So the government set up new minimum funding rules, mandatory insurance contributions, and other regulations meant to protect pensioners. They protected retirees against some of the risks but also made those sorts of plans more expensive and risky for corporations.
But the government gave companies another option: rather than promising retirees a pension in the future, a corporation could simply contribute to a tax-deferred retirement account today, such as a 401(k). It was then up to the employee to save and invest that money. The employer was off the hook. If corporations wanted to convert from the old “defined-benefit” system to the new “defined-contribution” system, they could.
And most of them did. In 1981, 60 percent of pensioners relied on a defined-benefit plan.68 By 2001, that had flipped: 60 percent relied on a 401(k) or IRA. Total assets in 401(k)s grew from $700 billion in 1994 to $4.4 trillion in 2014.69 Again, this is unique in the developed world. In Germany and Japan, private pensions are still paid out of a corporation’s cash flow. In many other developed countries, retirements are funded by the government. In France, retirees get 85 percent of their income from the government.70 In the United States, it’s only 36 percent.
The United States’ new reliance on defined-contribution retirement accounts had two major effects.
First, it meant that a comfortable retirement now relied on the returns of the stock market rather than the stability of a worker’s employer. In 1977, 20 percent of households owned stock.71 Today, that figure is over 50 percent. Now everyone—union members and the rest—must worry about shareholder value to fund their golden years.
Second, it led to the rise of institutional investors, as employees turned to mutual funds to invest their retirement savings on their behalf. By 2000, there were nine thousand mutual funds in the United States, two-thirds of which had been launched in the preceding decade.72 Whereas institutions such as mutual funds controlled only 6 percent of equity in 1950, they controlled over 63 percent by 2016.73 These institutions compete for our business from quarter to quarter by trying to show a higher return than the competition’s. Their interests end up being far more shareholder oriented and short-term focused than those of the workers whose retirements they manage.
All the while, a divisive war in Vietnam and social upheaval at home exacerbated Americans’ confusion and undermined their self-confidence. At that stage in US history, the ground for a reorientation of business priorities was fertile indeed.
It was in that context that Jensen and Meckling declared the principal-agent relationship the core problem of corporate governance, that the efficient market hypothesis gave managers a single number to maximize, and that the invisible hand gave everyone a justification to indulge their worst impulses. It was in that context that the seductive simplicity of fiduciary absolutism came to dominate our corporations. The culture and norms supporting the reciprocal obligations of the postwar period gave way at the same time that both regulations and union protection declined.
Real per capita income in the United States grew at an annual rate of 2.5 percent from 1950 to the early 1970s. From 1973 to 1995, growth fell to 1.8 percent per year, and it has since fallen further, to 1.6 percent.74 Productivity growth has slowed.75 Business has also become less dynamic, with fewer companies being started and fewer corporations going public.76 Indeed, the number of public companies has fallen by half since 1996.77 Of course, correlation is not causation, and many of the factors that led to fiduciary absolutism—such as increased competition from abroad and stagflation—are themselves either causes or consequences of slower growth. Nevertheless, this is the record fiduciary absolutism must answer for.
It’s a history we see reflected in many of the largest corporations that have lived through it, Kellogg’s included. After being founder-owned for years, the company went public in 1952, selling shares to four thousand individual investors.78 Institutions now own 88 percent of the company.79 From founder ownership to public ownership to institutional ownership, all in step with the country at large.
Dr. John Harvey Kellogg made a product to serve his own patients each day. Today, shareholders aren’t even permitted to tour the manufacturing plant.80 Investors in mutual funds might not even realize they own the stock.
Kellogg’s began as a family affair. Though Dr. John Harvey Kellogg invented the cereal, it was his brother, Will, who created the empire.
From the start the two brothers saw the world in different ways. John was the nutritionist in chief, protective of his reputation as a doctor. “I have been interested in human service,” he wrote.81 “Not in piling up money.” Will, on the other hand, was the master marketer. Over time, he became more interested in capitalizing on John’s medical background than in making his customers healthier.
As described in The Kelloggs, “Will pressed the doctor to expand the business … develop a national advertising campaign, sell cereal in grocery stores across the country and make some real money.”82 For Will, it was always just a business. To grow their empire, Will innovated extensively around taste, adding substantial sugar and salt to the recipe for Corn Flakes against John’s wishes. The rift left the brothers barely on speaking terms.
Together, they were the corporate equivalent of Dr. Jekyll and Mr. Hyde: the upstanding doctor transformed at night into the nefarious bandit. It was one company with a split personality—dueling motivations to serve others and to serve itself. Unfortunately for Dr. John, Mr. Will won. Kellogg’s grew in his image.
After Will’s death in 1951, his successor, Watson Vanderploeg, introduced a long list of sugar-laden cereals. He hired the advertising company behind the Jolly Green Giant and the Marlboro Man to invent characters such as Toucan Sam and Tony the Tiger. Kellogg’s grew to be a $22 billion company, and it did so primarily by selling sugary cereal.83 Today, Kellogg’s continues to sell $2.6 billion of breakfast food each year, much of it sugary cereal.84
From a biological perspective, we’re wired to eat what we can; overconsumption is a modern luxury humans did not evolve to resist. But we have a system that is complicit in exploiting this reality. Food companies are not trying to make us sick; they’re just trying to sell us more and more product. “In this kind of investment economy,” wrote NYU professor Marion Nestle, “weight gain is just collateral damage.”85
As one journalist put it, “Obesity is often described as simply a matter of managing one’s calories and consequently cast as a lack of willpower on the part of an overweight individual.86 But it is probably more accurately understood in the context of a global food system that is incentivized by financial markets to produce low cost, high-calorie, unhealthy, and addictive foods.”
It’s not a question of personal lack of discipline; it’s our food system, trapped under fiduciary absolutism’s mandate to maximize profits at all costs. Kellogg’s objective is creating profit for shareholders. If the by-product is childhood obesity, well, according to fiduciary absolutism, that isn’t the company’s problem to solve.
It is a strange but typical feature of our economy’s split personality that although Kellogg’s the company continues to generate so much ill health in children, the foundation Will Kellogg created simultaneously tries to undo it. The now $7 billion W. K. Kellogg Foundation was launched in 1930 to help disadvantaged children. It has even given a grant to none other than Curt Ellis—the same Curt Ellis who made King Corn and is now trying to get kids off sugary cereal.
So here we have it, the Jekyll and Hyde of our financial system: make money however you can, then use the spoils to remediate the very problems you have caused. It’s “The Gospel of Wealth” all over again.
Fiduciary absolutism has created a world in which it seems normal for companies that are meant to nourish us to make us sick. This irony isn’t limited to food. Think of all the pain that painkillers have caused this country. Or the financial stress unscrupulous financial advisers have caused retirees.87 These are not morally ambiguous industries such as casinos, cigarettes, or porn. These are industries with a very clear underlying purpose. Yet because of fiduciary absolutism, they put profit above all else, often subverting whatever purpose they could have served.
Does that mean we’ve reached the end of history? Does that mean fiduciary absolutism will rule forevermore? Not if it’s up to a pair of social entrepreneurs who are taking on the food supply system.
When Jon Olinto and Tony Rosenfeld couldn’t find something, they made it. Two entrepreneurs from Boston, they decided to launch a farm-to-table restaurant that could compete with fast food. Their restaurants, called B.Good, had chalkboards with the names and pictures of the farmers who supplied them. They served kale salads and quinoa, as well as comfort food in the form of grass-fed burgers and hand-cut fries.
For their first restaurants in greater Boston, they sourced zucchini for their seasonal salads from a fifth-generation family farm in New York, eggs from Nellie’s Free Range in New Hampshire, cheese from a creamery in Vermont, and ice cream from down the road in Cambridge. Most of it bypassed the industrial food complex; they used local, unprocessed food whenever possible.
But there was one ingredient that, try as they might, they could not find locally: wheat. So in 2017, they launched One Mighty Mill with the subversively simple tagline “Wheat you can eat!”
Almost all the flour we eat in our bread and baked goods comes from one of only four companies: Archer Daniels Midland, Bunge, Cargill, or the Louis Dreyfus Company. Known collectively as ABCD, these companies control 90 percent of the world’s grain trade.88 Their process for making flour is a case study in industrial food production.89
For the last century and a half, flour has been made to be cheap, shelf stable, and lily white—originally thought to be a mark of purity. Few recognized that the industrial milling process stripped away the bran, the germ, and many of the essential nutrients that made grain healthy in the first place. One Mighty Mill was founded to make wheat a superfood again. Its first step was to partner with the sole US-based builder of stone mills to construct a 7,000-pound mighty mill. Next the company opened a storefront on the main street of Lynn, Massachusetts, a down-and-out town on the outskirts of Boston. There it sells its bagels, pretzels, and whole flour–based foods, using flour from wheat harvested on small farms in Maine.
The business is tiny, but it’s driven by a deep sense of accountability to consumers and communities—in some ways closer to the era of local ownership than to our own. One could easily imagine Dr. John Harvey Kellogg involved in an enterprise like this one. “You can’t have healthy kids without healthy wheat,” Olinto told us. “And because healthy wheat is perishable, it’s hard to have healthy wheat without a local mill.”
Consumers themselves have already begun pushing toward healthier breakfast options, and financial markets have started responding. Over the five years from 2014 through 2019, Kellogg’s stock price grew only 14 percent while the S&P 500 increased by 76 percent.90 For all its harms, fiduciary absolutism is meant at least to maximize share price. As we’ve seen already in the examples in this book—and as we’ll see in the empirical data presented in chapter 4—fiduciary absolutism often fails even to accomplish this much. When CEOs grow profit quarter after quarter, people say they have the Midas touch. We forget that the myth of King Midas was a tragedy. He died of starvation when his greed for gold made his world unlivable, a gilded shrine of misplaced values. In the long term, you reap what you sow.
Olinto and Rosenfeld are trying to use One Mighty Mill to find a way around the obstacles described above. What if, instead of profiting by making us sick, all food companies sought their profits through making us healthy? What if the economic value a corporation created was once again aligned with the social and environmental value it created? Olinto and Rosenfeld are trying to find out.91
All doctors take the Hippocratic Oath. “I will remember that I remain a member of society,” it reads, “with special obligations to all my fellow human beings.” Many professions, from barbers to airline pilots, have their own codes of ethics.
There is no such code for the business world. Instead, business leaders read Sun Tzu’s The Art of War and borrow phrases from the field of battle. For decades our food industrial complex has indeed made a killing. Ray Kroc, the legendary CEO of McDonald’s, embodied a sort of ruthless approach to business. “This is rat eat rat, dog eat dog.92 I’ll kill ’em, and I’m going to kill ’em before they kill me. You’re talking about the American way—of survival of the fittest.” Well, that’s one view of the American way.
When Harvard Business School added a dedicated ethics module in the 1960s, one observer wrote that it sent the message “that ethics, like the caboose on a train, would lend a sweet symmetrical touch to business, but that the locomotive would haul just as well without it.”93 Professor Rakesh Khurana and Dean Nitin Nohria have recently made efforts to further codify ethics at HBS, going so far as to propose a Hippocratic Oath for Managers. It included the pledge to “guard against decisions and behavior that advance my own narrow ambitions but harm the enterprise I manage and the societies it serves.” Khurana hopes his students will learn that corporations are meant to improve society, not harm it. As the saying often attributed to President Theodore Roosevelt goes, “To educate a man in mind and not in morals is to educate a menace to society.”94
Fiduciary absolutism doesn’t maximize profit; it maximizes the profit motive. And it doesn’t maximize shareholder value; it maximizes today’s share price. Fiduciary absolutism is its own ruler. It serves itself.
Worst of all, fiduciary absolutism often reduces meaningful employment to crass commercialism. Fiduciary absolutism tells everyone—explicitly—that they are just cogs in someone else’s value extraction machine, that they are line items on an income statement meant to be minimized. This misunderstands and underestimates human motivation.
Capitalism’s staunchest apologists call this dystopia our promised land. They hide behind the invisible hand and claim that this is the best of all possible worlds. It is a convenient view for those who seek to simplify our complex world into a single number. It is a convenient view for those who would rather excuse their behavior than change it.
Fiduciary absolutism reflects a failure of ambition and a reductio ad absurdum. The most important problem facing a food company is not the principal-agent problem. It never was. It’s feeding people, nourishing them, and doing so in a way that will also generate prosperity for employees, managers, and shareholders.
We forget that corporations are social organizations originally designed to solve problems, not create them. They are capable of meeting the needs and reflecting the values of their societies while making money at the same time. Shouldn’t we hold our corporations to a higher bar?