Given his expertise, age, and accent, Roberto Goizueta’s rise to the top of Coca-Cola was every bit as unexpected as Jack Welch’s climb to the chief executive’s office at General Electric.
Goizueta had joined Coke in 1954, overseeing production processes in the three bottling plants that the company ran in his native Cuba. He was twenty-two at the time, the son of a Havana family that had made its fortune in sugar, and had just earned a chemical-engineering degree from Yale. He immediately showed himself to be a tireless worker, willing to do whatever it took to keep the line moving, even if it meant sleeping through the night at the factory so that he could be on hand to troubleshoot problems. He impressed, as well, with his devotion to quality, trying to ensure that every Coke sold was, as he said in a 1957 report, “pure and wholesome.”
With Goizueta’s ambition and intensity, it was likely only a matter of time before higher-ups in Atlanta plucked him from Cuba and assigned him to bigger and better things. Before they could act, however, Goizueta’s upper-crust existence was shattered. In February 1959, Fidel Castro was sworn in as prime minister of Cuba after his guerrilla campaign forced right-wing dictator Fulgencio Batista into exile. It wasn’t long before Goizueta found himself harassed by the new regime. Soldiers stopped him one night and searched his briefcase, evidently trying to make sure that he wasn’t hiding any company documents from the government. The following year, with tensions flaring between the United States and Cuba and Castro threatening to seize American interests on the island, the moment had come for Goizueta and his wife to leave.
They landed in Miami with just enough clothes for a supposedly short “vacation” and only $200 between them. They left everything else behind—their house, their cars, their bank accounts, their books and art. Goizueta did, however, still own one other notable asset: a hundred shares of Coca-Cola stock that he had bought with $8,000 lent to him by his dad. “You shouldn’t work for someone else,” Goizueta’s father had told him when he first took the job with Coke. “You should work for yourself.” Keeping a close eye on the company’s share price—and, in the end, doing absolutely everything that he could to help it go up—would become Roberto Goizueta’s obsession.
After coordinating technical operations for Coca-Cola in Latin America—a job that had him shuttling between Miami and a base in the Bahamas—Goizueta was summoned in 1964 to Atlanta for a special assignment: helping to reorganize the company’s engineering and research activities. Whatever mixing of the elements went on in the lab, the real chemistry occurred between Goizueta and Coca-Cola’s senior leaders. Now ensconced at headquarters, and with his responsibilities steadily widening to include issues related to general management and the allocation of capital, Goizueta worked closely with CEO Paul Austin and, before long, Robert Woodruff—The Boss—who, from his seat as chairman of the board’s Finance Committee, was still very much in control of the company.
In 1966, at age thirty-five, Goizueta was promoted to vice president of technical research and development—the youngest vice president in Coca-Cola history. In 1974, he was named a senior vice president, and in 1979 he was elevated to be one of six new vice chairmen of the corporation. At forty-seven, Goizueta was three to twenty years younger than anyone else in the “vice squad,” as the group came to be known. In many respects, his meteoric ascent was similar to Jack Welch’s. Yet as coarse and hyperkinetic as Welch could be, Goizueta was just the opposite: reserved, refined, and flawlessly put together, with his Guccis, tailored suits, and silk handkerchiefs tucked in his breast pocket. He was also adept at sucking up to Robert Woodruff. And he padded his credentials, claiming falsely that he’d graduated tenth in his class at Yale.
As it entered the 1980s, Coca-Cola was in a bad spot. Austin had expanded into a bunch of areas, including water purification, shrimp farms, carpet shampoo, plastic straws, and more, diverting the company’s attention away from its core business: selling soft drinks. There couldn’t have been a worse time to stray, either. Coca-Cola’s biggest rival was aggressively staging blind taste tests across the country—the Pepsi Challenge—and stealing market share. Coke’s cash reserves dwindled and its stock price lagged. “Executives were paralyzed by indirection,” one journalist has noted, “and the company’s operations and financial policies were outmoded, in some cases medieval.”
At the same time, Coke’s bottlers had “no confidence in what Austin tells them,” Joseph Jones, Woodruff’s right-hand assistant, warned The Boss. “They have no respect for him, they don’t trust him.” Neither did Woodruff, who had lost all patience with Austin after he made some questionable personnel moves and built a fancy new headquarters tower that the old man hated. Austin then tried to pay for the building with something that the financially conservative Woodruff hated even more: $100 million in debt. As the company lost its focus, Austin lost his, fumbling through speeches and forgetting things and otherwise behaving erratically. Some of Austin’s longtime colleagues figured that he’d developed a drinking problem. Only later would it be revealed that he had been suffering from a crippling combination of diseases—Alzheimer’s and Parkinson’s.
As company insiders and outsiders began to speculate on who would replace Austin, few gave Goizueta a shot. He was relatively young, had no branding or marketing experience in a company that was all about branding and marketing, and “was a foreign national” at an institution “heretofore led by dyed-in-the-wool Georgians,” as one commentator remarked. But Goizueta’s every-other-day visits to Woodruff’s Tuxedo Road mansion, sipping vodka and tonic in the sitting room with the old man, apparently paid off. Goizueta emerged as Woodruff’s choice to take charge of his company; everyone else in the running was knocked out. The Financial Times called Goizueta’s selection “one of the most surprising executive appointments in US business this year.” Most of those working for him couldn’t even pronounce his name (Goy-SWET-ah).
It wouldn’t take any time at all, however, for Goizueta to make plain what he intended to accomplish. Just like Welch at GE, he declared that there would be “no sacred cows” at Coca-Cola, and he set out to impose a new degree of discipline for a new day. “It has been said that the fifties, the sixties, and even into the midseventies were the years of determinism in the corporate world—a period when predictable, satisfactory earnings growth records could be achieved without a great deal of management input or careful attention to details,” Goizueta told employees in May 1981. “Since the midseventies, however, we have been in a period of uncertainty. In this environment, the finest test of management competence will be the management of productivities: productivity of money, of physical assets, and of people.”
To make Coke more productive, Goizueta pressed executives to execute with a level of rigor that was never before required of them. Under Austin, they had been able to make five-year plans that were full of vague promises. Under Goizueta, they were compelled to lay out precise three-year plans—and then they’d better be sure to nail their targets. “I want you to tell me what you need to do to expand your business, what kind of capital you need to do it, and what kind of net return you’re going to get,” Goizueta instructed his seventeen division chiefs. He insisted on answers that were clear and smart and grounded in data that was accurate. “Facts are facts,” Goizueta would write again and again on his executives’ budget submissions, sending them scurrying back to revise things.
“People have got to believe they are living or dying by these numbers,” Goizueta told Coke’s president, Donald Keough, a highly respected company veteran who had wanted the CEO job but, after being passed over, agreed to stay on as the second-in-command.
“Yes,” Keough responded. “We’ve got some toilet training to do.”
For those who hadn’t done their homework, Goizueta showed no forbearance. His questions were so sharp that an early session in which he prodded and probed his managers became known as “The Spanish Inquisition.” Goizueta’s aim in all of this was, in many ways, a mirror of what Welch was trying to do at GE: take an organization that had become tired and slow-footed and rejuvenate it by spreading more responsibility and accountability throughout the ranks. “It would take a long and difficult battle to mold Coke into a world-class company at every level of operation in every market around the globe,” Goizueta’s biographer, David Greising, has written. “But his job, as he saw it, was to shrink the time line, to jolt Coke into the future abruptly, to shake up the managers and make them understand. It was time for these people to get on board or get out.”
Like Welch, Goizueta wanted employees who would dream big and take risks, so long as those risks panned out more often than not. “The needs of our business demand that we look for and develop entrepreneurs rather than just caretakers,” he said. Most companies, Goizueta added, like to “reward the ‘team player’ who doesn’t rock the boat or step out of line. I’m all for loyalty and team spirit—but only if they do not discourage innovative thought and action.”
Goizueta led by example, taking a series of bold steps throughout his time as CEO. Although he held a dim view of his predecessor’s fondness for diversifying—“cats and dogs,” he called the assortment of businesses that Austin had branched into in the 1960s and ’70s—Goizueta purchased Columbia Pictures in 1982, giving Coke a taste of Hollywood. The acquisition couldn’t have gone down any more smoothly. During that first year, on the strength of the box-office smashes Gandhi and Tootsie and a successful joint production deal with Home Box Office, Coke’s entertainment subsidiary rang up a hefty $90 million in operating profit, 50 percent more than Goizueta had predicted. (Seven years later, Goizueta would sell Columbia to Sony for a net gain of half a billion dollars.) He also introduced Diet Coke, a runaway hit. The company was now booming.
“I have been racking my brain to figure out what to give you” for a ninety-fifth-birthday gift, Goizueta wrote to Robert Woodruff in December 1984. “After much thought, I have come to the conclusion that what you probably would enjoy the most is knowing how well your company is doing.” Goizueta then ticked off “a few facts and figures”: Earnings for the first nine months of the year were up 17 percent. Operating profit was poised to pass the billion-dollar mark for the first time ever. And Coke’s stock price was at an eleven-year high. “For the last three years,” Goizueta told The Boss, “the value of the shareholders’ interest in your company has increased from $4.6 billion to about $8 billion.… This year alone, the total return on your investment in the company will likely end up being at least 20 percent.” For his part, Woodruff was pleased. “My company is being run very well,” he told an associate.
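The figures in Goizueta’s birthday letter hang together arithmetically: growth from $4.6 billion to about $8 billion over three years implies an annual return of roughly 20 percent, the same rate he cited for that year alone. A quick illustrative check (the dollar amounts come from the letter; the calculation is mine):

```python
# Back-of-the-envelope check of the figures in Goizueta's 1984 letter to Woodruff.
start_value = 4.6e9  # shareholders' interest in the company three years earlier
end_value = 8.0e9    # "about $8 billion" at the end of 1984
years = 3

# Compound annual growth rate implied by the two endpoints
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied annual return: {cagr:.1%}")  # roughly 20 percent a year
```

The compounding, not any single year's jump, is what turned $4.6 billion into $8 billion so quickly.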
In March 1985, Woodruff died while holding his nurse’s hand and listening to the gospel song “Just a Closer Walk with Thee.” Although he could be a tough S.O.B., Woodruff left $1 million in his will to Joseph Jones, who had denied himself all those years of vacation so as to faithfully serve The Boss.
The following month, Goizueta made his biggest bet yet: he replaced the company’s flagship product with a reformulated version of the ninety-nine-year-old cola. The backlash against New Coke, as it was called, was overwhelming, as some 400,000 phone calls and letters poured into the company. The buying public abhorred the change. So did Coca-Cola’s bottlers. Pepsi pounced, too, saying that Coke had sweetened its soda to mimic the taste of its closest competitor. “The other guy just blinked,” Roger Enrico, the president of Pepsi-Cola USA, proclaimed in an open letter that appeared in newspapers across the country. But even one of the greatest blunders in business history couldn’t slow down Coke for long. The company soon reintroduced its original formula, now dubbed Coke Classic. Consumers were so thrilled by the reversal that some wondered whether the company had concocted the entire crisis. “Some critics will say Coca-Cola made a marketing mistake,” Keough told reporters. “Some cynics will say that we planned the whole thing. The truth is we are not that dumb and not that smart.” In any case, by early 1986 Coke Classic was once again outselling Pepsi, while New Coke faded away.
Through all of the successes and occasional failures, Goizueta remained firmly fixated on the company’s financial health. Increasing sales was crucial, he believed, but not at the expense of profitability, and he stipulated that every line of business at Coca-Cola must generate at least a 20 percent return. This new emphasis left some of the old-timers cold. “We moved into the eighties and everything seemed to change,” said Charlie Bottoms, a marketing manager who worked at Coke for forty years. “Financial engineers run this company.”
It wasn’t surprising that employees felt this way. For one measuring stick had come to stand above the rest at Coca-Cola: the company’s stock price.
“It is easy in the rush of our day-to-day routines to forget for whom we are really working,” Goizueta had said when he first became CEO. “There are 79,305 shareholders—mostly individuals like you and me—who own the Coca-Cola Company. After all is said and done, our primary responsibility is the long-term enhancement of their investment in this company.”
By 1988, Goizueta had wrapped Coke’s strategy around this one main objective: “to increase shareholder value over time.” “I wrestle over how to build shareholder value from the time I get up in the morning to the time I go to bed,” he said. “I even think about it when I am shaving.” If such an orientation sounded different from that favored by previous CEOs, that’s because it was. Typical of the old guard was Coca-Cola president William Robinson, who told a group at Fordham Law School in 1959 that it was a mistake for executives to put “the stockholders first, last, and all the time.” Rather, he said, a corporation had to serve four constituencies: the stockholder, the community, the customer, and the employee—Coke’s version of GE’s “balanced best interests” doctrine. But for Goizueta, the shareholder was the undisputed king. “There are plenty of… missions upon which a company could focus: serving customers… providing the highest quality of products and services; creating jobs and job security,” he said. “But I would submit that in our political and economic system, the mission of any business is to create value for its owners.”
Under Goizueta’s leadership, Coca-Cola split its stock and bought back billions of dollars of its shares—a maneuver that became popular throughout corporate America in the 1980s—in an effort to nudge the price higher. He also hectored securities analysts who dared to disagree with his vision for the company. One analyst, from Merrill Lynch, was essentially blackballed for a year after raising questions about Coca-Cola’s foreign currency exposure. Goizueta personally kept close tabs on the company’s share price as well. Constance Hays, who covered Coca-Cola for the New York Times, captured the scene that played out in a conference room in Atlanta, at a time before the gyrations of a company’s stock could be followed on any PC, laptop, or smartphone.
The room, down the hall from the top executives’ offices on the twenty-fifth floor of the Coke tower, held a machine capable of tracking moment-by-moment changes in trading volume, index performances, and specific share prices. A squat-looking little thing, it sat on a table all by itself, powered up nearly all the time.
During the day, at any given hour, people passing by that conference room might see a dark-haired man standing before the table. His back would be to the open door, his hands stuck inside the pockets of his suit jacket. Even from behind, everyone knew who he was: the chairman and chief executive of the Coca-Cola Company. And they knew what he was doing. He was watching the market move. Unaware of the employees passing down the hall behind him, undisturbed by conversations wafting in from other rooms, he was oblivious to everything but the green light coming from the screen.
Roberto Goizueta was hardly the only corporate executive in America so transfixed.
There was a time in the United States when few businesses were owned by anyone outside of their founders (and their families) and perhaps small pools of investors. But that was long ago. Starting in the early nineteenth century, companies in one sector after another began to disperse their shares widely: textiles, railroads, banking and insurance, mining and quarrying, and finally, by the first few decades of the twentieth century, many other types of manufacturing and service operations. As corporate stock found its way into more people’s possession, another development followed: new crops of professional managers were hired and left to determine the day-to-day direction of these ventures. Ownership, now diffuse, became more distant and passive.
Not everyone was pleased by this shift, fearing that it opened up the possibility of widespread abuse by executives and directors who, while not acquiring so much as a single share of their own companies, would put their selfish interests first. “It is… evident that we are dealing not only with distinct but often with opposing groups: ownership on the one side, control on the other—a control which tends to move further and further away from ownership and ultimately to lie in the hands of management itself, a management capable of perpetuating its own position,” attorney Adolf Berle and economist Gardiner Means wrote in their seminal 1932 book, The Modern Corporation and Private Property. “The concentration of economic power separate from ownership has, in fact, created economic empires, and has delivered these empires into the hands of a new form of absolutism, relegating ‘owners’ to the position of those who supply the means whereby the new princes may exercise their power.”
Yet, for all of their concerns, Berle and Means did see one way that management might have a positive impact—even if its agenda diverged from that of shareholders. “Should the corporate leaders… set forth a program comprising fair wages, security to employees, reasonable service to their public, and stabilization of business, all of which would divert a portion of the profits from the owners of passive property, and should the community generally accept such a scheme as a logical and human solution of industrial difficulties, the interests of passive property owners would have to give way,” they wrote. Ideally, Berle and Means continued, corporate management should be attuned to “balancing a variety of claims by various groups in the community and assigning to each a portion of the income stream on the basis of public policy rather than private cupidity.”
One could argue that “balancing a variety of claims” is exactly what happened as the social contract between employer and employee evolved during the twenty-five years after World War II, with rising wages and benefits, dependable job security, and sound businesses built on the prosperity of America’s growing consumer class. It wasn’t a flawless system. Not even close. But many of the country’s biggest corporations found themselves working for “the larger interests of society,” as Berle and Means had imagined they might, even if they did so unevenly and at times begrudgingly.
Then, as the US economy languished and the weaknesses of many American corporations began to be unmasked, the old debate about the proper role of business reignited. One of the first to weigh in was Milton Friedman, the influential University of Chicago economist, who contended in a 1970 essay in the New York Times Magazine that a corporation had but one legitimate “social responsibility”: “to increase its profits so long as it… engages in open and free competition without deception or fraud.” A company that was pursuing anything else, including trying to “take seriously… providing employment,” was in Friedman’s mind doing little more than “preaching pure and unadulterated socialism.” By this reasoning, the obligation of executives placed in charge of a company was straightforward. “The manager is the agent of the individuals who own the corporation,” Friedman wrote. And it was up to him “to conduct the business in accordance with their desires” as shareholders, “which generally will be to make as much money as possible.”
Friedman’s thesis was one part of a broader intellectual movement, which extolled the cool rationality of the marketplace and ascribed to it an almost preternatural ability to sort out a vast range of social problems. By the early 1980s, the Princeton historian Daniel Rodgers has observed, “the proposition that the free play of private interests might better promote maximum social well-being”—more so than would a government program or traditional welfare capitalism—had “moved closer and closer to the default assumption” in American life. A collection of scholars (including Robert Lucas, Richard Posner, Kenneth Scott, Gary Becker, Eugene Fama, Frank Easterbrook, Daniel Fischel, and others) joined Friedman in carrying this thinking deep into the fields of economics and law, while Ronald Reagan and his supply-siders applied it to politics and policy. “You know,” the president said, “there really is something magic about the marketplace when it’s free to operate.”
Among those who understood the publication of Friedman’s article to be a major event—not just for business, but for the whole of American culture—was Michael Jensen, a professor at the University of Rochester’s Graduate School of Management, who in 1976 coauthored his own paper on the relationship between corporate executives and shareholders. The piece was written with his Rochester colleague William Meckling, who, along with Jensen, had been trained in neoclassical economic theory at the University of Chicago. Filled with mathematical equations, it would find its home in an outlet far more obscure than the New York Times—namely, the Journal of Financial Economics. Nonetheless, “Theory of the Firm: Managerial Behavior, Agency Costs, and Ownership Structure” would go on to become the most cited academic business paper of all time.
Like Berle and Means, Jensen and Meckling saw an inherent conflict between managers and shareholders. The manager, they wrote, has a natural “tendency to appropriate perquisites out of the firm’s resources for his own consumption”—in other words, to feather his own nest. He may also be disinclined to search out “new profitable ventures… simply because it requires too much trouble or effort on his part.” Unlike Berle and Means, however, Jensen and Meckling saw nothing redeeming about executives who provided good wages or security to their employees. Just as Friedman had, they characterized corporate leaders as mere agents of the shareholders; their sole function, then, was to maximize shareholder value.
For those who weren’t up to the task, Jensen argued that they should be driven out of their companies through hostile takeovers—a comeuppance for entrenched executives. “The takeover process,” he wrote, “penalizes incompetent or self-serving managers whose actions have lowered the market price of their corporation’s stock.” He praised loading up corporations with debt as a way to bring profligate executives to heel.
A group of buyout artists were the heroes of Jensen’s narrative. As Roberto Goizueta and Jack Welch were remaking their companies from the inside, a gaggle of corporate raiders—Carl Icahn, T. Boone Pickens, Victor Posner, Harold Simmons, and others—were taking aim from the outside, trying to squeeze more value out of what they maintained were inefficient and mismanaged businesses. Even those who were wary of the financiers’ tactics had to concede that the case they were making couldn’t be dismissed out of hand. Peter Drucker, for instance, was no fan of Milton Friedman’s framing of corporate responsibility. “Altogether far too much in society—jobs, careers, communities—depends on the economic fortunes of large enterprises to subordinate them completely to the interests of any one group, including shareholders,” he wrote. But Drucker also recognized that “what made takeovers and buyouts inevitable… was the mediocre performance” of American companies through the 1970s and early ’80s.
“Whatever the reasons or excuses,” Drucker said, “the large US company has not done particularly well on professional management’s watch—whether measured by competitiveness, market standing, or innovative performance. As for financial performance, it has, by and large, not even earned the minimum-acceptable result, a return on equity equal to its cost of capital. The raiders thus performed a needed function. As an old proverb has it, ‘If there are no grave diggers, one needs vultures.’”
They circled hungrily. By one count, nearly half of all major US companies received an unsolicited takeover offer in the eighties, much of this activity greased by new financial instruments such as “junk bonds.” Many of the deals that unfolded during the decade were so-called bust-up takeovers in which diversified companies (many of them assembled during the 1950s and ’60s, when conglomerates were hot) were targeted and then dismembered, with the component pieces sold off.
Some, like Boone Pickens, pointed to Jensen’s work to justify their methods, while executives and boards of directors began to talk and act as if maximizing shareholder value was their legal duty. As a factual matter, this was wrong; the law has never mandated any such thing. But that didn’t stop “agency theory” from permeating the realm of practice. “Under the sway of the new economic orthodoxy,” Harvard’s Rakesh Khurana has written, “any suggestion that the corporation was subordinate to any societal institution other than shareholders was increasingly regarded as soft-minded and suspect.”
For many years, the Business Roundtable held fast to a concept that seemed as if it came straight from the glory days of the Committee for Economic Development. “Corporations are chartered to serve both their shareholders and society as a whole,” the group said in a 1990 statement. “Some argue that only the interests of shareholders should be considered by directors. The thrust of history and law strongly supports the broader view of directors’ responsibility to carefully weigh the interests of all stakeholders.” By the late 1990s, however, the Roundtable would parrot the prevailing sentiment. “The paramount duty of management and boards of directors,” the organization now said, “is to the corporation’s stockholders.… The notion that the board must somehow balance the interests of other stakeholders fundamentally misconstrues the role of directors.”
So much for the “thrust of history and law.”
None of this was advantageous for workers. It wasn’t that the push to maximize shareholder wealth was the central cause of the social contract crumbling. New competitive realities brought by globalization, the decline of organized labor, the reach of technology—all of these were already altering the compact between employer and employee before the cult of the shareholder was established. And yet, with agency theory embraced so fully, each of these trends was amplified. “During the postwar boom, American corporations had taken on the provision of stable careers, health insurance for employees and their dependents, and retirement security,” the University of Michigan’s Gerald Davis has written. “Now corporations increasingly saw employment as an avoidable expense. Creating shareholder value was in tension with creating stable employment.”
Indeed, takeovers were frequently followed by layoffs as new owners expected executives to reduce redundancies and cut out anything—or anyone—that might be deemed waste. Meantime, many top managers didn’t wait for a Carl Icahn to storm the gates or for an institutional investor to initiate a proxy fight and attempt to oust them. They tried to get ahead of things by making their organizations leaner and flatter, hoping that such proactive behavior would help their companies, as well as their own careers, live on. “It is important to distinguish between the causes of layoffs and the CEOs who as agents of change respond to ensure the competitiveness and survival of their companies,” said Northwestern University’s Alfred Rappaport, who is credited with coining the term “shareholder value.” “Spare the messenger.”
Sometimes, the messengers got eliminated regardless. More than a quarter of the country’s biggest corporations—those in the Fortune 500—received tender offers during the eighties. Two-thirds of them were unsolicited, and by the end of the decade a third of the names that had been on the magazine’s list no longer existed as independent entities. A good bit of this turnover was undoubtedly for the better. Economies thrive and grow over time as companies whose products and services become stale and obsolete are unseated by more innovative upstarts—a never-ending cycle of renewal that economist Joseph Schumpeter called “creative destruction.” In this context, “destruction does not mean ‘death’ in the Judeo-Christian tradition, but rather ‘transformation’ in the Hindu tradition,” consultant Richard Foster and management professor Sarah Kaplan have written.
Yet even proponents of this economic churn, which was now happening at a much faster pace than it did during the 1960s and ’70s, could see that such regeneration came with significant social costs—specifically, “the number of people left behind,” to use Foster and Kaplan’s words. By 1990, the number of employees among Fortune 500 companies had dropped to fewer than 12.5 million from nearly 16 million a decade before. When Jack Welch had started out, vaporizing tens of thousands of jobs was still seen as scandalous. Pretty soon, it was common across corporate America—a wholesale realignment of who stood where in the eyes of management. “It used to be that companies had an allegiance to the worker and the country,” said Jim Daughtry, local leader of the International Union of Electrical Workers in Fort Wayne, Indiana. “Today, companies have an allegiance to the shareholder. Period.”
Well, not exactly. Through the 1980s and ’90s, at least one category of employee was making out better than ever: the CEO.
Top executives in America had always done quite well for themselves. In the 1950s, gadflies started to show up at corporate annual meetings to deplore the excesses of executive pay—a scene satirized in the comedy The Solid Gold Cadillac, starring Judy Holliday and Paul Douglas:
THE WOMAN. I’m sorry, I—I’ve never attended a stockholders’ meeting before. Maybe I’d better sit down.
BLESSINGTON. Just as you wish, Madam.
THE WOMAN. Thank you.
BLESSINGTON. Now, there is a motion—
THE WOMAN. On the other hand—it says here that the salary for the chairman of the board next year will be $175,000. Tell me—is that true?
BLESSINGTON. Well—uh—wherever did you get a notion like that, Miss—uh—
THE WOMAN. Mrs. Partridge. It’s on page ninety-six. Right here.
BLESSINGTON. I see. Uh—Mr. Snell, as treasurer, would you care to answer that question?
SNELL. Yes, indeed! Happy to oblige.… The—uh—could I hear the question again, please?
MRS. PARTRIDGE. I don’t want anyone to think I’m nosy, but is it true that the chairman will get $175,000 next year? It seems such a lot of money.
SNELL. Why—Madam. In a company of this size, that is not considered a large salary. Not a large salary at all. I believe that answers the question.
Actually, in 1956, the year that The Solid Gold Cadillac hit the big screen, $175,000 (equal to about $1.5 million now) was considered a large salary, though dozens of executives made more than that—some much more. The highest earner among corporate chiefs in ’56 was Eugene Grace, the chairman of Bethlehem Steel, whose compensation topped $800,000 (more than $7 million in current terms). Harlow Curtice of GM netted more than $695,000. General Electric’s Ralph Cordiner brought home about $260,000, while Kodak’s Thomas Hargrave commanded more than $210,000. Coca-Cola’s president, Bill Robinson, was paid just over $112,000 for his services.
At the midpoint of the range, top executives made about $150,000 a year during the 1950s—the same amount that they’d earned since the 1930s (when adjusted for inflation). Their pay would remain at a similar level through the 1960s and ’70s as well. There were lots of exceptions on both the high end and the low end, but overall, executive compensation held remarkably steady for five straight decades. Then, in a flash, everything changed. Median CEO compensation climbed more than 50 percent during the 1980s, and it more than doubled from there during the 1990s. By 2005, it would more than double again, so that a large-company CEO in the middle of the pack on pay was now making more than $9 million a year.
A big reason for this surge was agency theory. As companies sought to align the interests of managers and shareholders, they began to issue ever more stock options to top executives. Granting stock to CEOs was prevalent through the 1950s and ’60s, but it wasn’t a major part of their remuneration; salary and cash bonuses accounted for about 90 percent of total executive compensation through those years. Then, in the 1980s, stock awards became far more prominent. In 1993, Congress changed the tax code so as to further stimulate this trend. By 2001, shares of stock would account for 85 percent of CEO pay, by some calculations. Compensating top executives this way made exquisite sense if you bought into how Jensen and Meckling said the world should work, which by now nearly every corporate board of directors, executive team, and business consultant had accepted. The idea was shorthanded as “pay for performance”: if a company performed well for its shareholders, the CEO would rightfully get his piece.
But what looked so cogent on paper turned out to be problematic when implemented. To begin with, the expansion of stock-based pay coincided with the longest bull market in American history—a 1,500 percent jump in the Dow Jones Industrial Average from August 1982 to January 2000. “Despite the fact that the rising stock market was making options far more lucrative than boards likely anticipated when they issued them, directors couldn’t seem to break themselves of the habit of giving options to CEOs,” law professor Michael Dorff has written. “To the contrary, they kept issuing more.”
In turn, executives were incentivized to behave in perverse ways. Many stinted on investments in research and development, as well as on new factories and equipment, in order to hold down expenses. Such choices, along with creative accounting and other machinations, had the desired effect: they inflated current profits, which goosed the company’s share price—for a little while anyway—and allowed the CEO to make off with millions. Presumably, the CEO also stood to lose a bundle if he didn’t perform. But the rules and norms governing executive pay proved far less reliable than the law of gravity: what went up didn’t come down, at least not nearly to the same extent. CEOs “can make huge amounts of money,” the compensation watchdog Graef Crystal complained in 1991, “but it is hard for them to lose much money.… No matter how many times I have touted them, negative bonuses—the kind where the CEO writes a check to the company—have just never caught on.”
Of course, none of this created much value for stockholders, at least over the long haul. “Improving real-market performance is the hardest and slowest way to increase expectations” among investors and lift a company’s shares, the University of Toronto’s Roger Martin has explained. “The company has to build facilities, hire employees… and wait for these all to convert to real sales and profits increases.” It’s much easier “to hype your stock on Wall Street by providing aggressively high guidance on the company’s projected earnings” and then hit those numbers, down to the penny, by driving short-term results. In such an atmosphere, according to Martin, “companies focus more on their stock market analysts than on their customers,” and “employees feel ever less loyalty to their company, knowing that their company has precious little loyalty toward them.”
Roberto Goizueta wasn’t unmindful that maximizing shareholder value was being “vilified by many critics,” as he put it, especially after “huge layoffs at certain companies.” But none of that fazed him, as he dressed his preoccupation with Coca-Cola’s share price in a cloak of morality. Only by putting its stockholders first, last, and all of the time, he said, could the company hope to “contribute to society in meaningful ways.” Investment firms handling “the retirement funds and savings of teachers, public employees, and other citizens,” Goizueta pointed out, owned nearly 40 percent of Coke’s shares. The company’s employees owned another 20 percent or so, and some had become millionaires because of their Coke holdings.
Others rode the market wave, as well. When Jack Welch became CEO at GE, about 500 executives were awarded stock in the company every year. But Welch opened the spigot, so that by the time he retired, about 15,000 employees, or 5 percent of GE’s workforce, were awarded options worth more than $2 billion. “What a kick!” Welch would later remember. “Every Friday, I got a printout listing all the employees who exercised stock options and the size of their gains. The options were changing their lives, helping them put their kids through college, take care of elderly parents, or buy second homes.”
At Coca-Cola, $100 worth of stock bought the day Goizueta became CEO was worth nearly a thousand dollars (with dividends reinvested) by his tenth anniversary in the job. The market now valued the company at $35 billion, up from $4 billion when he took over.
Nobody, though, would fare as well as Goizueta himself. Over his seventeen years as CEO of Coca-Cola, he would receive more than $1 billion in compensation—an extraordinary payout for someone who wasn’t the founder of a company but, rather, merely hired help. After Goizueta was gone and Coca-Cola slumped, some business experts would question whether he had really built the company for sustained success, though he always stressed the need to think years and years into the future, censuring leaders who slashed and burned their companies or played games to manipulate the stock. “Focusing on creating value over the long term keeps us from acting shortsighted,” he said. There was also a flap in the media when it was discovered that Coke’s board in 1991 had given Goizueta 1 million stock options, worth more than $80 million, and then obscured the fact in the company’s financial filings. Instead of spelling out “1,000,000 shares” in the part of its proxy statement that listed the CEO’s salary and bonus, the company wrote out “one million shares” and buried it in a thick paragraph three pages later. More condemnation from good-governance advocates came when it was revealed that Coca-Cola had agreed to fork over more than $100 million to cover Goizueta’s entire federal and state tax bill.
Yet, despite the outcry, one group didn’t seem to care much at all: Coke’s shareholders. As long as Goizueta was making them rich—and he was—they cheered him on. Before Goizueta’s arrival, Coke’s stock had been stagnant for twenty years. “They could have put their money in a drawer and done as well,” said Donald Keough, Coke’s president. Now, the stock was soaring—and so was their sense of gratitude. A group called “the Coca-Cola widows,” which was made up of Georgia women whose working-class husbands had bought a few Coca-Cola shares and held on to them until they became worth millions, thought of Roberto Goizueta as their savior. “I never sold a single share,” one elderly woman said, “and I just wanted to tell you Coca-Cola has put my six children and seventeen grandchildren through college.” At the annual shareholder meeting in 1992, Goizueta worried that he might be confronted about his exorbitant pay. Instead, those in attendance interrupted his remarks four times with their applause.
Coca-Cola was, in this way, a microcosm of America as a whole. The generation of executives that had entered the workplace in the 1950s and ’60s was steeped in an ethic of commitment and community that arose out of the shared struggle and sacrifice of the Depression and World War II. “The old giving/getting compact,” the great pollster and trend-spotter Dan Yankelovich wrote in his 1981 book New Rules, “might be paraphrased this way: ‘I give hard work, loyalty and steadfastness. I swallow my frustrations and suppress my impulse to do what I would enjoy, and do what is expected of me instead. I do not put myself first. I put the needs of others ahead of my own. I give a lot, but what I get in return is worth it. I receive an ever-growing standard of living, and a family life with a devoted spouse and decent kids.… I have a nice home, a good job, the respect of my friends and neighbors; a sense of accomplishment at having made something of my life.’” Through the eighties and nineties, however, such attitudes all but disappeared.
The “do your own thing” spirit of the 1960s and ’70s counterculture had gone mainstream and taken on a harder edge—mutating into a “culture of narcissism,” to use the title of Christopher Lasch’s 1979 bestseller. “Tens of millions of Americans have grown wary of demands for further sacrifices they believe may no longer be warranted,” Yankelovich reported, based on a plethora of social-science research. “They want to modify the giving/getting compact in every one of its dimensions—family life, career, leisure, the meaning of success, relationships with other people, and relations with themselves.” Inside many companies, the refashioning of the compact came down to this: managers were thirsting to get more and give less; they began to think about themselves rather than the larger group. “A crude individualism reasserted itself; that is, the largely mythical, nostalgic, and debilitating view that in America, people pulled themselves up by their own endeavors, acting heroically and alone, as Jay Gatsby had in F. Scott Fitzgerald’s 1925 novel The Great Gatsby or as Howard Roark did in Ayn Rand’s 1943 novel The Fountainhead,” the journalist Louis Uchitelle has written.
Among CEOs, this propensity was particularly pronounced. “The logic is, ‘What’s good for me is good for everybody,’” Yankelovich said. In point of fact, it’s difficult to see how what’s been good for CEOs has been good for almost anybody else, especially the 90-plus percent of employees who don’t hold any stock options. Until the 1970s, the pay of the average worker was going up faster than that of top executives. By the 1980s, however, CEOs were galloping ahead while almost everyone else was standing still or falling behind. And the gap was getting exponentially wider. In 1965, CEOs at big companies made 20 times what the average worker did. By 2000, they’d make 376 times more. Never mind that it’s exceedingly debatable how much of a company’s performance is due to the talent of any single individual—even the CEO—as opposed to the collective efforts of the entire enterprise. Stock options are also rarely indexed to the market or the industry as a whole, so executives often benefit from a general rise in economic conditions, not from anything they’ve done personally. Some refer to this as “lucky dollars.”
As the years rolled on, many rank-and-file workers were to discover that they too were becoming more dependent on the stock market—or, more precisely, their retirement security was. They, however, weren’t nearly so lucky.
America’s pension revolution began more or less by accident, when Rep. Barber Conable, a New York Republican, tried to help out some people back in his home district: the employees of Eastman Kodak.
In 1958, Kodak had started giving its workers a choice: they could take their wage dividend as they always had—as a cash bonus—or they could have the company add it to their pension plan. One advantage of the latter was that they wouldn’t have to pay any taxes on the money until it was tapped during retirement. Through the 1970s, however, the Treasury Department kept eyeing that pool of protected income and making noise about taxing it as a way to raise revenue. Conable, a senior member of the House Ways and Means Committee, felt as if the bureaucrats at Treasury—“running dogs,” he called them—were trying unfairly to change the rules midstream. And so he looked to shut them down by adding new language to Section 401 of the Internal Revenue Code. Thanks to subsection (k), Treasury would no longer be able to go after profit sharing that had been placed into a retirement account.
When the Revenue Act of 1978 passed, Conable’s contribution to the legislation was considered largely inconsequential—a neat way to assist the men and women of Kodak, as well as others who participated in profit-sharing plans. “There was absolutely no discussion in ’78 that if you do this, the world is going to change,” said Daniel Halperin, a Treasury official at the time. Conable himself had all but forgotten that he’d even authored that part of the law when in early 2000 he was shown a copy of Pensions & Investments magazine, which had just named him as one of its top picks for “person of the century.” Long retired from Congress, Conable was at a board meeting of the insurer American International Group when someone pulled out a copy of the trade publication, which had paid tribute to Conable alongside Andrew Carnegie, Warren Buffett, Benjamin Graham, and a handful of others who’d left an everlasting mark on finance and investing. “Everybody was saying, ‘Oh, Barber, you did such a wonderful thing!’” he later recounted. “I said, ‘I don’t think I did.’” Finally, he phoned the Ways and Means staff, and someone there confirmed it: Conable had most certainly been responsible for the 401(k)—and, lo and behold, it had changed the world.
In the two decades between Conable’s seemingly modest act of constituent service and his coronation by Pensions & Investments, the 401(k) had gone from a narrowly conceived instrument to promote profit sharing to the most popular pension vehicle in the country. Yet the 401(k) wasn’t only widely used; it was also widely criticized. The principal reason: it allowed companies to transfer the risk for retirement savings from their own books onto the shoulders of their employees.
Even in its heyday, the private pension system in the United States was far from perfect. Although it had expanded rapidly through the 1950s and continued to grow during the ’60s, due mainly to bargaining-table victories by organized labor, employer-sponsored retirement coverage never reached more than about half of the private-sector workforce. Those employed by small businesses usually had no pension plan at all, except for Social Security. Some workers who thought they were covered—perhaps as many as a third of them—wouldn’t actually wind up receiving any benefits when they retired because they hadn’t met the vesting requirements. Others objected that their company’s pension payouts were so miserly that they couldn’t possibly keep up with inflation. “It must be evident to you that due to soaring medical expenses, spiraling costs of living, food prices, taxes, etc., your average GE retiree is sinking deeper and deeper into debt,” Alfred Articolo, who’d been retired from the company for a dozen years, wrote to General Electric’s HR department—one of hundreds of protests made over decades. “Many have exhausted their life savings and don’t know where to turn.” Still others watched their pensions vanish altogether when their employer hit the skids, as Studebaker had. “There is this myth that we once had a wonderful retirement system,” said Ted Benna, who sold pension plans to companies. “That’s just a bunch of nonsense.”
Nevertheless, even with all of these shortcomings, for most of the 25 million or so workers who had traditional retirement benefits through the mid-1970s, it was an excellent deal—in many regards, the fullest expression of how the corporate social contract in America was built on an expectation of lifelong loyalty between employer and employee. Before the 401(k) took off, pensions at most large businesses consisted of “defined-benefit plans,” which gave workers a set annual income upon retirement for the rest of their days, typically equal to a percentage of their average salary multiplied by their number of years of service to the company.
With the 401(k), which is a type of “defined-contribution plan,” it was now up to the employees to decide what portion of their salary they wanted to sock away for retirement. The majority of employers matched at least some of what their workers put in. But there was no dependable return. If you saved enough in your 401(k) account and that money was invested wisely, you could do just fine. But if you didn’t save adequately or couldn’t afford to, or if you invested poorly or were the victim of bad timing—so that you retired right when the stock market happened to tank—you were sunk. The conventional wisdom was that “all you have to do is contribute to your 401(k), keep it there, and invest in stocks. You’re going to strike it rich,” cautioned Karen Ferguson, director of the Pension Rights Center, a Washington-based nonprofit whose goal is to make sure that Americans have enough money to live on when they are too old to work. “The reality is that you may strike it rich, but the odds are excellent that you will not.” It didn’t take long before the 401(k) was the punch line of a joke among businesspeople: “What begins with an F and ends in K and means screw your employees?”
It wasn’t supposed to be like this. Originally, the 401(k) was designed to help workers add to their retirement savings, but they’d still rely on their defined-benefit plan and Social Security for the bulk of their pension income. “It was to be a three-legged stool,” said Benna, who was a pioneer in creating 401(k) plans. Yet corporate America had other ideas. By 1985, workers in defined-contribution plans outnumbered those with guaranteed pensions, and over the next decade the trend greatly accelerated. Hardly any new companies put defined-benefit plans into place anymore, choosing to offer only 401(k)s instead. Eventually, it became routine for older corporations to freeze their long-established retirement funds rather than keep building them up. And many—including Kodak, Coca-Cola, General Electric, and General Motors—would in time just replace them with defined-contribution plans or other alternatives that were cheaper for the corporation. The three-legged stool was now teetering on two.
Several factors spurred the headlong rush by businesses into 401(k)s. To start with, they contained some features that were genuinely well-liked by workers, including the ability to easily move the money that had accumulated in these accounts whenever someone switched companies. In an era when fewer and fewer people could count on anything approaching “lifetime employment” anywhere, many viewed such portability as a plus. By contrast, defined-benefit plans were often back-loaded so that some of the largest pension gains came in an employee’s last years—an arrangement that incentivized people to stay with the same company all the way through the end of their career and thus bring continuity to an employer’s payroll. By the mid-1980s, when corporate downsizing had become commonplace, a framework that rewarded such loyalty no longer seemed very relevant.
Companies had other motivations, as well, for substituting their old pensions with 401(k)s. In 1974, President Ford signed the Employee Retirement Income Security Act, which was intended to prop up the country’s private pension system. Among other things, ERISA established a new agency—the Pension Benefit Guaranty Corporation—to step in and pay retirees at least a portion of their plans when their employer faltered and could no longer meet its obligations. With the PBGC in place, the thinking went, there would be no more disasters like the one that had struck thousands of workers at Studebaker. For some people, this safety net would surely make all the difference. As the years went on, the PBGC would assume responsibility for the current and future pensions of about 1.5 million Americans. Yet ERISA also had unintended consequences. Businesses with defined-benefit pensions now had to pay an insurance premium to the PBGC, and they faced new funding requirements and other regulatory burdens. Companies that offered only 401(k)s escaped this hassle and expense. And so more and more elected to do just that.
Besides costing companies a third to half as much as defined-benefit plans, 401(k)s were attractive to top executives for another reason: using them to supplant traditional pensions became another means to enrich themselves. In 1987, a new accounting standard went into effect requiring companies to report pension liabilities on their balance sheet. The rule, FAS 87, was meant to increase transparency for investors. But as a byproduct, companies found that if they cut workers’ old pension benefits, they would get to record a paper gain and fatten the bottom line. “Unfortunately for employees and retirees,” the investigative reporter Ellen Schultz has written, “these newfound tricks coincided” with tying executive compensation to short-term financial results. “Thus, deliberately or not, the executives who green-lighted massive retiree cuts were indirectly boosting their own pay.” By the late 1990s, hundreds of companies had effectively transferred large amounts of wealth from frontline workers to the CEO via FAS 87.
All in all, the advent of the 401(k) would turn into a big mess for most American workers. While private pension coverage had never been very complete, it didn’t get any better as defined-benefit plans died out and 401(k)s proliferated; about half of all US workers were still left with no retirement cushion outside Social Security. As for the families that did have 401(k)s, the numerous pitfalls in these plans would leave most of them with the tiniest of nest eggs—less than $400 a month, on average. The societal implications of this deficiency were enormous: from the end of World War II until the 1980s, the share of older Americans in the labor force had fallen every year. But by the mid-1990s, in large part because of the paucity of income provided by people’s 401(k)s, that decline reversed itself. For many, their retirement dreams were now broken. They had no choice but to keep working.
And still, for all of its egregious faults, the 401(k) was very much a product of its time, in no small measure because it “comports so well with American cultural norms about private property and individual ownership,” as law professor Edward Zelinsky has written. The big unions that in the 1950s had fought to set up defined-benefit plans for their members had a very different sensibility—one of group solidarity. “Unions cover workers who would not otherwise choose or be able to save, such as garment workers, coal miners, and construction workers,” labor economist Teresa Ghilarducci has said. “They would never have the kind of pensions they do if it weren’t for unions. They need, like most workers, a collective solution.” By the 1980s, collective solutions were fast falling out of favor across much of the country. 401(k)s, which Ghilarducci and others have called “do-it-yourself pensions,” were well suited for a nation that now prized self-reliance over other values.
The biggest difference between Jack Welch’s revamping of General Electric and what Roberto Goizueta was doing at Coca-Cola was that the body count was so much lower at Coke. Goizueta “has done it without the scars of heavy-handed restructuring evident at many other big corporations,” Fortune gushed in 1987. “There have been no layoffs.”
Yet while Goizueta was not anything close to a “Neutron Roberto,” his tenure atop Coke wasn’t as bloodless as it may have seemed. He fired dozens of executives who couldn’t meet their numbers—though, like Welch, he never did the deed personally. And he wound up pushing out thousands of others, albeit more indirectly, as his quest to maximize shareholder value filtered down to where most of the people in the Coca-Cola system actually worked: at the bottling plants.
Goizueta knew that if Coca-Cola was going to succeed to the degree he wanted, he needed to improve the operations at many of the more than 150 independent Coke bottling companies spread across the United States. At first, his approach was to have the parent company buy up the weakest bottlers, turn them around, and then sell them back to stronger members of the system; he called this “refranchising.” A small portfolio of corporate-owned bottlers was to be kept to a minimum. But in 1986, Goizueta was presented with opportunities to purchase two giant bottling franchises, one in the South and one in the West, for about $2.5 billion. He jumped at them.
These weren’t fixer-uppers but well-run businesses that Coke wasn’t about to quickly resell. The problem was that Goizueta now needed to clean up Coca-Cola’s books, which all of a sudden were overloaded with debt. And so he turned to his chief financial officer, Doug Ivester, for the answer. Through his alchemy, Coke would spin off its now big bottling arm into a new publicly traded company, Coca-Cola Enterprises, and retain 49 percent ownership. Coke’s stake in CCE was large enough to ensure that the main company was still really in charge. But because it technically held a minority position, Coke could now dump billions of dollars in debt from its balance sheet, as well as have CCE absorb its old plants and equipment—depreciating assets that didn’t fit the picture Goizueta was trying to paint to Wall Street. Coke would sell lots of soda concentrate at its desired price to CCE, and CCE would send dividends back to the corporation. For Goizueta, Ivester’s “49 percent solution” was a money machine. “Coca-Cola has margins that would make a cocaine dealer blush,” was how Jack Bergstrand, a then-young manager who would rise up the ranks, liked to describe it.
But things weren’t so pleasant for many of those inside the factories where Coke was bottled and packaged and from which red and white trucks loaded with the stuff fanned out along routes that had been worked for generations by salesmen who knew the names of every shop owner in town, and the names of their kids, too. For them, being swept under the CCE umbrella was painful—the culmination of a decades-long shakeout.
In the late 1950s, more than 1,000 Coca-Cola bottlers were scattered across America, many of them family-owned businesses employing anywhere from several dozen to a few thousand people. Over the next twenty-five years, as competition from Pepsi intensified and the profile of customers changed from mom-and-pop groceries to supermarket chains and big-box retailers, the number of bottlers would shrink by more than 80 percent. But even so, many of the survivors were still being managed as they’d always been—without the sophistication needed to make it in this new, less forgiving world. “The business was moving fast and moving beyond some of these guys,” said Bill Casey, a former senior executive at Coca-Cola who ran several bottling operations during his thirty-two-year career at the company.
Jack Bergstrand was among those brought in to help rectify the situation. Coca-Cola had hired him in 1979, right after he got his master’s in advertising from Michigan State. But instead of being given a marketing job in Atlanta, he was sent off to Boston to be an operations analyst, and soon a manager, at one of the few bottling plants that the company then owned. With his advanced degree, Bergstrand stood out; most of those at the factory hadn’t gone to college. “I didn’t meet anybody else who was in management in the bottling business who hadn’t started on a truck and worked their way up,” Bergstrand said. “They’d put in a lot of time slinging cases of Coke. It was freezing cold in the winter and hot and windy in the summer. And then here comes this twenty-one-year-old kid into management. It wasn’t something that people were used to.”
Bergstrand was sensitive to the misgivings of those around him, having grown up in a blue-collar home in northwest Illinois, the son of a tool and die maker at Ametek Incorporated, a manufacturer of electronic components. He’d always remember the union newspaper that his dad had lying around, with the black circle and defiant message on the front page: “Blow on this dot. If it turns blue, company promises may come true.” Bergstrand did his best to blend in at Coke, playing blackjack and drinking beers with the guys at the pubs around Needham and Worcester. “I’d spend about ten or eleven hours working and another five hours socializing,” Bergstrand said. “I tried to endear myself to the old regime.” He won over almost everyone—though, because of his position, he had things that they didn’t, like a company car. And they had things that he didn’t, like calluses on their hands. But that would change. For a while, Bergstrand drove a sales route, learning the business from the ground up, and eventually he lugged enough cases of soda to toughen up his palms.
After a series of management jobs, Bergstrand was pulled in to help form CCE. From an operating standpoint, “CCE was brilliant,” he said. “Now, overnight, you’ve got this massive publicly traded company that does nothing but bottling. You have all this synergy.” Processes were made consistent from plant to plant; KBIs, or key business indicators, replaced backslaps and shots at the local tavern as the basis of decision making. “We were running off information instead of instinct and personality,” said Ted Highberger, another longtime Coca-Cola bottling executive who went to work at CCE. “Everything got upgraded.”
Yet for those who had come up under the old system, this blast of “professionalization” was jarring—and not everyone made it through. “Some people wound up with better jobs than they’d had because the size of the company afforded greater opportunities,” Bergstrand said. “But other people got cut out completely.”
As CCE grew through a string of acquisitions, the company found new efficiencies by combining different aspects of the bottling business—from manufacturing to IT to sales to warehousing to distribution. “One of the purposes of bringing these companies together was to gain some economies of scale,” said Jean Michel Bock, the assistant to the president of CCE. “There were a lot of functions which were redundant.”
The man behind much of the consolidation was an executive named Larry Smith, who had been recruited to CCE from Pepsi. Smith was actually the one who had thought up the blind taste tests that became the Pepsi Challenge, making his presence at CCE difficult for many to bear. “He was reviled, partly because he came from Pepsi,” Highberger said. But that wasn’t the only reason. Smith was also “totally numbers-oriented. He couldn’t give a shit about people.” Said Bergstrand: “He was like the anti-Christ to the old Coke system”—notorious for supposedly telling employees, “I’ll laugh with you. I’ll joke with you. I’ll fire you.” Prior to CCE, Bergstrand recalled, “I never really heard a lot about head count” at the bottlers. “It was much more of a family orientation. It became much more of a business orientation.”
One of those who felt the sting was Bill Maiocco. His father, George, had run Coca-Cola bottling plants in Boston and Baltimore. Bill began to work full-time at the company in 1958 at age seventeen, fresh out of high school. His first job was making the Coca-Cola signs—“Delicious and Refreshing”—that merchants would hang in their stores. Soon, Maiocco got a sales route out of Boston, and then out of the plant in suburban Braintree, donning his orange uniform with green stripes as he filled vending machines at offices and college dorms.
By the mid-1960s, Maiocco had become a sales manager, often putting in eleven- or twelve-hour days, topped off by a Red Sox game or a boxing match with some of his best customers. He would continue to advance over the next two decades until he was made a vice president of Coca-Cola New England, supervising a hundred people in sales and clearing a six-figure income between his salary and bonus. “I loved it,” Maiocco said. “I had a good job, and I thought I did a good job.”
But once Coke New England was folded into CCE, Maiocco was in for a fall. First, the new leadership took away his vice president title. And Maiocco found himself being bossed around by a new top manager who had worked previously at United Parcel Service—a background that Maiocco, who bled Coca-Cola, just couldn’t understand. “The guy had never even seen a Coke bottle in his life,” Maiocco said. “Why would you hire a person to run your business who has never been in the business?”
For several years, Maiocco kept most of the same responsibilities that he’d had before CCE. But then, in 1994, the company brought in a younger manager from Europe. “He had a doctorate in something,” as Maiocco remembered it. Maiocco trained him—and then he replaced Maiocco. “He was with me for three months, side by side,” Maiocco said. “The next thing I know, I’m being demoted, and he’s being promoted into my job. I had no inkling whatsoever.”
Maiocco was offered a lower-level job as an account manager. He decided to retire. He was fifty-five and would live out the next twenty years comfortably, though not lavishly, by drawing down his 401(k). He’d have regrets from time to time, wondering if he’d retired too soon, particularly when the stock market wasn’t doing very well. But Maiocco had seen enough to know that CCE didn’t much care for the opinions of those who’d risen into management from behind the wheel of a delivery truck. “We weren’t appreciated,” he said. “Why they thought we were lousy, I have no idea.”
Jack Bergstrand had an idea. Executives like Maiocco—hardworking, personable, loyal—were exactly what was needed when there were scores of independent bottlers, each managed according to its own idiosyncrasies. But CCE was all about standardization and rationalization. Maiocco wasn’t so good with that. “I’m fine with you centralizing—as long as I can do whatever I want,” he once told Bergstrand, apparently only half in jest. If CCE hadn’t adjusted, if it hadn’t brought in all of those MBAs and others with the skills to make the system more responsive to the changing demands of the market, most of those bottlers never would have lasted anyway. “The model was not sustainable,” said Bergstrand, who in the late 1990s would become the chief information officer of Coca-Cola.
Yet Bergstrand also realized that something was now missing—a way of life in which someone with only a high-school diploma could go from blue-collar employee to manager to vice president. Even making soda had become knowledge work. “There’s a bit of Bill Maiocco that’s been lost in every company,” Bergstrand said, and that would always make him sad.
In September 1997, Roberto Goizueta checked into Emory University Hospital, pale and fatigued. In no time, doctors found a malignant tumor on one of his lungs. Hardly anyone who knew him was shocked. Goizueta had smoked cigarettes since he was a schoolboy in Cuba—sometimes as many as three packs a day.
After sixteen years as the CEO of Coca-Cola, he already had been contemplating who would succeed him. Doug Ivester, who had been promoted from CFO to chief operating officer, was Goizueta’s heir apparent. Still, the handoff would come much sooner than anyone was really ready for.
Goizueta, then sixty-five, would live only about six weeks after his diagnosis. “He loved this company and the associates because of what we were able to accomplish together,” Ivester told employees in a recorded message sent around the world on the morning that Goizueta died. For Goizueta, that list of accomplishments was no doubt led by this: Coke’s stock was now valued at about $180 billion—an increase of 3,500 percent since 1981.
He was true to himself to the very end. While he was hospitalized at Emory, in the same penthouse suite where Robert Woodruff had breathed his last breath, Goizueta would regularly check the machine that had been hooked up in the adjoining room. There, he could see Coke’s share price pulsate.