Chapter 8

Corporate America, 1945–1980

World War II left the nations of Europe devastated. Major cities, including Berlin, Dresden, London, and Vienna, endured massive aerial bombing. The war displaced millions of people from their homes; transportation infrastructure—including roads, bridges, and railroads—was damaged or destroyed. An estimated 36.5 million Europeans, including Russians, died between 1939 and 1945 from war-related causes. Japan was in ruins following the firebombing of Tokyo and the atomic bombing of Hiroshima and Nagasaki. The two atomic bombs alone are estimated to have killed or wounded more than 200,000 people.

By contrast, the United States suffered 418,000 war-related deaths, and its infrastructure emerged unscathed; in fact, it was stronger than it had been before. Its large corporations dominated global markets, its superior military power was unquestioned, and its capacity for technological innovation was unsurpassed.

Indeed, the decades immediately after the war were a golden age for American business. Manufacturers in the United States—located in cities that had escaped aerial bombing—became world leaders in oil, rubber, chemicals, pharmaceuticals, electric equipment, mass-produced machinery, appliances, automobiles, metals, processed foods, drink, and tobacco. In the 1950s, a person seeking to buy a television could choose from more than ten different American brands, including Admiral, Calbest, DuMont, Emerson, Pioneer, and the Radio Corporation of America (RCA). Construction companies also prospered thanks to new growth in the housing market, which had been dormant during the Great Depression and the war. More than a million new housing units were built annually from 1946 to 1955. In a testament to this period of prosperity, the US economy grew by an average of 3.8 percent per year from 1946 to 1973.

At the helm of this booming economy were the business executives who had played an essential role in mobilizing American industry during the war. Harley Earl, who had established a wartime camouflage research and training division at GM, became head of GM’s car styling team and introduced the wraparound windshield, two-tone paint, tail fins, and many other iconic design features of the low and long cars of the 1950s and 1960s. After serving on the War Production Board, Ralph Cordiner returned to General Electric, serving as president from 1950 to 1958 and as chief executive officer (CEO) from 1958 to 1963, moving the company further into airplane engine building.

Organization men and women

General Electric, DuPont, GM, AT&T, and Westinghouse all extended their research and development activities by building specialized research centers that had remarkable modern glass-and-metal architecture. The Finnish American architect Eero Saarinen designed the GM Technical Center in Warren, Michigan (1956), the IBM Thomas J. Watson Research Center in Yorktown Heights, New York (1961), and Bell Laboratories in Holmdel Township, New Jersey (1962). The federal government also continued to sponsor corporate research, much of it driven by new defense projects for the Cold War. Indeed, the decades after World War II saw the rise of the so-called military–industrial complex, a phrase first articulated by President Eisenhower in his farewell address in 1961 in reference to the growing segment of the economy with a vested economic interest in waging war. In that speech, Eisenhower underscored the importance of America’s capacity for military production, yet warned about the potential for the misuse of the immense power that it afforded.

Ironically, during this period of innovation the leadership of large American companies was defined by homogeneity, conformity, and a high degree of corporate loyalty—traits described in sociologist William H. Whyte’s The Organization Man (1956). Many managers and senior executives worked at the same company, or at least in the same industry, for their entire careers. Whyte’s book portrayed a safe “pipeline” running from colleges to recruiting offices and, eventually, to quiet suburban communities. In the 1950s, one college senior quoted in The Organization Man admitted, “I don’t think AT&T is very exciting. But that’s the company I’d like to join. If a depression comes there will always be an AT&T.”

At the largest firms, senior business leaders were almost uniformly white, male, and Protestant, and the position of “executive” became something of an American archetype. The editors of Fortune produced The Executive Life (1956) with chapters on “Who are the Executives?,” “How Hard Do Executives Work?,” “How Executives Crack Up,” and “How to Retire Executives.”

A remarkable travel service from this period was United Airlines’ men-only “Chicago Executive” flights, which operated from 1953 to 1970 between Chicago and New York and between Los Angeles and San Francisco. Print advertisements for the Chicago route described the exclusive offering as a “club in the sky” on the Douglas DC-6. Cigar and pipe smoking was permitted, and customers were served cocktails along with a steak dinner.

The postwar business norms of the 1950s encouraged a new, distinct culture and set of values. Those hoping to climb the corporate ladder no longer turned, as Thomas Mellon had, to Benjamin Franklin’s The Way to Wealth and its philosophy of dedication, perseverance, and thrift. Now, they looked to the advice of Dale Carnegie’s How to Win Friends and Influence People (1936), which advocated smiling, getting along, and ingratiating oneself with colleagues and clients as the key to success. Young executives, according to this new thinking, needed “people skills” to navigate corporate bureaucracies. Carnegie’s book sold millions of copies in more than thirty languages.

Large businesses grew, overall, increasingly homogeneous in their management even as most firms proclaimed, without irony, that they advocated meritocracy in their hiring. Corporate leaders often summed up their firms’ values with phrases such as “free competition” and “equality of opportunity,” while unhesitatingly dismissing the merit of entire classes of people, most notably African Americans and women.

Rosabeth Moss Kanter’s 1977 book, Men and Women of the Corporation, described the consequences of “organization man” homogeneity for talented women. Well into the 1960s and 1970s, women remained unable to break out of gender-segregated roles, mostly clerical and secretarial jobs, to which they were relegated in the 1950s. Those few who did achieve managerial posts faced the problem of tokenism: When they failed, they were seen as representing all women; when they succeeded, they were viewed as exceptions or as possessing masculine characteristics.

In response to these and other obstacles to equal opportunity in the workforce, activists and civil rights leaders began to organize and agitate for fair employment practices. Progress came only intermittently. Indeed, 1964 saw both passage of a major Civil Rights Act (which banned discrimination based on race, color, religion, sex, or national origin and ended racial segregation) and riots in Rochester, New York (the home of Xerox and Eastman Kodak) protesting unemployment among African Americans in that city and discriminatory housing shortages.

Another important development came in the 1970s, when the Equal Employment Opportunity Commission (founded in 1965) gained more power to bring complaints against companies that failed to support and promote diversity. In 1973, a group of female and African American male employees of AT&T tested this new power and won a multi-million-dollar settlement in their suit over discrimination in the workplace. A Business Week report from 1975 called the AT&T case “the single strongest influence on corporate employment practices regarding women.” For many managers and executives, fear of penalties gradually gave way to a recognition of the intrinsic benefits of employee diversity and the conviction that “fair employment is good business.” Still, it was not until the very end of the twentieth century that women started to break into positions of leadership at major firms. In 1999, Carly Fiorina became head of Hewlett–Packard. In 2001, Anne Mulcahy became CEO of the Xerox Corporation; she was succeeded in that position by Ursula Burns, the first African American woman to become head of a Fortune 500 corporation. In 2006, Indra Nooyi was appointed CEO of PepsiCo, and in 2011 Margaret “Meg” Cushing Whitman became head of Hewlett–Packard. But even by 2014, although women accounted for 45 percent of employees in S&P 500 companies, they held only 4.6 percent of CEO positions.

American manufacturing

In 1955, Fortune published its first list of the 500 largest companies in the United States, revealing the enormous scale and scope of the American economy. Businesses on the list fell into three broad groups. The first was large manufacturers in industries that made established or “stable technology” products (automobiles, oil, rubber, metals, and other nonelectric machinery). Such businesses typically competed with one another by making minor improvements in their product or in their processes of distribution or production. The second group was manufacturers in “high-tech” industries (such as aviation and chemicals), who competed through the commercialization of new research and development. The final category was manufacturers in “low-tech” goods (including clothes and food), where innovations usually occurred in marketing, branding, and distribution.

Among the firms clustered in established manufacturing industries, GM topped the list with annual revenues of $9.8 billion and a staggering 624,011 employees worldwide. General Motors was the largest company in the country’s largest industry. By mid-century, the United States produced 75 percent of the total world automobile output, with Great Britain, the second-largest producer, contributing only about 10 percent.

Ford, under Henry Ford II (grandson of Henry Ford and CEO of the company from 1945 to 1979), rebounded from the difficulties the company had faced in the 1930s—largely by hiring away GM executives and creating a multidivisional structure like that of its rival. With its new approach, Ford outpaced Chrysler, the third of the “Big Three” companies. The three car companies competed with one another by introducing annual models with an array of stylistic changes, setting high sales quotas for franchisees, and undertaking advertising campaigns in print and on radio and television.

The cars of the 1950s and 1960s tended to be more powerful, wider, and longer than their predecessors. They emphasized style and an abundance of chrome. The 1955 Chevrolet Bel Air (first introduced in 1950) had a V8 engine and featured chrome fenders and chrome spears on the hood. It also had GM’s Powerglide automatic transmission. The Cadillac Eldorado was an extreme example of exuberant styling with, in the late 1950s, large vertical tail fins.

Despite new tools for consumer research (including studies by industrial psychologists), carmakers occasionally missed the mark. Ford launched a media blitz to sell the cars of its new Edsel division in the late 1950s. The company spent ten years and over $250 million on the car’s development and hoped to create a buzz by hosting a public car-naming contest and spending lavishly on television, print, and radio ads. Still, sales were poor. The car gained a reputation for unreliability, and its styling was a significant departure from popular trends of the day: a horse-collar radiator grille, meant to evoke nostalgia, paired with horizontal tail fins at the rear, an ungainly blend of past and present. The Edsel division lasted only from 1958 to 1960. Overall, however, the automobile industry was so profitable that such failures mattered little. Other models, including the muscle cars such as the Ford Mustang (1964), the Chevrolet Camaro (1966), and the Pontiac Firebird (1967), sold well.


8. The Cadillac Eldorado, 1958. The epitome of late-1950s style, parked at Tavern on the Green, New York City. The premium luxury car featured chrome bumpers, a jeweled grille, power seats, a dual four-barrel V8 engine, power windows, air conditioning, glove-box drink tumblers, and tail fins. It was an entirely different conception of the automobile than the simple Volkswagen Beetle, then gaining in popularity.

High-tech

A second group of manufacturers focused not on established technologies but on new, innovative high-tech sectors, including aviation, electronics, pharmaceuticals, and petrochemicals. Corporations in these industries competed with one another more through research and development efforts and the pursuit of government funding than through marketing. Two industries that proliferated during this period were aviation and aerospace. The leading companies were Boeing (Seattle), Douglas Aircraft (Santa Monica), Lockheed and North American Aviation (Los Angeles), General Dynamics and Curtiss–Wright (New York), Martin (Maryland), and American–Marietta (Chicago). In 1955, Boeing had revenues of $1 billion and Douglas Aircraft trailed closely at $900 million.

The largest company in the jet engine industry was GE, which ranked fourth on the Fortune 500 list in 1955, with revenues of $2.9 billion. During World War II, GE transformed itself from an electric systems company to a technology giant that produced turbines, airplane engines, and other complex machines. After the war, the jet engine industry seemed headed for difficulty with impending drops in military demand, but the onset of the Cold War staved off a decline. In 1947, GE produced the J-47 engine, used by the Air Force in its new B-47 bomber, a long-range, six-engine turbojet designed to fly at high speed and high altitude to avoid enemy detection. By 1953, GE manufactured 60 percent of the jet engines built in the United States—nearly all for military use. During this period, GE also began to adapt some of its engine designs for commercial aircraft as the company entered the burgeoning market for business and leisure jet travel.

The postwar period was also one of expansion for the electronics industry, as companies slowly made advances in the development of digital computers. No firm illustrates this postwar surge in innovation better than IBM. Thomas J. Watson Jr., who took over from his father as president of the company in 1952, oversaw a period of growth. In his memoirs, he described his goal of making significant investments in building computers. “That meant hiring engineers by the thousands and spending dollars by the tens of millions for new factories and labs,” he wrote; “The risk made Dad balk, even though he sensed the enormous potential of electronics as early as I did.”

Watson led the company past computer rivals such as Remington Rand (maker of the popular Univac), GE, NCR, and Burroughs. Among the company’s innovations during this period were the computer language FORTRAN, which was quickly adopted by other companies, and the airline reservation system SABRE, which was developed in collaboration with American Airlines and launched in 1964. Before the development of SABRE, American Airlines had been handling all booking and reservations manually—a tedious process that prevented the company from being able to rapidly and cost-effectively scale its operations. In the following years, IBM teamed up with other major airlines to develop similar systems.

IBM’s biggest gamble, and its greatest payoff, came from the IBM System/360, an entirely new system of mainframe computers and peripherals that allowed both scientific and commercial users to expand their components and software as their needs grew. The original, announced in 1964 and sold beginning in 1965, could perform an astonishing 34,500 instructions per second. The IBM System/360 was one of the most significant innovations of the period and contributed to the so-called third industrial revolution, which saw the advent of digital computers and the commercialization of information.

Low-tech

Along with high-tech companies, the American manufacturing sector included many low-tech producers, including those in the food, lumber, tobacco, and brewing industries. The meatpacker Swift (with $2.5 billion in revenues in 1955) was number seven on the Fortune 500 list, and its competitor, Armour (with $2 billion), was number nine. National Dairy Products (a maker of ice cream), General Foods, and Borden were all in the top forty largest firms in the country.

Many of these low-tech companies sought to avoid intense price competition in their respective industries by use of branded products. The breakfast cereal industry—which included such companies as Kellogg’s, General Mills, Quaker Oats, and Post—offers a window into the inventive marketing strategies that emerged. To stimulate demand, cereal manufacturers turned to television, a new medium that was becoming increasingly popular. (Although only 9 percent of American households had a television set in 1950, a remarkable 83 percent did by the end of the decade.)

In the early 1950s, Kellogg sponsored its first television show and later hired a Chicago-based advertising agency to create the iconic Tony the Tiger (drawn by Disney animators), which debuted with Frosted Flakes in 1952. By 1958, the four largest cereal-producing companies spent $47 million annually on television promotion. Post advertised its cereals on the Bugs Bunny Show; General Mills on Rocky and His Friends; and Kellogg’s on The Andy Griffith Show. This period saw the advent of many cereals that are still popular in the early twenty-first century—including Post’s Alpha-Bits (introduced in 1958), Kellogg’s Apple Jacks (1965), Quaker Oats’ Cap’n Crunch (1963), and General Mills’ Lucky Charms (1964), which used the slogan, “They’re magically delicious!”

Services

The size of the service sector also grew during the postwar decades, in fast-food restaurants, for instance, as well as healthcare and leisure travel. In 1947, service-industry firms accounted for 46 percent of total US employment. By 1976, that figure had risen to 61 percent.

Some service industries grew as “franchises.” Franchising had existed in the United States since the nineteenth century. Singer Sewing Machine (in 1850), GM (1898), and Coca-Cola (1899) all relied on building franchises in the early days of their businesses. For example, Coke sold gallons of syrup to individual entrepreneurs, who opened bottling plants and distribution companies to sell Coca-Cola directly to grocers, restaurants, and other outlets. Oil companies also sold franchises to service stations. These “product-oriented” franchises allowed for the recruitment of entrepreneurial individuals to build retail outlets to promote products throughout the country.


9. Sir Grapefellow, from General Mills about 1972, was one of many children’s cereals introduced in the 1970s through television advertising campaigns. Sir Grapefellow and his nemesis, Baron von Redberry, did not last long, but other cereals with cartoon champions fared better, including the monsters Count Chocula and Frankenberry.

A second type of franchising, a “business format” franchise, appeared in the 1920s with the formation of Howard Johnson’s restaurants. This type of franchise owner not only sold the franchisor’s product but also acquired marketing and management plans and quality-control systems and, in a sense, ran his or her own business. Increasingly, starting in the 1950s, franchising had tremendous appeal to Americans who wanted to start their own businesses but either lacked the start-up capital or wished to avoid some of the risks of founding a business on their own. Business format franchising, especially, became popular at restaurants (such as McDonald’s and Kentucky Fried Chicken), hotels, groceries, and auto supply shops. In 1969, there were roughly 380,000 franchises operating in the United States; this figure grew to 440,000 by 1980.

The burger industry became a site of franchise-based competition between White Castle (1921), Jack in the Box (1951), Burger Chef (1954), Burger King (1954), and McDonald’s (1955). The McDonald’s company grew through franchise ownership and profited, especially, from headquarters’ control of franchise real estate. The McDonald brothers founded their restaurant in the 1930s but eventually sold their interest in the firm in 1955 to Ray Kroc, a salesman of milkshake machines. That same year, Kroc started the McDonald’s Corporation and reorganized the company.

The new McDonald’s provided franchise owners with enough responsibility to give them the feeling of being independent entrepreneurs and allowed them to earn substantial profits, yet McDonald’s retained control over menu choices and branding and advertising decisions. Franchise owners in the 1950s received a detailed McDonald’s Manual that ran seventy-five pages and covered topics including “food specifications,” “store opening procedure,” “job turnover,” and “cleaning and maintenance”—of the walk-in refrigerator, the potato peeler, the syrup pumps, and much else. It also listed retail prices:

Hamburgers 15 cents
Cheeseburgers 19 cents
Triple thick milkshakes 20 cents
French fries 10 cents
Root Beer, Coca-Cola, and Orange 10 cents or 15 cents
Half pint milk 10 cents
Coffee 10 cents
Hot chocolate (in winter) 12 cents
Old fashion pound cake 15 cents

By 1963, there were more than 300 McDonald’s restaurants across thirty-seven states.

International business

After 1945, American investment and funds came to play an essential role in the rebuilding of Europe as part of the Marshall Plan, named for Secretary of State George Marshall. Under the Marshall Plan, the United States gave more than $13 billion (about $110 billion in 2020 dollars) from 1948 to 1952 to rebuild western Europe and restore political stability. The plan aimed to modernize European industry, improve prosperity, increase access to coal and other needed resources, and combat the spread of Communism. Roughly one-third ($4.4 billion) of the money went to Great Britain and just under one-sixth ($1.9 billion) to France. West Germany received roughly one-tenth because the restoration of Germany was thought essential to support prosperity in other European countries that depended on German resources and manufactured goods, thereby preventing the spread of Soviet influence.

Along with the Marshall Plan, US companies increased their global footprint. Major US oil companies, including Jersey Standard, rebuilt damaged refineries and expanded their presence across western Europe, seeking to take advantage of the need for petroleum products during the recovery period.

The 1960s also saw the increase of US foreign investments in manufacturing and wholesale and retail trading. Prominent industries investing abroad included transportation equipment, chemicals, machinery, food products, electrical machinery, and primary and fabricated metals. In wholesale and retail, Sears, Roebuck—the largest American merchandiser—began expanding its business into Latin America and Europe. The supermarket chain Safeway (founded in 1915) started to invest in Europe and Australia. By 1966, it had 28 stores in Britain, 5 in Germany, 11 in Australia, and 241 in Canada.

Accompanying US business abroad was the expansion of advertising, accounting, market research, and management consultant firms. The consulting firm McKinsey & Company grew in the 1940s and 1950s and promoted the M-Form and other US management strategies in Europe. By the end of the 1960s, McKinsey’s European clients included Cadbury, Cunard, and Rolls-Royce in the United Kingdom; Nestlé, Geigy, and Sandoz in Switzerland; KLM and Royal Dutch Shell in the Netherlands; and Volkswagen, Deutsche Bank, and BASF in Germany.

As American companies increased their business overseas, international companies began to enter American markets. The American auto industry, for example, was challenged by Germany’s Volkswagen (which formed Volkswagen of America in 1955) and Japan’s Toyota—both of which made successful entries into the US automobile market in the 1950s and 1960s. Both offered smaller cars, an unfilled niche in the American car market. The Volkswagen Beetle, with its replaceable parts and simple design, proved especially popular in the early 1960s against the “chrome cathedrals” produced in Detroit, but it, too, faced competition. In 1958, Toyota began marketing its Crown automobile in the United States, but it was underpowered and flopped. In 1968, the company’s more powerful Corolla did much better, and by 1975, the Japanese carmaker had replaced Volkswagen as the best-selling import brand in the country.

In response, American carmakers tried to move into the smaller-car market with offerings like the Chevrolet Corvair and Ford Falcon, both introduced in the 1960s. The Falcon was produced from 1960 to 1970 and exported to Argentina, Australia, Canada, Chile, and Mexico. The Corvair was initially very popular, but drivers were dissatisfied with its handling. In 1965, consumer activist Ralph Nader attacked the Corvair (and many other American car models) in his book Unsafe at Any Speed. Toyota outdid American competition through its famed “production system,” which took decentralization beyond what Sloan had imagined. Toyota’s approach revolved around a persistent effort to streamline systems to eliminate waste and allow flexible production for just-in-time delivery. By 1980, Japan surpassed the United States as the world’s leading producer of automobiles.

In the final decades of the twentieth century, Japanese imports also gradually came to replace American products in other industries—for example, in televisions and other consumer electronics products. In 1955, US television manufacturers controlled 96 percent of the American market. A decade later, they held just 30 percent, with inexpensive Japanese imports gaining a strong foothold. By 1980, there were only three American television manufacturers left in the United States—RCA, Zenith, and GTE.

In response to rising global competition, many American firms pursued a new strategy that they hoped would allow them to regain a competitive advantage. They became conglomerates, bringing varied businesses in different industries together into a single group. The 1960s was dubbed “the age of the conglomerate,” with more than 6,000 mergers and acquisitions in 1969 alone. Despite the rush to merge, however, the strategy was seldom successful.

One (albeit unsuccessful) advocate of the conglomerate strategy was RCA. The company had once been an enormous success; it had dominated the American market for radios and consumer electronics in the 1920s and 1930s and then moved into television starting in the 1940s. In the 1960s, however, it made a fateful move into computers, an area outside its core business, seeking to compete with IBM. It then made several unrelated acquisitions including Hertz Rent-a-Car, Random House publishing, and businesses in frozen food, carpets, and paper manufacturing, transforming itself from an electronics company into a highly diversified conglomerate. This rapid and unrelated diversification ultimately brought an end to RCA, which was acquired and broken up in 1986. In RCA’s core market of consumer electronics, four Japanese companies rose to prominence: Sony, Matsushita, Sanyo, and Sharp. Sony, especially, came to dominate digital technology. With Philips (a Dutch multinational), Sony introduced the compact disc in 1982 and the CD-ROM in 1985.

But it was not just overseas competition from Japan that threatened American business. During this period of uncertainty and turbulence in US markets, the country also experienced a destabilizing oil crisis. The United States had long been an oil-exporting nation. In 1920, for example, it produced 65 percent of the global oil supply, and even in the aftermath of World War II, it maintained a positive net export balance of 61,000 barrels per day.

However, beginning in the 1950s, in part because of the rapid growth of the automobile industry, domestic demand for oil began to outpace domestic production, and the United States became increasingly reliant on oil imports. Indeed, in 1950, the United States imported 850,000 barrels of oil per day. By 1960, that figure had risen to 1.8 million. This put the United States in a precarious situation as the balance of power among global oil producers began to shift. In 1960, major Middle Eastern oil-producing nations (Iran, Iraq, Kuwait, and Saudi Arabia) joined with Venezuela to form the Organization of the Petroleum Exporting Countries (OPEC), seeking to coordinate production and export policies among member nations to ensure steady returns to oil producers.

By 1970, US oil imports had risen to 3.4 million barrels per day, and global demand was also on the rise, straining the capacity of OPEC nations, which were already producing at 80 percent capacity. The first “oil shock” came in 1973. OPEC implemented an embargo against the United States and other nations that had supported Israel during the Yom Kippur War, resulting in a massive price spike as supply dropped. In 1979, a further drop in oil supply resulting from the Iranian Revolution, combined with steadily rising global demand, resulted in a second oil crisis. By 1980, the price of oil was twelve times what it had been in 1970.

During the 1970s, the United States experienced five quarters of decreasing gross domestic product, and the unemployment rate hit 9 percent in 1975. American companies needed very different strategies to fend off global competitors at home and to try to regain lost shares of global markets. Some economists, including Robert Heilbroner, became pessimistic about US business. In Business Civilization in Decline (1976), he argued that global capitalism had become too volatile to endure: “Much as we now inspect Chichen Itza, the Great Wall, the pyramids, Machu Picchu, so we may some day visit and marvel at the ruins of the great steel works at Sparrows Point, the atomic complex at Hanford, the computer centers at Houston.” It was a far different business environment than just a few decades before.