CHAPTER 13
The Politics of Stagnation
By the time Richard Nixon resigned office, the post–World War II democratization of American life had gone far. For the first time in its history, the United States had embraced universal adult suffrage, with only a few exceptions, like felons and noncitizens. And as a result of Baker v. Carr and subsequent court decisions, votes counted more or less equally, except in the composition of the U.S. Senate and the Electoral College. Courtrooms became more democratic when in 1975 the Supreme Court declared it unconstitutional to deny women equal access to jury service.
Democratization entailed not only individual rights but also institutional reforms that diffused power away from central authorities and insider cliques. Congress helped set the tone by moving to limit the postwar centralization of federal power in the executive branch. The 1973 War Powers Act at least symbolically reasserted the power of Congress in making war. The 1974 Hughes-Ryan Amendment required the CIA to report covert activities to congressional oversight committees. The Congressional Budget and Impoundment Control Act restricted the ability of the president to refuse to spend congressional appropriations (a favorite Nixon tactic) and established the Congressional Budget Office to provide independent expertise in economic forecasting and budget analysis. The 1976 National Emergencies Act created new procedures and congressional controls for declarations of emergency (which gave the president extraordinary powers).
Within Congress, rank-and-file members moved to restrict the power of their leaders and create greater opportunities to participate in decision making. The seventy-five new Democratic congressmen elected in 1974 in the wake of Watergate—many of them political neophytes lacking the usual deference to party leaders—forced the ouster of several longtime committee chairs and the reform of House rules. By the late 1970s, a handful of powerful congressional leaders no longer could make major decisions out of public view.
Across the society, government bodies and public institutions were allowing greater access to information once kept confidential and creating more opportunities for public input in decision making. Building a structure, changing a government regulation, or administering a program often required extensive public consultations and hearings, unlike in the past when leaders could act on their own. With the mass mobilizations of the 1960s still part of social memory, people expected to be heard and did not hesitate to take to hearing rooms or to the streets to press their positions. And when that did not suffice, they often used the courts to block or force government action.
Yet belying the formal democratization of society were many signs of public disenchantment with politics and public processes. Antiauthoritarianism, inherited from the 1960s, came to pervade society, with an air of sour disgust rather than liberatory glee. A growing part of the population disengaged from politics altogether. Many private interests, deciding there was too much democracy for their taste, began finding ways to exert power outside the formal procedures of government. As antistatist sentiment grew, democracy expanded and became hollowed out at the same time.
The Crisis of Authority
The most obvious measure of public disenchantment with politics came in the declining participation in national elections. In every presidential election during the 1950s and 1960s, at least 60 percent of the electorate voted. In 1972, just 55 percent did; in 1976, 54 percent; in 1980 and 1984, 53 percent; and in 1988, 50 percent. Even fewer people voted in off-year congressional elections. Participation hit a low point in 1978, when less than 38 percent of the eligible voters cast a ballot, the lowest level in over three decades. Some of the drop came from the lowering of the voting age, since young adults were less likely to vote than their elders. But far more of it reflected a blanket rejection of politicians and a growing belief that elections had little to do with daily reality.
Incumbents suffered from the public disgust. In 1976, Gerald Ford barely managed to win the Republican nomination in the face of a challenge from former California governor Ronald Reagan, only to be defeated in the general election by an obscure governor from Georgia, Jimmy Carter. Carter, in turn, lost to Reagan in 1980, the first back-to-back defeats for incumbent presidents since 1892. In 1980, independent presidential candidate John Anderson won more than 6 percent of the vote—more than Strom Thurmond and Henry Wallace together got in 1948—to a large extent simply because he did not represent one of the major parties.
Some of the public disgust with politics and politicians stemmed from political scandals and revelations of past government misdeeds. Late one night in October 1974, the Washington, D.C., police stopped a car carrying one of the most powerful members of the House of Representatives, Wilbur Mills, in the company of a stripper named Fanne Foxe, who bolted into the nearby Tidal Basin. When months later Mills appeared on the stage of a Boston burlesque house at Foxe’s beckoning, his days as a power broker came to an end. Two years later Wayne Hays, head of the House Administration Committee, fell from power when a member of his staff told the Washington Post that she had been put on the payroll to serve as his mistress: “I can’t type, I can’t file, I can’t even answer the phone.” A murkier, more serious scandal began unfolding in 1976 with allegations that political operatives acting on behalf of the South Korean government had distributed cash to a large number of congressmen.
From 1970 through the late 1980s, the number of local, state, and federal officials indicted and convicted for criminal activity rose sharply. In all likelihood this reflected not more crooks than in the past but closer scrutiny of politicians by the press, prosecutors, and rivals. With the Democrats and Republicans locked in close national elections from 1976 on, charges of personal wrongdoing became a widely used partisan tool, as the conspiracy of mutually self-serving silence that once reigned among Washington politicians collapsed. The press abetted the process, as Watergate placed a new premium on uncovering official wrongdoing. A 1978 law furthered politics-by-exposé by establishing a procedure for appointing special prosecutors in cases of alleged wrongdoing by federal officials.
For a brief time, it looked like Gerald Ford might succeed in restoring integrity and confidence in government. He came into the presidency with a surge of public goodwill, a sense of relief that the strange, dark days of Richard Nixon were over. Ford’s unpretentious manner and the suburban normality of his family seemed reassuring, whatever people thought of his political views, which on many issues were more conservative than Nixon’s. Ford’s wife, Betty, signaled how much the nation’s social views had changed when she compared trying marijuana with having a first beer or cigarette and said that she would accept her teenage daughter having an affair. But Ford destroyed any chance he had of improving public perception of the political class—and any likelihood of getting elected to the presidency in his own right—when just a month into office he gave Nixon a full pardon for any offenses he might have committed as president. Ford’s approval rating plunged, as many people believed that he had made a sordid deal with the former president, or at least perpetuated a different set of rules for political insiders than for everyone else.
Disclosures during and after the Ford administration of past governmental wrongdoing furthered public disenchantment with politicians and the state. The foreign and domestic calamities of the late 1960s and early 1970s, especially Vietnam and Watergate, left in their wake resentment and bitterness among government insiders that opened the floodgates for leaks, revelations, and recrimination. Tales of CIA assassination attempts, coups, illegal spying, and use of psychedelic drugs captured headlines and sparked government investigations, which led to further revelations. Former FBI associate director Mark Felt—the Washington Post’s “Deep Throat” source during Watergate—was convicted of illegal break-ins during a search for fugitive radicals. The tawdry picture of covert, illegal, often bumbling, and sometimes murderous activities by the government undercut its moral authority. In 1964, a public opinion poll found that 76 percent of the public trusted the government; by 1974, the figure had fallen to just 36 percent. Ironically, liberals, who supported an expansive notion of government, ended up contributing to its delegitimization by aggressively investigating Watergate and other government misdeeds.
The distrust of government leaders and politicians was part of a broader crisis of authority during the 1970s. All kinds of traditional figures of authority were seen as dishonest and self-interested. A 1976 poll reported that only 42 percent of the public trusted the medical profession, down from 73 percent in 1966, and only 12 percent trusted the legal profession. The antiauthoritarianism once associated with the New Left had become the common property of broad sectors of society.
Ford
Even more than the loss of confidence in the probity of national leaders and government institutions, their failure to solve major problems confronting the country fed antistatism and disaffection with electoral politics. Presidents Ford and Carter, though quite different in their ideologies and political styles, both proved remarkably unsuccessful in dealing with serious troubles at home and abroad. To some extent their failures reflected ineptitude. But the very nature of the circumstances they faced made it almost impossible to succeed.
Economics dominated politics for most of the 1970s. As soon as Nixon left office, national attention shifted from Watergate to the rapidly deteriorating economy, as unemployment and inflation shot up, the GDP shrank, take-home pay fell, and the federal deficit ballooned. The coincidence of a downward business cycle with structural changes in international capitalism threw businessmen and policymakers into confusion. In the past, unemployment and inflation had tended to be reciprocal, with one falling when the other rose. Now they began moving up together, as slow growth (or no growth) combined with inflation in what came to be called “stagflation.”
The Keynesian solutions that had become normative government policy in the United States and Western Europe no longer fit the circumstances. If the government tried to lower unemployment by stimulating the economy, it faced the danger of exacerbating already high inflation. If it tried to lower inflation through monetary policy, it faced the possibility of slowing or reversing economic growth and throwing more people out of work.
Ford made inflation his priority. He put much of the blame for surging prices on excessive government spending and the federal deficit. A political and economic conservative, he had never liked the expansion of state function that came with the Great Society; as a congressman he had opposed Medicare, federal aid to education, and housing subsidies. As president, he tried to trim federal expenditures. He wanted the public to cut spending too. Ford ended an October 1974 address to Congress that laid out his economic program by urging his listeners to spend 5 percent less on food and drive 5 percent less. He then pinned on his lapel a button with the letters “WIN” on it, an acronym for “Whip Inflation Now.”
The WIN program superficially mimicked the New Deal. During Roosevelt’s first stab at an economic recovery program, the National Recovery Administration (NRA) had used a Blue Eagle symbol to rally public support. Its reincarnation—the WIN name and button design—came at the White House’s request from the Benton & Bowles advertising agency, whose cofounder, Chester Bowles, had mobilized millions of volunteers to help fight inflation during World War II as head of the Office of Price Administration (OPA). But both the NRA and OPA used the public to help enforce mandatory government regulations that greatly increased the role of the state in managing the economy. Ford wanted to go the other way, to decrease federal regulation of the economy and use public mobilization as a substitute for state intervention, moving away from, not back toward, the New Deal order. Ford’s faith that family belt-tightening could “whip” an inflationary spiral rooted in commodity shortages and deep economic structures betrayed little understanding of political economy. WIN buttons immediately became the butt of jokes.
Congress rejected almost all of Ford’s proposals. The heavily Democratic, liberal-leaning majorities in the House and Senate, with strong ties to organized labor, gave a higher priority to attacking unemployment. The most ambitious Democratic plan was introduced in 1974 in the House by Augustus F. Hawkins and in the Senate by Hubert Humphrey (who had returned to the Senate after leaving the vice presidency). Their bill was a throwback to the New Deal and the original version of the Full Employment Act of 1946. Picking up Roosevelt’s 1944 proclamation of employment as a right, it called for the federal government to set a specific goal for full employment and use fiscal and monetary policy to try to achieve it. If those tools proved insufficient, it mandated the provision of a government-funded job for every adult seeking work.
Ford opposed the Humphrey-Hawkins bill, as did many Democratic economists, fearful that it would drive up inflation. In 1974, Ford did agree to fund 100,000 public-sector jobs through grants to local and state governments under the Comprehensive Employment and Training Act (CETA), passed the previous year to bring together various federal manpower programs. The next year he signed a tax reduction and rebate to stimulate the economy. But in early 1976, Ford vetoed a large public works program and other spending bills. Meanwhile, Humphrey-Hawkins failed to win passage, only becoming law in 1978, by which time—like the Full Employment bill on which it had been modeled—it had been gutted of its most meaningful provisions, including an enforceable public right to a job. By the time of the 1976 election, inflation had fallen to 5.7 percent from 11 percent in 1974, but unemployment remained high, approaching 8 percent.
Ford had no more success with his energy program than with his economic program. The solution to rising energy costs and oil shortages, Ford believed, lay in dismantling the complex web of federal regulations governing the oil and gas industries, including the system of price controls and federal allocations that Nixon had imposed on oil and petroleum products in 1971, which kept domestically produced oil below the world market price. The higher prices that would come with deregulation, Ford argued, would stimulate increased domestic oil production. But Congress found the prospect of sharply higher oil and gasoline prices politically unpalatable. A compromise energy law kept price controls in place until at least 1979, created a strategic petroleum reserve, and instituted a series of conservation measures including energy efficiency standards for appliances, legalizing right turns on red lights, and mandatory fuel economy standards for new cars and small trucks.
The End of Détente
The urgency of domestic matters during Ford’s presidency eclipsed foreign policy in public discourse. But in Congress, elite policy circles, and within his administration a sharp debate unfolded over the United States’ global standing and its relationship with the Soviet Union, as the 1975 communist victory in Vietnam and images of the chaotic evacuation of Saigon contributed to a widespread sense of national decline. In that debate lay the roots of an eventual return to expansive efforts to assert American power all over the world.
On taking office, Ford seemed inclined to continue the Nixon-Kissinger policy of détente. But by then, détente had begun to stall. Negotiating economic and arms control arrangements with the Soviet Union that would provide sufficient benefits to each party proved difficult, given the different situations of the two superpowers and rising opposition to détente within the United States.
In the mid-1970s, some leading conservatives began criticizing what they saw as the implicit assumption behind détente that declining U.S. military and economic power necessitated accommodations with onetime enemies ideologically and morally at odds with core American beliefs. In doing so, they rejected the conclusion of centrist leaders like Kissinger and later Jimmy Carter that, in light of the American defeat in Vietnam, Watergate, and changing global economic relations, the United States could not and should not try to play the same dominant world role it once did. Critics of détente expressed skepticism about accords with the Soviet Union, seeking instead to rebuild American military power and assert core moral values in U.S. foreign policy, especially the defense of liberty. Inside the Ford administration, Kissinger’s near total control over foreign policy steadily eroded as Secretary of Defense James Schlesinger, Chief of Staff Donald Rumsfeld (who in late 1975 replaced Schlesinger at the Pentagon), and Rumsfeld’s deputy, Richard Cheney (who succeeded him as chief of staff), challenged him ideologically and bureaucratically. Conservatives outside the administration also pounded Kissinger and détente, most importantly Ronald Reagan, who was preparing to challenge Ford for the Republican nomination.
Republican critics of détente found allies among some liberals who had long been associated with the Democratic Party. In the early 1970s, a cluster of liberal intellectuals, political activists, and labor leaders, many with youthful roots in the socialist movement, began moving to the right, or, as they saw it, standing firm as the Democratic Party moved to the left. Domestic developments pushed some of them in a conservative direction, specifically their rejection of the New Left. Equally important, they opposed any softening toward the Soviet Union. Some of the most influential “neoconservatives” were Jews for whom the 1973 Middle East war and Palestinian terror tactics loomed large. Strongly committed to the defense of Israel, they reacted to criticism of the Jewish state by the Soviet Union and the political left at home by seeking allies on the right. Irving Kristol, Norman Podhoretz, Nathan Glazer, Midge Decter, and other neoconservative writers provided intellectual credibility and polemical ammunition for the attack on détente and skepticism of multilateralism. Daniel Patrick Moynihan, appointed ambassador to the United Nations by Ford in 1975, shook up the usually decorous world of diplomacy with attacks on what he believed to be the anti-Western and antidemocratic tilt of the UN and his unsuccessful fight against a resolution that declared that “Zionism is a form of racism and racial discrimination.”
Senator Henry Jackson, a Cold War liberal Democrat from Washington with strong ties to the AFL-CIO leadership, found an effective device to slow down détente in the issue of Jewish emigration from the Soviet Union. As part of the trade agreements Nixon negotiated, the United States promised to grant the Soviet Union most favored nation status, which would have lowered tariffs on imported Soviet goods. Jackson pushed forward a measure to deny such status unless the Soviet Union increased the number of Jews it allowed to emigrate. In response, Kissinger negotiated a compromise with the Soviets, but Jackson, who had presidential ambitions, scuttled it, leading the Soviet Union to tighten emigration restrictions and become more distrustful of American promises. U.S.-Soviet trade declined, and the movement toward a new arms control agreement stalled.
A wave of revolutions in the undeveloped world and the final push against colonial and white minority rule in Africa added to the growing tension between the United States and the Soviet Union. The revolutionary movements of the 1970s generally had only slight resemblance to orthodox communist parties, but by challenging the status quo, using left rhetoric, and in some cases taking aid from the Soviet Union they sparked enmity from American policymakers. Local wars became proxy conflicts between the United States and the Soviet Union. The largest clash took place in Angola, where rival groups fought to establish control after a coup led Portugal to give up its colonial empire. The United States provided covert aid to the factions it favored (which China and South Africa also assisted), while the Soviet Union and Cuba backed a rival group. The bloody civil war left one of the potentially richest nations on the continent in ruins.
By the 1976 election, détente had all but ended. A framework for a new strategic arms limitation agreement, which Ford negotiated on a trip to the Soviet Union, never led to a treaty, largely because of congressional opposition. Ford himself banished the term “détente” from the administration’s lexicon, recognizing that it had become toxic within the Republican Party. Even so, he faced a fusillade of criticism of his foreign policy during his primary battle with Reagan, who accused him of tacitly accepting a Soviet sphere of influence in Eastern Europe (which by then had been in place for three decades) and planning to cede control of the Panama Canal to Panama (in talks over the future of the canal that had been going on since 1964). Ford just barely beat back Reagan’s effort to oust him but ended up accepting a conservative platform plank that criticized his conduct of international relations in everything but name.
Carter
Foreign policy did not help Ford in the 1976 general election, especially his blunder during one of the presidential debates—the first since 1960—when he said, “There is no Soviet domination of Eastern Europe.” His opponent, Jimmy Carter, claimed common cause with the Republican foreign policy plank, asserting, “Our country is not strong anymore; we’re not respected anymore.” But the election largely revolved around the poor state of the economy and public disgust with the political class.
Carter had the advantage of being almost completely unknown outside of his home state before the election season began. A onetime Navy officer who ran a family peanut business, he had limited political experience and no Washington record to tar him in the eyes of a public fed up with its national leaders. Ford, by contrast, could not shake the residual political effect of Watergate and his pardon of Nixon, nor could he escape blame for the ongoing economic troubles and his inability to work with Congress.
The South proved the key to Carter’s victory. Eleven years after the 1965 Voting Rights Act, southern black voters had become a major electoral force, heavily aligned with the Democratic Party. Nationally, Carter captured more than 80 percent of the African American vote, with the percentage even higher in the South. Though southern white voters had begun drifting toward the Republican Party, Carter, as a southerner and professed born-again Christian (by far the most devout president of the twentieth century), also won a large part of the southern white vote, enabling him to carry the entire region except for Virginia. That, along with his capture of most of the large states in the Northeast and Midwest, put him in the White House.
Carter carried Ford’s effort to deflate the pomposity of the presidency to an extreme. Calling himself Jimmy rather than James, at his inauguration he wore a business suit and walked from the Capitol to the White House, rather than wearing a morning coat and riding in a limousine, the usual practice. Once president, he sold the presidential yacht, let himself be photographed wearing blue jeans and carrying his own luggage, and gave a televised speech in a cardigan, deliberate breaks with past notions of presidential demeanor.
Carter went further than Ford in trying to overcome the divisiveness of the Nixon years. He appointed more African Americans and women to significant federal posts than any president until that time and greatly expanded Ford’s program to pardon Vietnam-era draft resisters and draft evaders. Carter invited to his inaugural gala musicians, actors, dancers, comedians, and athletes who represented the full range of national culture, from Aretha Franklin and Paul Simon to James Dickey and John Wayne, including Muhammad Ali, who only ten years earlier had been stripped of his boxing championship and sentenced to jail for refusing induction into the Army because of his objections to the Vietnam War.
Carter’s political symbolism and rhetoric suggested that the days of American imperial ascendancy were over, and that it was probably for the best. In his inaugural address, he adopted a strikingly modest and somber tone, warning that “we cannot dwell upon remembered glory.” “We have learned,” he said, “that ‘more’ is not necessarily ‘better,’ that even our great nation has its recognized limits, and that we can neither answer all questions nor solve all problems.”
Carter’s presidency seemed to provide proof of the last contention, as so many of the nation’s problems appeared intractable. Carter had no better success than Ford in dealing with the stagflation that brought hardship to much of the country. Coming into office, Carter proposed a stimulus package along traditional Keynesian lines. As finally passed, it provided tax credits for job creation and expanded federal funding for public works, job training, and temporary public-sector employment. Carter invoked the New Deal’s Civilian Conservation Corps in calling for the expansion of CETA, which at its March 1978 peak provided jobs for three-quarters of a million unemployed or underemployed workers, by far the largest countercyclical government employment effort since the Great Depression.
But CETA proved to be the last gasp of the Democratic commitment to full employment through direct government job creation, an idea associated with the liberal-labor wing of the party to which Carter had only weak ties. More conservative than most Democrats in Congress, Carter soon stopped treating unemployment, which slowly fell over the course of 1977, as his top priority. Like Ford, he came to see inflation as a more serious threat to the economy. When Congress reauthorized CETA in 1978, job training rather than direct employment became its main focus. Meanwhile, Carter tried to check inflation through voluntary wage and price guidelines and cuts in federal spending.
Carter also moved away from the New Deal in his embrace of economic deregulation. Since the late nineteenth century, the federal government had imposed an expanding web of regulatory structures over business. In the years before Carter took office, a growing number of economists and political leaders—including Gerald Ford—had pushed for deregulating industry, arguing that overregulation contributed to inflation and economic stagnation. Carter agreed. Under the leadership of economist Alfred Kahn, his administration set out to slash regulations that set price floors and limited new entrants in a host of industries.
Deregulation created unusual political configurations. Many conservatives, committed to the idea of unrestrained capitalism, backed the idea, but so did many liberals, including Senator Ted Kennedy and reformer Ralph Nader, who believed that increased competition would benefit consumers. Some deregulators, including Kahn, quite consciously sought to weaken unions and lower labor costs in monopoly industries in the name of helping out less well-off workers and their families, who would benefit from jobs at new competitors and from lower consumer prices. “I’d love the Teamsters to be worse off,” said Kahn. “I’d love the automobile workers to be worse off.” Business divided. Many goods producers supported the deregulation of transportation, hoping to lower their shipping costs. Businesses and communities in areas with poor airline service hoped that more competition and new carriers would lead to better transportation links. But companies and unions protected by regulatory structures fought their dismantling, fearing a downward spiral of prices, profits, and wages. It was largely a rearguard action, as the Carter administration pushed through varying levels of deregulation in the airline, trucking, railroad, cable television, and savings bank industries.
Carter’s trade policy continued the pattern, begun early in the Cold War, of giving priority to national security concerns over domestic economic considerations. To help out allies also suffering from stagnation, as well as to fight inflation, the Carter administration took only modest steps to stop imported manufactured goods from eroding the position of domestic producers, even when, as in the case of the steel industry, foreign-made products were dumped onto U.S. markets at prices below production costs. More efficient, or at least lower-cost, foreign manufacturers started grabbing larger shares of the U.S. market, with steel imports rising 30 percent in the first eight months of 1978.
At least in the short run, deregulation and trade policy had little impact on inflation, since food, energy, and housing costs (including mortgage interest rates) accounted for most of the rise in the consumer price index during the Carter years. Carter’s modest anti-inflationary measures came nowhere near to balancing out the massive jump in crude oil prices that came in the wake of the 1979 Iranian Revolution, which helped push the inflation rate to 11.2 percent in 1979, the highest since 1947.
That year, the lead in fighting inflation moved from the White House to the Federal Reserve System. Paul Volcker, whom Carter appointed chairman of the Fed, made stopping inflation his main goal, even if it took inducing a recession to achieve it. Rather than slowing the economy the usual way, by jacking up interest rates, Volcker used a different approach, having the Federal Reserve System tightly control the growth in the money supply, largely through increases in the requirements for bank reserves. In doing so, the Federal Reserve adopted the prescription of Milton Friedman and other monetarist economists who criticized Keynesianism and argued that the government should abandon trying to regulate the economy with fiscal tools and instead restrict its actions to adjustments in the supply of money.
The Fed, exempt as it was from democratic control, could do what no president or Congress could: deliberately set out to crash the economy and in the process create a sea of human misery in the service of price stability. But even it needed some fig leaf of protection from public wrath, which monetarism provided. Controlling the money supply seemed a merely technical measure, whereas directly pushing up interest rates was something the public more easily understood as a discretionary act.
The central bank restrictions on the money supply quickly impacted the economy. Interest rates shot up to extreme levels, approaching 20 percent. That in turn led to a sharp drop in consumer spending and, in the second quarter of 1980, the steepest fall of the GNP ever recorded. Yet inflation continued to rise. Meanwhile, increased imports and Volcker’s monetary policy led to a massive wave of layoffs and plant closings in the automobile and steel industries, the beginning of a decade of intense deindustrialization that would transform and hollow out the American economy. The economy had been redirected not through public discourse or legislative action but in the secretive boardroom of the Federal Reserve. Carter, who initially acquiesced in the Fed policy, began speaking out against it during the fall 1980 election, but by then it was too late to halt its downward push on the economy and on his own political fate.
Like Ford, Carter did no better with energy than with the economy. Between the end of the 1973 Arab oil boycott and the Iranian Revolution four years later, oil prices stabilized and the energy issue receded from public attention. Nonetheless, Carter made it one of his top priorities, recognizing the danger of dependence on foreign oil and the limits on fossil fuel supplies. The Carter administration crafted a complex, comprehensive energy program, designed to reduce consumption, encourage conservation, lessen dependence on foreign oil, and promote the development of renewable sources of energy, to be accomplished through the partial decontrol of oil and gas prices, subsidies, tax measures, and regulatory action.
The prolonged fight in Congress that followed reflected conflicting ideas for a national energy policy, Carter’s ineptitude in dealing with Congress, and the difficulty of governance in an era of fragmented power. Energy issues pitted political factions, economic interests, and regions against one another. Decontrolling oil and gas prices would bring huge profits to energy companies and the regions where they were located, while hurting energy consumers and regions that imported oil and gas, especially the Northeast, where oil heating made homeowners particularly vulnerable to price hikes. Carter proposed various tax schemes to recycle revenue from higher oil prices back to consumers and to encourage conservation, but these raised opposition from those who would bear their costs.
Powerful oil and gas interests played a large role in the congressional handling of energy, but a mobilized public, now conditioned to direct action to promote its views, played a role too, especially in opposing nuclear power. In May 1977, 1,414 protestors were arrested at the site of a proposed nuclear plant in Seabrook, New Hampshire. Taking a page from the civil rights movement, many refused bail, forcing authorities to hold them in armories around the state. Meanwhile, the proliferation of committees and subcommittees in Congress provided many entry points for lobbyists and interests. It took a year and a half for Carter to get an energy law, stripped of many of his proposals, including tax provisions designed to reduce consumption.
Within months, energy emerged as a major issue again, when the revolution in Iran led to a near complete halt of its oil exports and OPEC took advantage of the tight market to drive up the price of crude oil. With gasoline lines and shortages spreading across the country and tempers flaring, Carter used existing authority to phase out controls on oil prices, while proposing a windfall profits tax, which Congress eventually passed. But consumers bore the brunt of the burden of the transition toward higher energy costs, as they watched the pump price of gasoline more than double in three years. By the time Carter left office, energy spending accounted for more than 13 percent of GDP, up from 8 percent at the beginning of the decade.
The gasoline lines and shortages in the summer of 1979 occasioned one of the most notable presidential addresses of the post–World War II epoch. Carter had planned to give a national address on energy policy but at the last minute canceled it, instead traveling to the presidential retreat at Camp David. There he spent six days meeting with a stream of politicians, academics, religious and labor leaders, and businessmen about the economy, energy, the state of the nation, and his own administration. With the nation wondering just what the president could possibly be up to, Carter then delivered his postponed address to a huge radio and television audience.
“The true problems of our Nation,” Carter told his listeners, “are much deeper . . . than gasoline lines or energy shortages, deeper even than inflation or recession.” Rather, the nation suffered “a crisis of confidence . . . growing doubt about the meaning of our own lives and the loss of a unity of purpose for our nation.” “In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God,” Carter continued, “too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we’ve . . . learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.”
After deftly analyzing what he called “the crisis of the American spirit,” Carter proposed no strategy to overcome it, no broad program to meet the problems he identified. Instead he itemized his new energy proposals. That reinforced what so many people found inadequate in Carter: a technocratic obsession with detail, unaccompanied by a political sense of how to institute large-scale change. Carter followed up his speech by firing five cabinet officers, a gesture that seemed irrelevant if, as he argued, America’s problems were deeply rooted in its culture of self-interest. Carter proclaimed in his speech that “all the legislation in the world can’t fix what’s wrong with America,” yet he offered nothing that could—and proved pretty poor at getting legislation passed to boot.
From Human Rights to Renewed Cold War
In foreign affairs, Carter pursued many of the same themes as he did domestically. At times he seemed to believe that American power inevitably was declining in an age of proliferating nuclear weapons, third world economic development, and social revolution. Rather than defending the status quo, in some situations Carter supported a controlled liquidation of American dominance, seeking to ally with or at least avoid open conflict with ascending forces for social change.
Early in his administration, Carter pushed hard and successfully for Senate ratification of treaties ceding control over the Panama Canal and the Canal Zone to Panama, culminating a long process of removing a sore point in U.S. relations with Latin America. When the left-wing Sandinista National Liberation Front led a revolution against Nicaraguan dictator Anastasio Somoza Debayle, a longtime American ally, Carter maneuvered unsuccessfully for the installation of a moderate-led government but, in a break from past practice, refrained from open or covert military action to check radical change.
Carter’s policies were animated in part by his desire to make a “commitment to human rights . . . a fundamental tenet” in how the United States acted. In moralizing foreign policy, Carter joined the neoconservatives in sharp criticism of the lack of political freedom in the Soviet Union and Eastern Europe. Unlike the conservative moralists, he also attacked the abuses of human rights by anticommunist authoritarian regimes that in the past could count on U.S. backing. Carter reduced or ended aid to countries like Chile, Argentina, and Uruguay that were engaged in murderous campaigns against left-wing dissidents. But he only went so far, continuing to support regimes engaged in human rights abuses when he perceived their opponents as fundamentally threatening U.S. interests and trying, unsuccessfully, to stop the Mariel boatlift from Cuba, in which Castro suddenly allowed the disaffected to leave the island.
Carter’s human rights agenda complicated his effort to achieve further arms control agreements with the Soviet Union. His embrace of Soviet dissidents and criticism of human rights violations in the Soviet bloc, along with his desire for deep cuts in strategic weapons, made negotiating a new Strategic Arms Limitation Treaty, SALT II, a long, difficult process. When the Soviet Union sent troops into Afghanistan in 1979 to try to keep in power a pro-Soviet regime facing a tribal and religious rebellion, Carter swiftly adopted what amounted to a renewed hard-line Cold War stance. He imposed an embargo on grain sales to the Soviet Union; forced American athletes to boycott the 1980 Moscow Olympics; withdrew the SALT II treaty from consideration by the Senate; increased covert action against left-wing governments in Asia and Africa; and accelerated the increase in military spending that he had already begun.
The sharpness of Carter’s actions reflected a sense of threat that came from events in Iran as much as Afghanistan. In the years after 1953, when the United States had helped restore the shah to power, Iran had become an increasingly important ally. But the close U.S. ties to the shah proved detrimental as opposition to the modernizing but undemocratic and sometimes brutal Iranian regime gathered strength. In early 1979, the shah left Iran in the face of growing strikes and demonstrations, paving the way for exiled Muslim cleric Ayatollah Ruhollah Musavi Khomeini, who helped inspire the uprising, to return.
The Carter administration, largely operating within the intellectual framework of the Cold War and with a flawed view of Iranian society derived from the shah and his American backers, found itself at a loss as to how to deal with a cataclysmic social shift that did not resemble other post-Enlightenment revolutions. The United States made no effort to again return the shah to power, but Carter gave in to pressure from Henry Kissinger and other powerful friends of the former Iranian leader to allow him into the country for cancer treatment. In reaction, on November 4, 1979, radical Islamist students took over the U.S. embassy in Tehran and seized more than fifty hostages with backing from Khomeini, who used the crisis to consolidate his power at the expense of secular and left-wing rivals.
For the next fourteen months, Carter tried one measure after another to get the hostages released, from freezing Iranian assets in the United States to attempting to begin negotiations to breaking diplomatic relations, without success. In April 1980, he tried to use military force to free them, but that failed too, when helicopters on the mission suffered mechanical problems. Ultimately, the shah’s death, the outbreak of war between Iran and Iraq, and the U.S. presidential election set the stage for a resolution of the crisis. The Iranians, eager for funds for their war and perhaps concerned with what might happen if Reagan got elected president, finally negotiated an arrangement with the Carter administration in which the hostages were released—just after Carter left office—in return for unfreezing Iranian assets in the United States.
The Soviet invasion of Afghanistan, the Iranian Revolution, and the hostage crisis vividly demonstrated the changed circumstances the United States faced in the Near East. It no longer could depend on allies and surrogates, like Britain and Iran, to protect its interests in the region, on which it had become increasingly dependent for oil. It would have to act on its own. In his 1980 State of the Union address, Carter declared, “An attempt by any outside force to gain control of the Persian Gulf region will be regarded as an assault on the vital interests of the United States of America, and such an assault will be repelled by any means necessary, including military force.” Thus began a military buildup in the Gulf region, the declared possibility and eventual actuality of direct U.S. military intervention, and the recommitment to a path of development dependent on cheap, plentiful oil. Yet in spite of his bellicose language and military buildup, Carter suffered politically from keeping the hostages at the forefront of his foreign policy while proving unable to get them home, and from the failed rescue effort that seemed to confirm the impotence and ineptitude of the American military in what looked like an increasingly dangerous and hostile world.
Limiting Government
The inability of Washington to solve big problems the nation faced allowed efforts to limit the size and functions of government to gain momentum. Some of the push came from business interests that sought to circumvent democratic channels, where their power had become constrained. Some of it came from grassroots activists, as upset by the cost of government as by what it did.
The Ford and Carter administrations took modest steps back from the New Deal–Great Society conception of state function. But a much more frontal attack on the idea of the welfare state took place in cities and states hit hard by the recession of the mid-1970s. Municipalities found themselves facing greater needs for social services at the very moment when their tax bases were deteriorating as a result of the economic downturn, deindustrialization, and suburbanization. Efforts to hold down the costs of municipal employee wages and benefits led to labor conflict and disruptive strikes, including a 1974 walkout by Baltimore police, the first significant law enforcement strike since 1919. In New York and Cleveland, bankers and conservative politicians took advantage of severe fiscal problems to attack social democratic policies and populist politics.
New York City had provided a national model for an expansive, liberal notion of government in the decades after World War II. Government services included a large, free university system; a large public hospital system; a large mass transit system; public housing projects with over a half million residents; and a large, relatively generous welfare system. In addition, the city had tens of thousands of apartments in nonprofit cooperative housing projects and extensive nonprofit health insurance programs. In effect, a kind of municipal social democracy had been built.
But New York’s large public sector, welfare costs, and Medicaid expenses proved financially unsustainable on a municipal basis. During the late 1960s and early 1970s, the city ran up massive debt as the cost of the services it provided outstripped its revenues. During the winter of 1974–75, the market for New York City bonds collapsed when investors grew fearful that the city would be unable to repay what it had borrowed.
For some financiers and conservative politicians, including Gerald Ford and his secretary of the treasury, William Simon, a former bond trader who had helped New York build up its debt, the city’s fiscal woes provided an opportunity for an attack on the welfare state and public employee unions. New York’s plight confirmed their conviction that liberal attitudes and government programs threatened the economic and moral health of the nation. Many conservatives shared Simon’s beleaguered sense that capitalism and the market were in imminent danger from growing government regulation, welfare-state measures, transfers of wealth, and an intellectual elite committed to social democracy or socialism.
New York State and the federal government came up with a series of jerry-built plans to refinance New York City’s debt, allowing it to remain technically solvent. But as a condition for refunding, federal officials and business leaders appointed by the state’s Democratic governor to new oversight and finance agencies demanded that the city lay off tens of thousands of workers, defer wage increases called for by union contracts, start charging tuition at its university system, cut municipal services, raise transit fares, and reform budgeting and fiscal procedures. In doing so, the coalition of financiers and politicians that all but wrested control of New York City from its local elected leaders showed that the unthinkable could be done, or at least partially accomplished: the half-century-long movement toward expanded state function and social entitlements could be reversed.

Conservatives did not transform New York as fully as they hoped. They failed in their efforts to eliminate rent control—a holdover from World War II price controls—or radically reduce municipal worker pay and benefits, as the labor movement and community groups mobilized mass resistance. But they succeeded in ratcheting down public expectations of what government would provide and forcing austerity on New York’s working class. With public services slashed in the middle of a severe recession, New York became a grim place to live. Streets grew filthy, roads literally crumbled, crime shot up, subways broke down frequently, libraries opened only a few days a week, and schools crammed children into overcrowded classrooms and eliminated art, music, and sports.
In Cleveland, a clash between local business leaders and a liberal city government led to an actual default, though short-lived. Cleveland’s mayor, Dennis Kucinich, elected in 1977 as a populist (when only thirty-one years old), made good on his campaign promise to end tax abatements for businesses. He also refused to sell Cleveland’s small municipally owned electric company to a much larger commercial utility seeking to eliminate its rival. Financial claims by the utility against the city deeply strained its budget. Local bankers told Kucinich they would not roll over short-term city debt unless he agreed to sell the city-owned system. When Kucinich stood his ground, Cleveland became the first major city since the Great Depression to default on its debt. Kucinich rallied public support and won a referendum backing the retention of the power company and raising income taxes to enable the city to become solvent. But enough damage had been done to pave the way for Kucinich’s ouster in the November 1979 election by Republican George Voinovich.
In California, government cutbacks came as a result of a tax revolt. In 1953, the average American family paid less than 12 percent of its income in federal, state, and local taxes, but by 1977 that had risen to over 22 percent. Much of the rise, especially in the 1970s, stemmed not from explicit legislative decisions but from the impact of inflation. Inflation pushed families into higher tax brackets for their federal income taxes and in some cases for state income taxes too. It also drove up property taxes, especially in places like California, where a booming real estate market combined with the general inflationary climate to balloon the value of houses. As assessments soared, so did the property taxes based on them.
Episodic protests against high property taxes had taken place in various parts of California during the 1950s and 1960s, with limited success. But changed conditions during the 1970s created more fertile ground. With rising taxes, a stagnant economy, and soaring prices pressing family budgets, the antitax sentiment built into the DNA of the nation heated up. The accumulation by the California state government of a $4 billion surplus rubbed salt into wounds, making the growing tax burden seem unnecessary.
In 1978, Howard Jarvis, a veteran antitax campaigner, got a tax reduction referendum measure, Proposition 13, put on the ballot. It called for capping local real estate taxes at 1 percent of assessed value, scaling back assessments to their 1975–76 level, and, except at the time of sales, limiting increases in assessments to 2 percent a year. It also mandated a two-thirds majority in the legislature to raise state taxes and two-thirds voter approval to raise local taxes. Using populist rhetoric, Jarvis, who had close ties to Los Angeles–area apartment building owners, orchestrated a well-funded campaign in support of the measure.
In spite of opposition to Proposition 13 from most of the political establishment (Republican as well as Democratic), the labor movement, civil rights groups, the Chamber of Commerce, and many large corporations (small businesses tended to support it), it passed by a two-to-one margin. The measure won support from liberal and conservative voters up and down the economic ladder. Among major constituencies, only African Americans and public employees cast a majority of their votes against it.
Proposition 13 had a more conservative cast than the language used to promote it suggested. Earlier failed tax limitation efforts in California and Massachusetts had targeted the tax burden on middle- and lower-income families, proposing to shift more of the tax load to businesses and the well-off. By contrast, nearly two-thirds of the tax relief Proposition 13 provided during its first five years went not to homeowners but to landlords, farmers, and owners of commercial and industrial property.
Proposition 13 proved contagious, sparking a round of tax-cutting efforts elsewhere. Idaho and Massachusetts voted in steep property tax cuts, in the latter case with strong backing from high-technology companies, which funded a referendum campaign. Voters in over a dozen other states passed more modest tax limitation proposals. Elsewhere, dozens of state legislatures, hoping to avoid voters taking things into their own hands, preemptively reduced income and property taxes.
The antistate animus reflected in the tax revolt was in part fostered by the bifurcated system of social provision that had developed since World War II, with some people receiving benefits from private entities and others from the government. Homeowners resented paying taxes to support public housing and rent subsidies. Employees who received health insurance and other benefits through their jobs resented paying taxes to finance parallel state benefit systems, which in some cases provided more generous benefits. Southern white parents who put their children in private schools to avoid racial integration resented paying taxes for public schools they did not use and pushed for tax credits or government vouchers for private education, an idea many Catholics supported too. By contrast, government programs that had near-universal coverage, like Social Security and Medicare, remained largely exempt from popular antistate sentiment.
The antitax campaigns of the late 1970s, like the New York fiscal crisis, worked to delegitimize government, or at least an expansive notion of state function. The essential argument of the fiscal regulators in New York and the antitax rebels elsewhere was that the public good would be best served by less government, not more. Both suggested that government had become riddled with inefficiencies, self-serving leaders, and entrenched interest groups. The relatively modest cutbacks in state and local services that occurred in most states after tax limitation measures (California had its huge surplus to spend down and other states raised miscellaneous fees and taxes to make up for property and income tax reductions) seemed to give the lie to claims by liberal opponents of tax reductions that they would have disastrous consequences, while suggesting that antistatist outsiders like Jarvis, with common sense rather than technical expertise, knew more about how the world really worked than the established political class and credentialed policy specialists. By merging the interests of large property holders and businesses with middle-class families pressed by hard times and resentful about paying taxes to help the less well-off, the tax revolt helped lay the basis for a broader assault on the New Deal order in the years to come.
Crime and Punishment
At the same time that pressure grew to lessen the presence of government in everyday life through reduced taxes, regulations, and social benefits, calls grew for increased use of state power to deal with criminality. Crime had been an important political issue since the late 1960s, as the rate of violent offenses more than doubled between 1965 and 1975. Liberals sought to limit access to firearms as a way to check crime, winning a federal gun control law in 1968. But efforts at further gun legislation foundered in the face of a reaction from conservatives and gun owners, who saw gun control as an unconstitutional limit on the rights of law-abiding citizens.
More incarceration, rather than gun control, emerged as the main approach to crime control. Conservatives rejected the widespread liberal analysis that stressed social circumstances in explaining criminality and spurned therapeutic approaches to dealing with lawbreakers. Instead, they promoted incarceration as a means of punishment, retribution, and prevention, getting dangerous people off the streets and keeping them off. So did the victims’ rights movement—another manifestation of the rights revolution—which pressed for longer prison sentences and harsher prison conditions. Liberal criminologists and public officials inadvertently bolstered the movement when they began to doubt the curative assumption behind the common practice of indeterminate sentencing. State legislatures took up their call for fixed sentencing, but in a political climate in which the population at large, fearful and angry, saw in the application of state power an emotional outlet and a path toward greater personal security, they ended up extending sentencing norms rather than reducing them.
The combination of more crime, more arrests, and longer sentences swelled the prison population. After a half century during which the incarceration rate had fluctuated only moderately, in the mid-1970s it began to rise steeply. It kept rising until the end of the century and beyond, even after crime rates dropped. Mass imprisonment became a distinguishing feature of the United States, as its incarceration rate came to far exceed that in other industrial nations (including the Soviet Union). The bulk of the increase in the prison population consisted of nonviolent offenders. Many were there for violating stiffened drug laws. Black men made up a disproportionate share of the newly incarcerated, in part because they were more likely to be arrested for drug violations than white men (though the two groups had roughly the same rate of drug use) and when convicted were typically given much longer sentences than whites.
The United States became an outlier in its use of capital punishment as well. Executions became much less common during the two decades after World War II, especially outside the South. Then, as a result of a series of Supreme Court decisions that deemed unconstitutional various procedures used in capital cases, no criminals were put to death between 1967 and 1977. For a while it looked like the death penalty might disappear. But once the Supreme Court made it clear that under some circumstances capital punishment would be constitutional, thirty-five states passed new death penalty laws. Proponents and critics of capital punishment heatedly debated its value as a deterrent to crime, but support for the death penalty did not rest strictly or even mainly on instrumental grounds. For many of its supporters, it served as an expressive act, a symbolic counterweight to a perceived breakdown of moral standards and respect for authority.
Capital punishment resumed with the 1977 Utah firing squad execution of murderer Gary Gilmore. That same year, an execution in France became the last use of capital punishment in Western Europe. As abolition became the norm in noncommunist industrial countries (with Japan the other exception), in the United States the annual number of executions grew through the end of the twentieth century. As in the case of incarceration, at a moment of political transition, economic travail, and cultural uncertainty, the United States embarked on a road very different from other industrial nations.
Sex and Sexuality
Some of the most wrenching political debates of the 1970s revolved around gender roles and sexuality. As the revolution in the status of women swept forward and sexual mores continued to change, adjusting the law to new attitudes and norms proved extraordinarily contentious. Many religious leaders and political conservatives began the decade seeking to counter changes in sexual behavior and gender norms by encouraging their followers to maintain families structured around a stay-at-home mother and shun permissive cultural standards. Their pleas proved puny weapons in the face of wage stagnation and inflation that made it increasingly difficult for one-wage-earner families to survive and a commercial mass culture perfectly happy to profit by endorsing greater sexual freedom and the achievement of satisfaction through consumption. Frustrated, activists averse to state power in other realms began looking to government to regulate sexuality, reproduction, and the family.
The fate of the Equal Rights Amendment (ERA) demonstrated how divided the country had become over women’s rights and sexuality and how quickly political dynamics changed. After a substantial majority of the states endorsed the ERA, in 1974 the push for ratification stalled. Phyllis Schlafly, a longtime conservative leader, spearheaded opposition to the amendment. At first she was a lonely voice against what was widely seen as inevitable ratification, but established conservative groups and new grassroots anti-ERA organizations (many all-female) joined the effort. Conservative religious denominations, notably the Mormons and Baptists, joined in too.
The claim by opponents of the ERA that it might lead to drafting women into the military, abortion on demand, unisex bathrooms, homosexual marriages, and other unwanted social changes resonated with many women and men disturbed by the rapid transformation of gender roles. Over the course of the 1970s, the number of divorces per capita continued to rise. By the mid-1970s, families consisting of a stay-at-home mother living with a working husband and children constituted less than a quarter of families. ERA opponents feared that the amendment would legitimize the changes in family structure and gender relations that had occurred and further undermine social and moral structures built around the notion of different roles for men and women. Schlafly framed her opposition to the ERA as a defense of women’s existing legal protections. She claimed that if the amendment won ratification, it would relieve husbands of the obligation to support their families or to pay alimony after divorce, a worrisome possibility to millions of housewives who depended on men for support and for whom taking a poorly paid job in a labor market stacked against women held little attraction.
ERA opponents did not have to win majority public support or even majority support from the nation’s state legislators to block it. All they had to do was keep thirteen legislatures from ratifying it, which in most states meant wooing just a third of the members, since typically a two-thirds vote was required. After 1974 only a trickle of additional states ratified the ERA, leaving it three states short of the needed three-quarters when the prescribed ratification period ended in 1979. Except for Illinois, all the states that did not ratify the ERA were either in the South or Rocky Mountain states with large Mormon populations.
Opponents of the ERA benefited from the backlash to the 1973 Roe v. Wade decision, which broadened into a general antifeminist offensive. Catholic housewives—with support from their church—played a leading role in the “Right to Life” movement, which sought to limit and recriminalize abortion. Adopting tactics made popular by the civil rights movement, they picketed abortion clinics and engaged in civil disobedience, joined over time by evangelical Protestants, who took up the issue as well.
The debate over abortion proved contentious and prolonged. Abortion opponents won an important victory in 1976 with congressional passage of the Hyde Amendment, which banned the use of Medicaid funds to pay for abortions, restricting the ability of low-income women to terminate unwanted pregnancies. But both sides dug in for a war of position, in which electoral politics became a central front. At stake were not simply abortion laws but notions about the definition of life, women’s rights, their proper role in society, and the extent to which it should be defined by motherhood.
Debate over gay rights similarly provoked intense emotion and sharp divisions because it concerned not only specific legal issues but also broader notions about sexuality, morality, family, and gender roles. Gay activists, adopting the rights revolution model, pressed in the 1970s for policies and statutes to end discrimination on the basis of sexual orientation. Their initial success led conservatives to campaign against such equal rights mandates and, more broadly, against the normalization and moral legitimation of homosexuality.
As with the anti-ERA movement, the anti–gay rights movement gained national attention through its embrace by an effective female leader, in this case pop singer and former beauty pageant contestant Anita Bryant, who was the national spokesperson for the Florida orange juice industry. When in 1977 Dade County, Florida (which included the city of Miami), passed an ordinance prohibiting discrimination on the basis of “affectional or sexual preferences,” Bryant launched a campaign for its repeal that received national media coverage. She portrayed homosexuality as sinful and homosexuals as potential child molesters seeking to recruit children to same-sex sexuality. She called her organization “Save Our Children.”
Bryant’s campaign came at a moment of increased public and government concern about child abuse. Expanding notions of children’s rights, liberal and feminist critiques of authoritarian and patriarchal family relations, and the more open display of sexual images of children (part of the general spread of pornography) led to demands for child protection from feminists, lawmakers, journalists, and parents. Bryant’s campaign wedded fears that changing moral norms threatened children to more traditional religious opposition to homosexuality. When voters repealed the Dade County ordinance by a more than two-to-one margin, they made opposition to gay rights a national movement and a mobilizing issue for conservatives, while sparking a new wave of political organizing by gay activists. Voters in Wichita, Kansas, St. Paul, Minnesota, and Eugene, Oregon, passed ballot initiatives repealing gay rights laws. In California, a proposition on the 1978 ballot, the Briggs Initiative (Proposition 6), supported by Bryant, called for firing gay and lesbian public school teachers and banning teachers from speaking favorably about homosexuality in the classroom. A campaign spearheaded by openly gay San Francisco supervisor Harvey Milk led to its defeat. (Just weeks later Milk and San Francisco mayor George Moscone were assassinated by a homophobic former supervisor, Dan White.)
In deploying state power to impose moral norms, conservatives took the notion that “the personal is political” much further than the women’s movement, which had originated the slogan. In the process they found a strategy for widening the narrow base for their economic program, which favored the well-off, by winning over voter groups deeply concerned with what came to be called social issues. The coalitions that emerged in the struggles against the ERA, abortion, and gay rights pointed the way toward a reconfiguration of electoral politics at the end of the 1970s, after a decade during which the political system had proved largely incapable of dealing with the most serious problems Americans experienced. Meanwhile, outside the political arena, in the private economy, a profound restructuring was beginning that would transform the country in the decades to come.