4 The Debacle

“Riots, skyrocketing crime, tax problems that multiply raise this question: Can the big cities of this country ever stage a comeback?” In 1967 U.S. News & World Report posed this query, and its tentative conclusion was not optimistic. “The crisis of the big cities, coming to a head in recent years, continues without let up,” the magazine observed. “And no real solution appears in the immediate future.”1 Many agreed with this assessment. In 1968 the mayor of Saint Louis admitted, “We just can’t make it anymore,” and his counterpart in New Orleans likewise noted, “The cities are going down the pipe.”2 Such rhetoric was commonplace in the late 1960s and 1970s. Urban America was definitely on the skids, and its plunge to oblivion seemed irreversible. This was the age of urban crisis when journalists and social scientists busily analyzed the ailing remnants of the city and periodically issued dire prognostications. The upbeat reports of renewal and revitalization so common in the 1950s seemed hopelessly naive by the 1970s, and according to all reports the American city was in shambles.

In fact, the worst fears of 1945 were being realized. Racial violence flared, and the Detroit riot of 1967 exceeded that of 1943 in casualties and damage estimates. All the human rights commissions and interracial committees had not been able to bridge the racial fissure dividing the black–white city. Moreover, the efforts of big-city mayors and downtown boosters had not been able to maintain the single-focused metropolis. By the 1970s, central cities were no longer central to the lives of many metropolitan Americans, and the sense that all metropolitan-area residents were part of one common city with common social, cultural, economic, and political interests was diminishing. A 1978 New York Times poll of 3,500 suburbanites in the New York City area found that 54 percent did not “feel part of the general New York City area,” only 24 percent thought that events in the city had a “lot of impact” on their lives, and 25 percent said that they never went into the city. The Times concluded that these residents “no longer feel themselves subordinate to New York,” and the suburbs had “surprisingly limited ties to the metropolitan core.” Declaring victory for the dreaded forces of decentralization, the Times observed: “Suburban residents have established their own institutions and go about their lives in an increasingly separate world. They see their future even further from the city, rather than closer to it.”3

Thus the disintegration of the city was readily apparent; the metropolitan community was clearly dissolving. In the late 1960s and 1970s, some optimistic entrepreneurs attempted to create new communities that would restore some semblance of social and cultural unity among metropolitan Americans. But these efforts foundered. Just as central-city leaders of the 1940s had feared, the divisions in the black–white city had worsened, and a new decentralized way of life had triumphed.

Central-City Decline

   Indicative of the triumph of decentralization was the decline in the population of older central cities. During the 1960s, the outward flow of residents accelerated, with the population of Saint Louis dropping 17 percent, that of Cleveland and Pittsburgh falling 14 percent, and Cincinnati, Minneapolis, and Detroit each losing 10 percent of its population. These unfortunate northern hubs were becoming known collectively as the Rust Belt, a swath of decaying metropolitan space stretching from New England through the Midwest that was losing people and business to the surging Sun Belt areas of Florida, Texas, Arizona, and California. Benefiting from the advent of air-conditioning, such warm Sun Belt cities as Houston, Dallas, and Phoenix attracted thousands of newcomers, in sharp contrast to their northern counterparts. They boomed as the Rust Belt busted. But nationally, the prevailing trend benefited small suburban municipalities, not the traditional central-city behemoths. And in the Sun Belt as well as the Rust Belt, central cities had to cope with the increasingly debilitating problems of crime, racial conflict, and clashes over such public services as policing and schools. Throughout the United States, the suburbs seemed ever more appealing, and in 1970, for the first time in American history, the nation’s suburbanites outnumbered central-city dwellers. Whereas 46 percent of American metropolitan residents lived in central cities, down from 59 percent in 1950, 54 percent lived beyond the central-city boundaries in suburbia. By the 1970s, the United States was a suburban nation, with the majority of its metropolitan residents beyond the reach of central-city tax collectors and schools.

The decline in urban population continued during the early and mid-1970s. Between 1970 and 1977, the nation’s central-city population dropped 4.6 percent, whereas the number of suburban residents rose 12 percent. As in the 1950s and 1960s, the outward migrants were disproportionately middle class, the very people necessary to pay the city’s tax bills and fill its retail coffers. In 1970 the per family income in the nation’s central cities amounted to only 85 percent of the per family figure for the suburbs; by 1977, this had fallen to 82 percent. Female-headed households, with no husband present, were increasing sharply in the central cities, up 27.5 percent between 1970 and 1977, but the number of traditional husband–wife families declined 7.6 percent. Since female-headed households were disproportionately poor, this again did not bode well for central-city controllers or retailers. Likewise, between 1970 and 1977, the disproportionately poor black population of the central cities rose 4.2 percent, as compared with a drop of 8.1 percent in the number of white residents.4 Increasingly, the central city was a preserve for poor blacks and single mothers struggling to survive. It was the focus of life for the poor but was becoming alien territory to the affluent.

During the late 1960s and the 1970s, as during the earlier postwar years, the centrifugal flow of affluence was evident in the declining retail fortunes of the central cities. Business was discouraging in the late 1950s and early 1960s; now it was disastrous. Between 1967 and 1977, central business district retail sales, adjusted for inflation, fell 48 percent in Baltimore, 44 percent in Cleveland and Saint Louis, 38 percent in Boston, and 36 percent in Minneapolis. In 1967 the downtown flagship outlets still accounted for 46 percent of department store business, with outlying branches contributing 54 percent; in 1976 the figure for downtown department stores was down to 22 percent, with branch sales accounting for 78 percent.5

The plight of retailers along Baltimore’s traditional shopping thoroughfare, Howard Street, was representative of the situation in many downtowns. Baltimore’s downtown department store sales fell from $93 million in 1972 to $68 million in 1975. Responding to this drop, in 1976 Hutzler Brothers department store eliminated 20 percent of its Howard Street space, explaining that “for some time” the sales at a branch in suburban Towson had outpaced those at the downtown flagship. Hutzler’s president lamented: “Downtown business is far from good, far from easy…. I don’t know who is really doing a job [of bringing back sales].” The neighboring Hochschild Kohn department store likewise closed its sixth floor, thus eliminating 17 percent of its downtown sales space. Then, in January 1977, Hochschild Kohn announced that it would close its downtown store the following July.6 In February, the president of the Stewart and Company department store added to the gloom when he warned that unless the city embarked on a massive downtown revitalization project “in the very near future, we cannot guarantee our continued presence in the center city.” Meanwhile, downtown department store sales for the first half of 1977 were 15 percent lower than those for the same period of 1976.7 In 1970 a marketing report on the suburbanization of retail trade summed up the new reality in America: “By the end of this decade the suburbs will be central and the central cities peripheral.”8 The success of Hutzler’s Towson branch and the ongoing decline of Howard Street were visible proof of this transformation.

Downtown theaters and entertainment venues were even harder hit than department stores. Suburban mall theaters were supplanting the lavish downtown movie palaces that had dominated first-run film business in 1945. With ample parking and safe locations in the white, middle-class suburbs, the mall movie houses enabled metropolitan Americans to enjoy the latest Hollywood fare without the hassle of central-city traffic and beyond the threat of inner-city muggers and weirdos. Enhancing downtown’s reputation as off-limits after dark was the fact that its remaining theaters increasingly specialized in pornography and violent films aimed at young black audiences. Moreover, a growing number of vacant downtown storefronts were attracting porn shops and sexually explicit peepshows. The once classy hubs of American entertainment were thus becoming the focus of a sleazy porn trade that offended and threatened many middle-class Americans.

A nationwide decline in censorship opened the floodgates for this wave of pornography in one city after another. Boston’s Combat Zone was the preeminent porn strip in New England, and Minneapolis’s Hennepin Avenue offered the latest in X-rated offerings. In Chicago, there was an outcry against the pornographic assault on the downtown Loop. “Just look at some of the creeps prowling the streets in daylight, attracted by hardcore porno films, X-rated garbage, and violence-oriented black exploitation movies,” observed one journalist. “These scummy joints draw the kind of negative people who make honest folks uneasy, if not downright petrified.”9 The Chicago Tribune’s architecture critic remarked: “The feeling among some that these places are filled with perverts ready to commit mass rape cannot be laughed away.” The theaters, however, were “only part of this change. Shabby little cut-rate shops and lunch counters, pornographic peep shows, and stores offering books catering to every sexual interest are also in the downtown mix.”10

Yet even Chicago did not have enough theater-going “perverts” or “negative people” to keep all its downtown theaters in business. Between 1971 and 1977, six of the sixteen downtown movie houses closed, with a total loss of 8,697 seats, or 36 percent of the 1971 capacity. One of the shuttered theaters was the Oriental, where before its closing a Chicago film critic claimed to have found “foot-high trash from God knows when filling every row” of its infrequently used mezzanine. He also wrote of another “rodent palace downtown” where he engaged in a heated debate with a rival critic over whether a mouse or a rat had crossed the aisle during a film showing.11

Nowhere was the outcry against the blight of the entertainment district more pronounced than in New York City. For decades the focus of legitimate theater in the United States, famed Times Square was well on its way to becoming a porn zone by the early 1970s. In 1972 the New York Times editorialized: “Few things make a New Yorker feel worse than watching American and foreign tourists here walk past the Times Square porno-peepshows and dirty bookstores. The impulse is to shout, ‘This is the under life, not the real city.’”12 A Broadway producer complained of “the swiftness of the area’s relentless descent into squalor” and claimed that it threatened the future of American theater. Sixty-two prominent Broadway performers signed an appeal to the mayor, and there were even mutterings about a general theater strike to protest the city’s unwillingness or inability to cleanse Times Square of its offensive porn shops and sexually explicit offerings.13 By 1978, the city was formulating plans for revitalizing the theater district, but the area’s growing population of winos added another blemish to its reputation. The owner of a hotel complained: “We’ve lost all our tourist and school-group business because people are scared of the drunks hanging around here.” Another disgruntled business owner griped: “One of those drunks had the nerve to unfold a camp chair out front and start sunning himself.”14 No matter whether the offense took the form of porn shops or winos, the message was the same: New York’s premier entertainment district was no longer as appealing to a middle-class clientele as it had been in 1945.

Meanwhile, the central city was losing its preeminence as the focus of metropolitan employment. The outward migration of manufacturing accelerated, eliminating working-class job opportunities. Since the end of World War II, growing reliance on trucking had encouraged manufacturers to relocate to suburban sites adjacent to superhighways, where they could build sprawling single-story plants better suited to assembly-line production than the existing multistory factories in the inner city. In the 1960s and 1970s, however, the pace of abandonment quickened, leaving a gloomy assortment of empty mills, especially in the older cities of the Northeast and Midwest. Between 1947 and 1967, New York City lost 175,000 manufacturing jobs, but from 1967 to 1977 manufacturing employment dropped an additional 286,000. From 1950 to 1967, the number of manufacturing jobs in Boston fell 21 percent; between 1967 and 1977, the loss was 36 percent. Philadelphia lost 40 percent of its manufacturing employment between 1967 and 1977, and the rate of decline in New York City, Chicago, Baltimore, Pittsburgh, and Buffalo was between 30 and 36 percent.15 Shuttered factories were increasingly familiar sights in America’s urban hubs, testifying to the industrial decline of the central city.

An increase in office employment compensated to some degree for the outflow of factory jobs. In fact, the thriving central-city office sector offered the best evidence that downtown was not headed for oblivion. Responding to the demand for downtown office space, developers erected scores of skyscrapers that literally overshadowed the high-rises of the past. In the early 1970s, New York City’s 110-story, two-tower World Trade Center rose more than 100 feet higher than the city’s previous record holder, the Empire State Building. During the late 1960s and early 1970s, Chicagoans constructed three new towers over 1,000 feet in height, the tallest being the Sears Tower, completed in 1974. Rising 1,450 feet and accommodating a daily population of 16,500, the Sears Tower was the tallest office building in the world and visible evidence that the downtown Loop remained a vital hub of business. In Minneapolis, the IDS Center, completed in 1973, rose 57 stories and surpassed the city’s once preeminent Foshay Tower by 300 feet. In 1972 San Francisco’s 853-foot Transamerica Pyramid opened to tenants and soon became a signature landmark of the city (figure 4.1). Since 1927, Los Angeles’s city hall, rising 454 feet, had been the tallest building in the southern California metropolis. But in 1968, the Union Bank Building surpassed it, and in the early 1970s the twin towers of the Atlantic Richfield and Bank of America buildings topped off at almost 700 feet. Finally, in 1973 and 1974, the United California Bank Building rose 62 stories above the city, soaring almost double the height of the once preeminent city hall. Long regarded as a moribund relic in a city known for pioneering decentralization, downtown Los Angeles seemed to be reestablishing itself as a place of commercial importance and as the office hub of southern California.

FIGURE 4.1 New skyline of downtown San Francisco, with the Transamerica Pyramid in the center, 1973. (San Francisco History Center, San Francisco Public Library)

Yet the soaring skyscrapers told only part of the story. In fact, many office jobs were leaving the central city as corporations followed the earlier lead of General Foods and General Mills and established headquarters in the suburbs. From 1967 to 1974, the number of Fortune 500 headquarters in New York City dropped from 139 to 98, testifying to corporate America’s rejection of the nation’s largest metropolis.16 Leading the exodus were Pepsico, American Can, and Bohn Business Machines, which announced their impending departures during a single week in February 1967. Pepsico was planning a corporate campus on the former grounds of the Blind Brook Polo Club in suburban Westchester County, and American Can was seeking to relocate on 228 acres in the upscale community of Greenwich, Connecticut. The same week, American Metal Climax and Union Camp admitted that they were considering leaving their Manhattan headquarters for the suburbs. Moreover, a leading location consultant claimed that fourteen additional corporations, with a total of 11,500 headquarters employees, were pondering a move from Manhattan to the metropolitan fringe. Meanwhile, Flintkote was already preparing to move from the city to a 35-acre tract in Westchester, and Corn Products, Inc., was planning to transfer to Englewood Cliffs, New Jersey, in the fall of 1967.17

Underlying this ominous flight from the nation’s chief hub were numerous complaints about the problems of doing business in the city. High taxes, the misery of commuting, crime, and the need for more office space all played a role. “They all add up to the same thing,” the location consultant concluded; “New York is not a happy place to be.” Perhaps most disturbing was the consultant’s claim that “complaints regarding clerical workers in New York City are universal.”18 Pepsico cited the clerical labor problem as its chief reason for moving, and the New York Times noted the corporations’ desire to tap “the employment potential of young housewives who are eager for office jobs close to home.” Corporate America needed middle-class women with clerical skills or the ability to learn such skills, and those women had moved to suburbia. Thus the Times lamented the “lessened white-collar reserve within the five boroughs” owing to “the long-term exodus of high- and middle-income families to suburban communities.”19 Just as the middle class was abandoning downtown movie theaters and department stores, it was also slipping beyond the easy reach of downtown corporate offices. Confronted with a choice between inner-city workers who struggled to fill out application forms and educated, middle-class suburbanites, the nation’s largest corporations were opting for the latter.

Decaying central-city neighborhoods reinforced the dismal image of the urban core. By the late 1960s and 1970s a wave of abandonment swept inner-city neighborhoods as even the poorest Americans shunned them. Landlords no longer made repairs, collected rents, or paid taxes, and vandals stripped structures of plumbing fixtures, piping, hardware, and any other relic that could bring in a few dollars. Once solid structures that had earned lucrative rents were cast aside as worthless. The value of many inner-city blocks had dropped to nothing.

This phenomenon was especially evident in the older central cities of the Northeast and Midwest. In 1975 there were an estimated 62,000 abandoned dwelling units in Detroit and 33,000 in Philadelphia (figure 4.2). During the 1970s, wreckers demolished 15 percent of the housing in Saint Louis, with a loss of 4,000 units each year between 1970 and 1976.20 In the most derelict neighborhoods of Saint Louis, approximately 16 percent of the buildings were abandoned.21 From 1966 to 1974, the city of Cleveland appropriated over $4 million for the demolition of cast-off structures as an average of three dwelling units were abandoned each day.22 Chicago’s Woodlawn district was labeled “the zone of destruction.” In the 1960s, its population dropped 36 percent, from 81,000 to 52,000, and by 1973 the city was bulldozing 500 housing units in the Woodlawn area each year, with a backlog of units slated for destruction mounting to 1,500.23 But Woodlawn was not the only Chicago district being reduced to rubble. In the two-month period between September and November 1970, 2.6 percent of the dwelling units in the North Lawndale district were abandoned.24 The number of dwelling units in Chicago’s East Garfield Park area plummeted from 20,353 in 1960 to 10,933 in 1980.25 Engulfed by the plague of abandonment and destruction, half of East Garfield Park disappeared over the twenty-year period.

FIGURE 4.2 Sign of the times: the shell of a house in central Detroit, 1974. (Walter P. Reuther Library, Wayne State University)

As was so often the case in the late 1960s and 1970s, the most dire news was from New York City. From 1965 to 1968, 5 percent of the city’s housing stock, approximately 100,000 units, was abandoned. By 1975, the estimated number of abandoned dwelling units in the city had risen to 199,000.26 The most notable concentration of vacant and burned-out structures was in the South Bronx. During the 1970s, commentators frequently described the devastated district as reminiscent of Berlin or Dresden after World War II or London after the Blitz. Yet in the case of the South Bronx, foreign enemies did not wreak the devastation; instead, it came from within. Seeking to collect insurance, landlords paid arsonists to torch their buildings. Realizing that burned-out households rose to the top of the eligibility list for public housing, some disgruntled tenants were also happy to set fire to their apartments. In 1975 two local youths, ten and fifteen years old, admitted responsibility for forty or fifty blazes, having taken on the job of torching the structures for fees of $3 and up.27 Arson seemed to be the district’s biggest business, and by 1974 the number of fires in the Bronx was triple what it had been in 1960, before the onset of devastation. During the 1977 World Series, every baseball fan in America became aware of the borough’s plight when television cameras covering the game in Yankee Stadium shifted from the diamond and panned the blazing panorama of the nearby South Bronx. “The Bronx is burning,” announced sports commentator Howard Cosell.28 “It isn’t pretty to watch whole communities self-destruct in the heart of the cities of the world’s richest country,” observed an article on the abandonment phenomenon.29 Yet by the 1970s, millions of Americans were watching the ugly spectacle of self-destruction in the nation’s largest city.

Although less dramatic than the burning of the Bronx, the statistics on central-city welfare recipients seemed to tell the same story of urban debacle. Despite overall prosperity and a low unemployment rate, the number of Americans on welfare rose 107 percent from December 1960 to February 1969, with the greatest increase occurring from 1965 to 1969. The sharpest rise in the dependent population was in the nation’s five largest metropolitan areas, with an increase of 300 percent in New York City and 293 percent in Los Angeles County.30 Long deemed the capital of capitalism, New York City was winning an unenviable reputation as “a welfare dumping ground.” In 1960 almost ten times as many New Yorkers were employed as on welfare; by 1970 the ratio of employed to welfare recipients had declined to nearly five to one.31 Just as Times Square no longer appeared so glamorous or Fifth Avenue department stores so bustling, the city’s residents no longer seemed exemplars of the upward mobility resulting from private enterprise. Instead, the nation’s largest city and other American urban centers were developing into hubs of despair and dependence.

Soaring central-city crime rates further testified to a deteriorating way of life. Between 1962 and 1972, the nation’s murder rate doubled. In the early 1970s, southern cities proved especially lethal, with Atlanta winning the title of murder capital of America and New Orleans ranking a close second. In 1974 a mathematician at the Massachusetts Institute of Technology computed that a person born in a major American city and remaining there was more likely to be murdered than a World War II GI was to die in combat.32 A radio ad in Washington, D.C., warned: “Most of us worry about heart attacks, automobile accidents, or cancer. But if you’re a district resident between the ages of 15 to 44, you’re more likely to die by the bullet…. So don’t feel quite so secure if you’ve quit smoking and started wearing your seat belts.”33

The nation’s robbery rate also soared, more than doubling in the short period from 1966 to 1970. In New York City, renowned as America’s mugging capital, the robbery rate far exceeded that of any other city. Describing how she adapted to the reign of robbery, one New Yorker explained that she never carried a wallet, relying instead on “a small change purse with some bus tokens, a credit card and a few dollars in case I meet a mugger.”34 Although many Americans deemed life in New York particularly hazardous, in cities throughout the nation urban dwellers were coping with an enhanced level of crime and disorder. In January 1973, the Gallup Poll reported that “one person in three living in big center-city areas” had been robbed, burgled, mugged, or suffered from vandalism during the previous year.35

Underlying much of the increase in crime and insecurity was a startling rise in narcotic drug use. During the 1950s, New York City’s medical examiner reported an average of around a hundred narcotic-related deaths each year. By the close of the 1960s this figure had risen to over 1,200. Between 1963 and 1970, the estimated number of heroin users in Boston and Atlanta soared tenfold. Young males were especially vulnerable. In Washington, D.C., over 13 percent of males born in 1953 became heroin addicts, and in some parts of the nation’s capital the figure rose to about 25 percent.36

Mounting fear of crime and violence further alienated middle-class Americans from the central cities. Racial prejudice underlay some of the fears, as whites felt threatened by the growing number of central-city blacks. Yet African Americans themselves were not exempt from the prevailing anxiety about an increasingly violent city. Explaining why whites avoided Chicago’s downtown at night, a city official noted that an overwhelming majority of Loop filmgoers were black. “The white is intimidated by seeing a group of 24 black teen-agers coming toward him on a sidewalk.” But then he added: “The black older guy is also intimidated just by seeing 24 teen-agers.” In fact, in 1975 a Chicago Tribune survey found that only 26 percent of white respondents and 42 percent of black respondents would go to the Loop at night.37 In other words, a majority of both races would not venture into the city’s center after dark. Moreover, in another Tribune survey conducted at the same time crime ranked as the chief community concern among both black and white Chicagoans, and African Americans were more likely to have taken protective measures such as installing special locks or alarms.38

Mayoral candidates responded to mounting fears by campaigning as crime fighters. In both Philadelphia and Minneapolis, voters elected white former police officers who projected a tough-guy image to the mayor’s office. In 1973 Tom Bradley, a black veteran of twenty-one years on the police force, won the mayor’s race in Los Angeles after a campaign in which he spoke proudly of his law-enforcement background. The same year, African American Maynard Jackson successfully vied for Atlanta’s highest position, promising to crack down on criminals and “bust the pusher.” A black opponent for the Atlanta mayor’s office warned voters: “If nothing is done, Atlanta will be just another big city, a southern version of New York, a city where muggers and robbers control the streets and where downtown is a no-man’s land, where the central city is a battleground where the average man dares not trod.”39

In Atlanta, New York, Chicago, and Los Angeles, crime seemed to be destroying once great cities and forcing changes in the urban way of life. Fewer people were willing to go downtown after dark, and a stroll along the sidewalks of New York was a journey of fear rather than pleasure. Central Park was a place to be mugged, and Atlanta was a prime site for murder. “There is no keen and precise estimate of the extent to which fear of crime changes people’s behavior,” observed one student of the subject. “But it’s enormous: they take taxis instead of walking; they barricade their houses; they construct medieval fortresses; they close up their cities tight after dark.”40 Victimized by thugs and thieves, America’s central cities were becoming prisons of fear.

Culminating the downward spiral of the central cities was the fiscal crisis of the second half of the 1970s. Municipal governments in the older hubs moved perilously close to bankruptcy and became the financial basket cases of the nation. Unsafe, decaying, and abandoned, America’s urban hubs ran out of money and faced the ultimate humiliation of fiscal debacle.

New York City’s brush with bankruptcy dramatically demonstrated the weakness of central-city finances. For years, the city had borrowed to pay operating expenses and resorted to bookkeeping gimmicks to disguise its deteriorating financial condition. But by 1975, it had come to the end of its rope. The banks would no longer lend the city money; the nation’s largest municipality was broke. The state of New York intervened and assumed charge of the city’s finances, leading economist Milton Friedman to observe in December 1975: “New York City is now being run by the caretakers appointed by the state of New York. At the moment New York doesn’t have any self-government.”41 One of those charged with reordering the desperate finances of the city warned: “The pain is just beginning. New York will now have to undergo the most brutal kind of financial and fiscal exercise that any community in the country will ever have to face.”42 By the close of 1975, New York City was, then, a ward of the state, and the new guardians of the incompetent city were dedicated to forcing it into shape.

Some observers, however, felt that New York City would never recover. “New York is not quite dead, but death is clearly inevitable,” commented economist Robert Zevin. New York had lost “those things which define a city’s vitality: the culture and ferment, material pleasures and comforts, exploration and invention, growth of old activities and creation of new ones which serve as positive attractions for immigrants and produce a flow of ideas and products for export.”43 Former city official Roger Starr proposed the planned shrinkage of New York, urging the abandonment of hopeless neighborhoods and the concentration of resources on areas capable of surviving. Attempting to salvage something from the city’s debacle, Starr concluded: “Essentially, planned shrinkage is a recognition that the golden door to full participation in American life and the American economy is no longer to be found in New York.”44 Americans as a whole did not seem too eager to salvage the nation’s largest city. Reflecting the mood of the electorate, President Gerald Ford rejected federal aid to the faltering city. The New York Daily News headline read “Ford to City: Drop Dead.” Early in 1976, an article in the New York Review of Books perceptively observed: “As New York once carelessly discarded its own marginal neighborhoods, so America may have decided that New York itself can now be junked.”45

Few cities had the luxury of gloating over New York’s misfortune, for other hubs were also struggling to meet their expenses. In the fiscal year 1975/1976, Philadelphia recorded a $73 million deficit, and the projected budget gap for 1976/1977 was $100 million.46 In 1974 Buffalo’s finance commissioner seemed about ready to sacrifice his city’s self-government when he remarked, “Perhaps we should just take the charter and the keys, send them to Albany, and say ‘Okay, you solve it. We can’t do any more.’”47 The next year, local banks rescued Buffalo at the last minute, thus staving off a default on the city’s short-term debt. The state of Michigan helped Detroit avoid bankruptcy in the second half of the 1970s. In 1978, however, Cleveland defaulted on its short-term notes, reinforcing that city’s image as a deteriorating remnant of America’s industrial past. Perhaps reassured by discovering a community in worse financial shape than New York City, the New York Times published an article headlined “Cleveland Caught in Long Decline” and pronounced soberly that “Ohio’s largest city is chronically ill.”48

By the second half of the 1970s, this chronic illness seemed of epidemic proportions. Just three decades before, New York City had been the hope of the world, the symbol of America’s triumphant way of life. By 1975, however, some of its neighborhoods were being compared with the bombed-out Berlin of 1945, and commentators declared it no longer a place of opportunity. The New York City of 1975 evoked images of muggers, not Macy’s, and of bankruptcy, not wealth. In one metropolitan area after another, the core was rotting; the centripetal way of life of 1945 had yielded to central-city debacle by 1975.

Race and Rebellion

   Underlying much of the alienation of the white middle class from the central city were persistent racial animosities. Throughout the nation, the central cities were becoming increasingly black and as such were deemed off-limits to many whites. Between 1960 and 1980, the African American share of New York City’s population rose from 14 to 25 percent, from 23 to 40 percent in Chicago, and from 29 to 46 percent in Saint Louis. By 1970 Washington, D.C.; Atlanta; Newark, New Jersey; and Gary, Indiana, were all majority African American. By 1980 Baltimore, Birmingham, Detroit, New Orleans, and Richmond, Virginia, had been added to the list of black-majority cities. In fact, by the latter date, Atlanta, Detroit, and Washington were over 60 percent black. Moreover, whites fully expected that it was only a matter of time before other hubs became predominantly African American. In 1966 U.S. News & World Report predicted that by the year 2000 Saint Louis, Philadelphia, New York City, and Chicago would be at least 50 percent black, and African Americans would constitute 66 percent of the population of Cleveland. The Bay cities of San Francisco and Oakland, Cincinnati, and Kansas City, Missouri, were expected to be at least 40 percent black. “If present trends continue,” the magazine reported ominously to its white, middle-class readers, “Negroes will outnumber whites in 8 out of the 10 biggest cities in the U.S.—or come close to it—by the year 2000, a date that is now just 34 years away.” Emphasizing the significance of these figures, U.S. News made clear that “this could mean Negro governments in New York, Chicago, Philadelphia, Detroit, Cleveland, St. Louis, Baltimore, and Newark.” The magazine concluded: “Unless something occurs to check the current trends, some of the most important cities of this country are going to wind up under Negro control.”49 Both the census figures and the mass media were presenting whites with much the same message. In the near future, the nation’s urban hubs were going to pass from their hands and become black metropolises. The central cities would no longer be central to white metropolitan life.

As the black presence in the central cities grew, it did not result in a marked increase in stable, racially integrated neighborhoods. Instead, as blacks moved into neighborhoods, whites moved out, so areas went from all white to all black in a relatively few years. Whites did not share residential space with African Americans and did not frequent areas deemed black. Thus there were carefully defined black districts that few whites penetrated. In Boston, the African American district was Roxbury; in Cleveland, the East Side was black, and the West Side was white; in Saint Louis, the North Side was black, and the South Side was white; in Los Angeles, the South Central area was the African American preserve.

Nowhere was the continuing black–white division more dramatically evident than in the massive public-housing projects. These central-city projects were becoming highly visible reservations for poor blacks; segregated zones of welfare, unwed mothers, and violent gang activity, they represented everything middle-class whites abhorred and feared. In the 1940s and 1950s, poor white families as well as blacks sought residency in public housing. But in the 1960s and 1970s, the projects became increasingly black and associated in white minds with a culture of poverty that was peculiarly African American. By the early 1970s, 70 percent of all public-housing households were nonwhite; public housing for the elderly still attracted a number of white tenants, but projects for families were overwhelmingly black. In Atlanta and Chicago, 95 percent of the non-elderly households were African American.50 In Saint Louis, the Clinton Peabody Terrace project went from 6.1 percent black in 1957 to 96.1 percent black in 1974; the Cochran Gardens Apartments went from 16 to 98.8 percent African American over the same period; the increase in the Joseph M. Darst Apartments was from 18.9 to 98.9 percent.51 Basically by 1974, family projects in Saint Louis were for African Americans.

Blacks as well as whites were very conscious that the African American population was being warehoused in government-sponsored reservations where they would be out of the way of more affluent Americans. In Chicago, blacks were isolated in rows of monolithic high-rises extending south from the Loop and separated from the rest of the city by the multi-lane Dan Ryan Expressway. In 1965 the Chicago Daily News ran a series of articles on the Robert R. Taylor Homes, described as a “$70 Million Ghetto” and “the world’s biggest and most jam-packed public housing development.” According to the News, the project was “virtually [an] all-Negro city within a city” that its own tenants labeled as “a ‘death trap,’ a concentration camp, and even, with sardonic self-derision, ‘the Congo Hilton.’ Here live 28,000 people, all of them poor, grappling with violence and vandalism, fear and suspicion, teen-age terror and adult chaos, rage, resentment, official regimenting.”52 The tenant-occupied towers of Robert Taylor Homes were the very opposite of the oft-proclaimed American dream: ownership of a single-family, detached house on a plot of grass with shade trees. Whites were realizing this dream, but poor blacks were relegated to its antithesis.

FIGURE 4.3 Public housing at Hunter’s Point, San Francisco. (San Francisco History Center, San Francisco Public Library)

On the West Coast, the most notable black public-housing reservation was Hunter’s Point in San Francisco (figure 4.3). On an isolated peninsula in San Francisco Bay, this project was, according to the local housing authority, 95 percent black and 5 percent “Caucasian” and “Samoan.”53 Except for some dealings with welfare and housing authority personnel, the black residents of Hunter’s Point had little contact with whites, and few white San Franciscans had ever been to the project. Even whites on the city police force generally avoided Hunter’s Point, abdicating responsibility for it to the racially integrated housing authority security force. Hunter’s Point was beyond the pale of white society; it was a community apart from the city and out of sight of San Francisco’s more affluent residents.

Some of the black population was seeping beyond the central-city boundaries and settling in “better” neighborhoods. By 1969, African Americans owned one-fifth of the houses in the Lomond neighborhood of the affluent Cleveland suburb of Shaker Heights.54 In 1970 almost 15 percent of Shaker Heights residents were African American; ten years later, the figure approached 25 percent. Meanwhile, nearby middle-class Cleveland Heights went from less than 1 percent black in 1960 to 25 percent black in 1980. In the Chicago area, the middle-class suburb of Oak Park had 132 African American residents in 1970; ten years later, 5,929 blacks lived in the community. In southern California, thousands of upwardly mobile blacks moved into the single-family tract houses along the tree-lined streets of suburban Compton, although in the 1970s they were joined by an increasing number of poorer refugees from the central city. The black migrants to suburbia, however, were the exception to the rule. Most African Americans remained confined in segregated neighborhoods of the central city, removed from a white population that was resigned to the abandonment of the urban core.

The well-publicized riots of the 1960s heightened white fears of central-city blacks and, if anything, widened the racial divide. Racial violence was nothing new to the nation’s urban areas, but the chain reaction of disorder during the 1960s shocked Americans. Extensive television coverage brought the reality of rebellion to the living rooms of Americans in even the most remote corners of the nation. Residents of lily-white suburbs could not avoid what was occurring in the central cities. Each evening, the television networks broadcast racial violence into their homes.

In the summer of 1964, New York City ushered in the age of disorder, suffering the first of the riots. As did so many of the 1960s disturbances, it began with a police incident. Thomas Gilligan, a white police officer, shot and killed a fifteen-year-old African American boy who had rushed at Gilligan with a knife. Incensed by what they perceived as yet another example of white police brutality, blacks rioted in both Harlem and the Bedford-Stuyvesant area of Brooklyn, smashing windows, setting fires, and looting stores. One rioter was killed, 118 persons were injured, and 465 arrested.55 During the remainder of the summer, lesser riots broke out in Rochester, New York, and Philadelphia.

Yet it was Los Angeles’s Watts riot in the summer of 1965 that brought the purported urban crisis to the forefront of the nation’s consciousness. As in the New York City disorder of the previous year, police action sparked the rebellion. On a sweltering evening in August, the California Highway Patrol pulled over twenty-one-year-old Marquette Frye for drunk driving. A crowd gathered around the stopped car, and Frye and his mother and brother began arguing with the police. An angry Mrs. Frye jumped on a police officer’s back. The white patrolmen dragged into their squad car a young black woman in a loose smock who appeared to be pregnant; the authorities claimed that she had spit on an officer. As the police pulled away with the Fryes in custody, the irate mob responded with a barrage of rocks and bottles.

For six days, rioting spread through Los Angeles’s Watts neighborhood and adjoining black districts. Middle-class black leaders sought to pacify the mob, but young African American rioters were not willing to follow the lead of their more affluent elders, who appeared to be allies of the hated whites. When a black state legislator who was attempting to halt the violence refused to join in the rock throwing, a young rioter responded: “Hell! You’re with the Man.”56 Thirty-four people died in the uprising; 1,032 required treatment for injuries; and 3,952 were arrested. Rioters burned, damaged, or looted almost one thousand buildings with a total estimated property loss of $40 million (figure 4.4). One unemployed man explained why so many joined in the looting: “They wanted everything the whites had, including color TV. They saw the stores were open. If you are hungry and don’t have no money, you want anything and everything.”57

FIGURE 4.4 Building on fire during the riot in the Watts neighborhood, Los Angeles, August 1965. (Herald Examiner Collection, Los Angeles Public Library)

Although nothing in the summer of 1966 matched the intensity or destruction of the Watts uprising, there was enough civil disorder in America’s cities to keep white fears of racial rebellion very much alive. In July, Chicago suffered an outbreak of rock throwing, arson, and looting that resulted in three deaths and 533 arrests. The same month, rioting broke out in the Hough area of Cleveland, leaving four blacks dead.58 In San Francisco, Hunter’s Point erupted after a white police officer shot and killed a fleeing sixteen-year-old black boy who had refused to halt for questioning. Again, there was some looting, rock throwing, and minor arson. In calmer years, the Hunter’s Point disturbance would not have been deemed a riot but simply an insignificant outburst of neighborhood unrest. By 1966, however, the news media was labeling every mob action by angry blacks as a riot, and to uneasy white San Franciscans it seemed that their city was on the verge of another Watts. As in other cities, prominent blacks attempted to calm the youthful rioters. But when a black member of San Francisco’s governing Board of Supervisors spoke to the angry Hunter’s Point demonstrators, he was greeted with rocks and jeers. “That cocksucker forget he’s black,” remarked one Hunter’s Point resident, “but when we put them fuckers on the run, they sure let him know at City Hall right away.”59 Altogether there were forty-three reported race-related civil disorders and riots in 1966, but 1966 was relatively quiet compared to the violence of the following year.60

During the first nine months of 1967, there were a reported 164 disorders. In mid-June, Tampa and Cincinnati erupted; in late June, Buffalo was the scene of rioting. The following month, the race rebellion spread to Newark. Black rioters stole or damaged over $8 million of merchandise in the New Jersey city, and before order was finally restored, twenty-three people had been killed.61

In late July, Detroit suffered the most destructive and lethal riot of the 1960s. Again, a police incident ignited the black population. On a steamy summer night, Detroit police raided an illegal drinking establishment in a black neighborhood and arrested the eighty-two people patronizing the bar. A crowd gathered to witness the arrests and became angry over the police officers’ supposedly brutal treatment of those arrested. According to one police officer, a black youth incited the crowd, shouting, “Black Power, don’t let them take our people away; look what they are doing to our people…. Let’s kill them whitey motherfuckers … let’s get the bricks and bottles going.”62 The bricks and bottles did get going, as did the now all-too-familiar looting and arson.

In Detroit, as elsewhere, African American leaders attempted to quiet the angry mob. When black congressman John Conyers tried to disperse the crowd, one bitter rioter shouted: “Why are you defending the cops and the establishment? You’re just as bad as they are!” Later a discouraged Conyers complained: “You try to talk to those people and they’ll knock you into the middle of next year.”63 For five days, the wave of violence continued. In Detroit, however, some whites joined with the blacks in the looting. Twelve percent of the adults arrested were white, as were two of the seventeen looters who were killed.64 Seemingly, rebellion in Detroit was an integrated affair.

The extent of the violence shocked the nation. Forty-three people were killed in the Detroit riot, and 7,200 were arrested; there was an estimated $45 million in property damage. The fire department reported 682 fires resulting from the riot; the blazes demolished 412 buildings (figure 4.5).65 In the black neighborhoods of Detroit, ruins of destroyed businesses testified to the implosion of the central city.

FIGURE 4.5 Aerial view of Detroit on fire during the riot of July 1967. (Walter P. Reuther Library, Wayne State University)

In the wake of the 1967 riots, President Lyndon Johnson appointed the National Advisory Commission on Civil Disorders to investigate the nature and causes of the urban uprising. Released in early 1968, the commission’s report concluded: “Our nation is moving toward two societies, one black, one white—separate and unequal.” It further argued that “within two decades, this division could be so deep that it would be almost impossible to unite: a white society principally located in suburbs, in smaller central cities, and in the peripheral parts of large central cities; and a Negro society largely concentrated within large central cities.”66 By bluntly exposing the nation’s developing racial division, the commission sought to shock the white public into embracing a program of healing action. But the commission’s conclusions were hardly surprising or shocking. Anyone with even a superficial knowledge of the black–white city would have realized that the United States was not moving toward two societies, separate and unequal; two societies, separate and unequal, had existed for centuries. Rather than describing a fearsome future, the commission was restating an American tradition. Moreover, there was already “a white society principally located in suburbs” and “a Negro society largely concentrated within large central cities.” And the message conveyed to many white Americans by the riots was that this was how it should remain. White Americans should confine themselves to the safe suburbs and avoid the increasingly black central cities. Despite the commission’s well-meaning call for change, many white Americans felt the answer to the “urban crisis” was to live, work, and play in the suburbs and abandon the central city to troublesome blacks.

In its emphasis on the black–white division and its narrow focus on race, however, the commission report downplayed some salient features of the urban unrest. The disturbances of the 1960s were not traditional race riots like the Detroit melee of 1943. In 1943 and in earlier riots, black and white mobs clashed, attacking anyone of a different skin color. In the 1960s riots, blacks attacked white police and white-owned businesses in African American neighborhoods, but they did not invade white neighborhoods or attack whites outside the black ghetto. Few of the casualties of the 1960s riots were white civilians; most of the whites injured were police officers or firefighters. Moreover, white mobs did not take to the streets and attack African Americans. The 1960s riots were rebellions against authority: against “the man,” those with power who, like the white police, were pushing blacks around or, like white-owned neighborhood businesses, were unfairly exploiting them. And to some degree, the riots were a rebellion against even black authority figures. Repeatedly, rioters jeered white-collar black peacemakers; they were no better than “the man.” A survey of rioters in Newark found that 50.5 percent believed that “Negroes who make a lot of money are just as bad as whites.”67 The riots in Los Angeles, Newark, and Detroit were not simply black–white clashes; they were attacks by powerless blacks on neighborhood businesses that exploited their patronage and on police who treated them like dogs. Many rioters deemed successful blacks who cooperated with the hated white authorities as little better than the whites.

No group had more reason to rebel than the young black males who constituted a disproportionate share of the rioters. Their youth, race, and gender made them the focus of police scrutiny; if there was trouble, they were assumed to be the troublemakers. Moreover, they suffered the highest rates of unemployment. White businesses were moving to suburbia to hire young white, middle-class women. They were not anchored in the central cities by a desire to employ African American males. For whites in business and law enforcement, the young black male did not conjure up an image of honesty, reliability, or obedience. The white power structure deemed young black males the enemy, and in the riots of the 1960s, young African American men rebelled against that structure.

It made little difference to middle-class whites, however, whether the riots of the 1960s were simply antiwhite or anti–middle class. Either way, they were the target. What was evident was that much of the central city was out of bounds to them. Newark and Detroit were no longer their cities. They belonged to rebellious blacks and thus were not desirable places to invest in, live in, or even visit.

Reinforcing the perception of rebellion were government-sponsored community action programs. In 1964 President Johnson launched his War on Poverty, a federal initiative to level the social and economic playing field in America. One vital element of the federal scheme was the community action councils, which were to guide the assault on poverty in poor neighborhoods throughout the nation. There was to be “maximum feasible participation” by the poor on these neighborhood councils. In the minds of many Americans, the program was intended to empower the poor, specifically poor blacks, and enable them to seize control of their destinies from the prevailing white power structure. According to one contemporary observer, the federal bureaucrats in charge of the War on Poverty “operated on the assumption that the involvement of the poor in policy-making was necessary in order to redistribute power in the cities; without power redistribution, they believed, there would be no great improvement in the lot of the Negro poor.”68

The notion of a federally funded revolution understandably troubled many white central-city officials. In San Francisco, for example, Mayor John Shelley fought an unsuccessful battle with local black activists over control of the city’s community action program. At the meeting of the United States Conference of Mayors in 1965, an embittered Shelley joined with Mayor Samuel Yorty of Los Angeles in introducing a resolution that accused the federal antipoverty agency of “fostering class struggle.” Two years later, Shelley concluded: “Maximum feasible participation … are words that expanded the social revolution in San Francisco into a chain reaction of unrest and distrust that has left its mark on every major civic improvement project attempted here in recent years.”69 The white mayor of Newark agreed, charging the local community action agency with stirring dissent in the days before that city’s riot. Community action employees had participated in the antipolice rally that preceded the riot, and community action equipment was used to produce leaflets calling for the protest demonstration.70 At a time when white mayors were under siege and attempting to administer collapsing cities, they certainly did not welcome such federally sponsored assaults on their authority or the stability of their communities.

The community action program did empower a new corps of black leaders, most of them middle class, and provide jobs for favored followers of those leaders. But it created more conflict and controversy than economic uplift. Commenting on the Hunter’s Point neighborhood of San Francisco, one student of the program concluded: “Maximum feasible participation of the poor meant that upwards of a hundred persons were able to secure some full- or part-time staff positions and several hundred more were involved in block meetings concerned with improving some of the worst abuses in public housing. Hopes were raised, but the extent of changes brought about in this ghetto were negligible.” The editor of the local antipoverty newspaper summed up the situation: “The most outstanding thing is that everything is the same.”71 For whites, however, all the talk of black empowerment and social revolution simply reinforced the notion that the central city was off-limits. In their minds, black power meant the exclusion of whites.

A rising number of African American victories at the polls proved even more significant in the struggle for black empowerment and the advance of white alienation. As the black population increased in central cities, African Americans won more political offices, and in 1967 Cleveland and Gary became the first major American cities to elect black mayors. Three years later, Newark voters chose an African American as their city’s chief executive, and in 1973 blacks won the mayor’s office in Detroit, Atlanta, and Los Angeles. These victories did not signal a new era of racial tolerance in America’s cities. Instead, urban voters split largely along racial lines, with blacks winning because they constituted a majority or near majority of the population. Race dominated the Cleveland campaign of 1967; the white opponent of victorious Carl Stokes complained: “Over in the Negro part of the city the ministers and newspaper editors and everybody else were saying, ‘Vote color,’ and over in the white community Stokes was saying, ‘No, don’t vote color—consider a person on his qualifications, not on account of his color.’”72 Some whites were also recommending that the electorate vote color. “Vote Right—Vote White,” read an anonymous leaflet distributed in some white neighborhoods before the election.73 Gary’s black candidate, Richard Hatcher, declared independence from the white-dominated Democratic organization, proclaiming, “Plantation politics is dead.” Such rhetoric worried some of Gary’s whites. One white resident said of his neighborhood before the election: “Every racist in the area … were out in the open, up and down the street.”74

The election returns demonstrated the racial split. Hatcher won more than 96 percent of the black ballots but only 14 percent of the white vote; Stokes captured an estimated 95 percent of the African American vote yet only 19 percent of the white.75 In Newark, the white police director said the mayoral battle of 1970 was a “black versus white situation,” “a battle for survival.” Raising visions of apocalyptic doom, he told a white audience: “Whether we survive or cease to exist depends on what you do on [election day].”76 This racial divide was evident in the election returns: Newark’s black candidate won nearly unanimous backing from African Americans and no more than 20 percent of the white ballots.77 In Detroit’s 1973 contest, the African American Coleman Young secured 92 percent of black votes; his white opponent was the choice of 91 percent of white voters.78 Meanwhile, in Atlanta the black victor received 95 percent support from African Americans and an estimated 17.5 percent of white votes.79 Only in Los Angeles was a black candidate able to draw strong support from both black and white voters. Elsewhere, race determined the people’s choice.

Once they took office, black mayors were able to translate political power into new opportunities for their African American supporters. Between 1973 and 1978, the share of managerial positions in Detroit’s city government held by blacks soared from 12 to 32 percent, and the black share of professional positions in Atlanta’s municipal bureaucracy rose from 19 to 42 percent. Victory at the polls meant more African American department heads, municipal engineers, and city attorneys. Similarly, in Detroit and Atlanta, the share of city purchases and contracts going to minority-owned firms rose from only 2 or 3 percent in 1973 to 33 percent in 1978.80 Although the beleaguered older central cities were not the rich prizes they had once been, political power produced important gains for blacks long closed out of the inner circle at city hall.

Yet as blacks took power, whites felt increasingly insecure about their position in the central cities. Rhetoric about “a battle for survival” struck a chord with whites facing the possibility of displacement. Exemplifying the racial tensions arising from African American empowerment was the highly publicized clash over school governance in New York City. During the late 1960s, New York’s African Americans grew increasingly frustrated with the city’s highly centralized education bureaucracy. Black children were not receiving an adequate education, and responding to a growing chorus of complaints, the city embarked on an experiment in neighborhood control. In 1967 three experimental districts were created where the schools would be governed by neighborhood boards. One of the districts comprised the predominantly black Ocean Hill–Brownsville area of Brooklyn.

The white-dominated teachers union, however, soon clashed with the black-dominated neighborhood board and the black administrator of the Ocean Hill–Brownsville district, resulting in a bitter barrage of attacks and counterattacks. White teachers received “hate literature” demanding that only black or Puerto Rican teachers be employed in Ocean Hill–Brownsville. Moreover, one manifesto proclaimed: “All ‘whitey’ textbooks must be burnt and replaced by decent educational material. ‘Whitey’ art and John Birch–type social studies must be replaced by African arts and crafts and African history.” “The Black Community Must Unite Itself Around the Need to Run Our Own Schools and to Control Our Own Neighborhoods Without Whitey Being Anywhere on the Scene,” announced one leaflet placed in teachers’ mailboxes. “We Want to Make It Crystal Clear to You Outsiders and You Missionaries, the Natives Are on the Move!!! Look Out!!! Watch Out!!!! That Backfire You Hear Might Be Your Number Has Come Up!!!!” The many Jewish teachers felt especially threatened when the literature warned: “Get Off Our Backs, or Your Relatives in the Middle East Will Find Themselves Giving Benefits to Raise Money to Help You Get Out from Under the Terrible Weight of an Enraged Black Community.”81

The neighborhood board denied any responsibility for the hate literature and denounced anti-Semitism. But the well-publicized conflict in the nation’s largest city was added proof of the racial chasm dividing the nation. A committee appointed by the mayor to investigate the Ocean Hill–Brownsville conflict found that “an appalling amount of racial prejudice—black and white—in New York City surfaced in and about the school controversy. Over and over again we found evidence of vicious anti-white attitudes on the part of some black people, and vicious anti-black attitudes on the part of some white people.”82 Given such racial vitriol, many middle-class whites had to conclude that the suburbs were the best place for them.

Clinching the case against the central city was the battle over busing. In 1954 the United States Supreme Court, in Brown v. Board of Education, held racial segregation in the public schools unconstitutional, and by the 1970s the courts were interpreting this to mean that school districts had to ensure that the student body of each school had a racial mix proportionate to the racial mix in the district as a whole. In one city after another, federal judges ordered busing of children to schools outside their neighborhoods in order to achieve racial balance. This might require the transportation of students to schools miles from their all-white or all-black neighborhoods.

Throughout the nation, school districts engaged in lengthy court battles to avoid busing. And polls showed that white parents were overwhelmingly opposed to the integration policy. Some whites transferred their children to private schools; others opted to move out of central-city school districts and thereby avoid busing orders. Parents complained bitterly about the possibility of their children being sent miles from home to distant neighborhoods. Proponents of school integration, however, contended that foes of busing were simply racists. “It’s Not the Distance, ‘It’s the Niggers,’” the NAACP Legal Defense Fund bluntly observed.83 Whatever the motives for opposition, busing was a key factor in determining where whites chose to live. If they could avoid the transportation of their children to distant institutions, they would do so. The result was further abandonment of the central city. In the minds of most whites, busing only added to the penalty for living in the urban core.

The greatest battle over busing was fought in Boston. In 1965 the Massachusetts legislature adopted the Racial Imbalance Act, which defined any school that had more than 50 percent nonwhite students as racially imbalanced and ordered local districts to eliminate such concentrations of minority pupils. For the following nine years, Boston’s school board refused to allow anything more than token action to correct imbalances. Its Irish Catholic members, led by Louise Day Hicks, stirred not only racial animosities but also class conflict. They emphasized that upper-middle-class lawmakers from the suburbs were imposing this scheme of social engineering on working-class Bostonians. Attacking the coauthor of the imbalance law, a resident of the upper-middle-class suburb of Brookline, Hicks argued: “The racial imbalance law does not affect Brookline, so he smugly tells the elected officials of Boston what they should do. I, for one, am tired of nonresidents telling the people of Boston what they should do.” Shifting the blame for social ills to the more affluent whites outside the city, Hicks challenged suburban residents to “help the poor city correct the situation. Take the Negro families into your suburbs and build housing for them and let them go to school with your children.”84 Another school board member attacked an approach to government that mandated “that suburban patricians rule urban plebeians from 9 A.M. to 5 P.M. It seems to be an elitist concept which would rule the destinies of the great ‘unwashed’ (us) through inquisitions, innuendo and high-powered Madison Avenue scare techniques.”85

In 1974 the clash entered a new phase when federal district court judge W. Arthur Garrity, a resident of the affluent suburb of Wellesley, found the Boston school authorities guilty of maintaining segregated schools and ordered busing to achieve racial balance. Opposition to Garrity’s busing order was especially vehement in the working-class Irish neighborhood of South Boston. At the opening of the school year in September 1974, 90 percent of South Boston’s white students participated in a school boycott, staying away from their classes. According to Time magazine, “a jeering, mostly teen-aged crowd of whites threw stones and bottles at two yellow buses that carried the 56 black students who showed up for opening day” at South Boston High School. After school that afternoon, “whites brandished lengths of rubber hose and clubs and again threw bottles at the buses.” The next day, black students were confronted by “several dozen white mothers, who chanted, ‘Southie won’t go!’ and by some 200 stone-throwing white youths.”86

South Boston whites not only opposed the importation of African American students but were, if anything, more opposed to the transfer of their children to schools in black Roxbury, a district perceived as dangerous and crime ridden. One white parent of a boy slated for busing explained: “I worked nine years in Roxbury as a street cleaner, and I’ll never let him go there.”87 Raising the specter of rapacious black sexuality, another white father observed: “The question is: Am I going to send my young daughter, who is budding into the flower of womanhood, into Roxbury on a bus?”88

In the working-class Charlestown district of Boston, the opposition was virtually as intense. Charlestown whites had earlier fought the physical engineering schemes of urban renewal authorities that threatened to disrupt their enclave, and now they mobilized against the social engineers of busing. In fact, whites throughout the city opposed busing, and many of those who could afford to move to suburbia did so. Meanwhile, poor whites and poor blacks remained in the central city, embroiled in racial conflict.

In the early 1970s, however, some federal judges attempted to foil fleeing whites by mandating interdistrict busing between cities and suburbs. Most notably, in 1972 federal district court judge Stephen Roth ordered busing between the city of Detroit and fifty-three independent suburban school districts. Roth found Detroit school authorities guilty of actions that promoted racial segregation in the city’s public schools, but he held that busing solely within the increasingly black city was not an adequate remedy. Shifting students between the black areas of Detroit and the remaining white city neighborhoods would still result in schools that were largely black. To achieve truly integrated schools, there would have to be busing between the predominantly black city and the predominantly white suburbs. Under Roth’s scheme, a total of 310,000 black and white children across a three-county area would be transported from their school districts to others in order to achieve a racial composition in each classroom proportionate to the racial composition of the student population of the metropolitan area as a whole. The outcry against the decision was deafening (figure 4.6). In effect, Roth was using the equal protection clause of the United States Constitution to eradicate the boundaries between the central city and the suburbs and discard not only the concept of the neighborhood school but also the principle of local self-rule.


FIGURE 4.6 Antibusing demonstrators in Michigan, early 1970s. (Walter P. Reuther Library, Wayne State University)

The suburban school districts appealed to the United States Supreme Court, where attorneys for the NAACP defended Roth’s ruling. Speaking of the central city and suburbs, the NAACP general counsel argued: “They are bound together by economic interests, recreation interests, social concerns and interests, governmental interests of various sorts, and a transportation network.”89 In other words, socially, economically, and culturally the metropolitan area was a single entity. Consequently, busing advocates believed that all its subdivisions should share in the legal remedy proposed to achieve equal educational opportunity.

In 1974, in Milliken v. Bradley, the Supreme Court disagreed, overturning Roth’s draconian scheme by a 5 to 4 vote. Speaking for the five-justice majority, Chief Justice Warren Burger contended: “No single tradition in public education is more deeply rooted than local control over the operation of schools.” Deferring to this tradition, the Court was not willing to uphold Roth’s interdistrict scheme for racial mixing. Burger held that “without an interdistrict violation …, there is no constitutional wrong calling for an interdistrict remedy.”90 According to the majority, there was no evidence that the suburban districts had taken any unconstitutional actions to promote racial segregation in the Detroit schools, so Roth could not require them to be part of the remedy for correcting such segregation within the city of Detroit. The lower courts could order busing between black and white neighborhoods within the city of Detroit, but the suburbs could not be forced to take part. Detroit’s segregation problem was legally none of the suburbs’ business. They were separate entities; they were not part of the central city’s racial problems and thus need not be part of the mandated solution. Burger in effect excused white suburbanites from the hated remedy of busing. Central cities across the nation remained subject to busing orders, but for the most part busing between cities and suburbs would not be required.

In this decision, the Supreme Court added the Constitution’s imprimatur to the destruction of the single-focus, interconnected metropolis. Contrary to the argument of the NAACP counsel, the Detroit metropolitan area was not a single entity bound together socially, its peripheral parts legally responsible for its core. Instead, it was a disparate mass of population in which the outer, predominantly white parts had no responsibility for the predominantly black core. The tradition of local control had trumped the notion of metropolitan interdependence. The periphery could abandon the center and pursue its independent course. In the 1978 Times poll, New York suburbanites had not felt part of a single metropolitan area, nor had they believed that what went on in the central city affected them. Milliken v. Bradley was the constitutional equivalent of these findings. What happened in bankrupt New York City or poor, black Detroit need not concern people living, working, and shopping in separate communities twenty-five or thirty miles from decaying central-city downtowns.

The Supreme Court confirmed the suspicions of working-class whites stranded in the central city of Boston. The burden of righting the wrongs of the past fell on them, the poorest of the white population; working-class whites in Charlestown were saddled with the white man’s busing burden, just as they had been targeted as the victims of the urban renewal crusade. They were the ones chosen to face the bulldozers as well as the buses. Judge Garrity and his neighbors in Wellesley need not be part of the wrenching remedies for America’s urban ills. During the 1970s, the central city seemed not only a welfare dumping ground but also a convenient dumping ground for America’s guilt. Those with money, mainly whites but also some middle-class blacks, could escape. But the less affluent residents of South Boston and Roxbury would have to pay the price for the nation’s social ills.

Creating a New Alternative

   Confronted by the debacle in the central cities and the monotony of malls and tract houses in the suburbs, some Americans of the 1960s posited the need for a third alternative. They found inspiration for this new option in Europe. Since World War II, the governments of Great Britain and Scandinavia had invested in the creation of “new towns,” carefully designed communities in outlying areas to house the growing number of metropolitan residents. Admired by American visitors, these new towns seemed an answer to America’s metropolitan dilemma. Through the creation of entire new communities, Americans could supposedly correct the urban flaws resulting from the unplanned development of the past. Rather than segregating rich and poor, white and black, the new cities could embrace social and racial heterogeneity and provide housing and jobs for a diverse and hopefully harmonious population. Moreover, through enlightened planning, the new communities could halt mindless urban sprawl and offer something better than the prevailing pattern of unrelated housing subdivisions and garish commercial strips along traffic-jammed highways. They could also provide adequate open space and recreational facilities and preserve rather than bulldoze the natural beauty of the countryside. And they could foster imaginative modern architecture rather than create additional rows of the banal boxes with neo-colonial trim so typical of suburbia. The dream was, then, to apply the ingenuity of planners and architects to the urban dilemma and produce something better than what existed. Urban America had seemingly failed; now was the time to start from scratch and build new cities.

One person who believed that new towns were a viable response to metropolitan ills was Robert E. Simon. In 1961 Simon purchased eleven square miles of undeveloped land in northern Virginia eighteen miles west of Washington, D.C., and began planning a community known as Reston. Its master plan, completed the following year, envisioned a city of 75,000 residents that would include detached single-family homes, townhouses, and apartment buildings as well as industry and retailing. Residents would enjoy a full range of cultural and recreational facilities. In other words, it was not a subdivision; it was intended to be an entire city.

Yet it was to be a carefully planned city quite unlike the aging, troubled metropolises of the eastern United States. Reston was to consist of seven villages, each with a village center featuring shops and recreational facilities. Walking paths were to link the villages, offering an alternative to excessive dependence on the automobile. Aesthetics were important to Simon. He sought to preserve the site’s natural topography and woodlands and favored innovative, architect-designed housing. The first village constructed, Lake Anne, was a modernist version of the Italian fishing village of Portofino, complete with apartments and shops enclosing a waterfront piazza. Sailing, golf, horseback riding, swimming, and other forms of recreation were to play a significant role in the lifestyle of the new town. “The main idea,” Simon explained, “is that in this age of leisure people should have a wide choice of things to do which are stimulating, pleasurable, exciting, fun.”91

Reston, however, was not planned to be simply a leisure-age retreat for white, upper-middle-class Washingtonians. It was intended to offer housing for low-income families as well as the more affluent. A 1968 editorial in the Reston newspaper expressed the multiclass, multiracial vision of the community’s pioneers: “Many of us came to Reston believing, at least hoping, that the town was the answer to the urban dilemma our nation faces—that the vast economic gap between the haves and have-nots could somehow be bridged.”92 An early African American resident of Reston summed up the sentiment succinctly: “New towns are the black man’s hope.”93

A similar philosophy underlay the development of Columbia, Maryland, a new town between Baltimore and Washington with a target population of about 100,000. By 1963, shopping-mall entrepreneur James Rouse had assembled fourteen thousand acres of land for this community, which was intended to serve as a model for private developments throughout the nation. “The surest way to make the American city what it ought to be,” Rouse observed, “is to demonstrate that it is enormously profitable to do it a better way.” Disgusted by the unplanned suburban sprawl that was laying waste to the American landscape, Rouse believed that heretofore the wrong way had prevailed. “Sprawl is ugly, oppressive, massively dull,” he complained. “It squanders the resources of nature—forests, streams, hillsides—and produces vast, monotonous armies of housing and graceless, tasteless clutter.” He believed that the sprawl resulting from traditional subdivision development was “inhuman. The vast, formless spread of housing pierced by the unrelated spotting of schools, churches, stores, creates areas so huge and irrational that they are out of scale with people.”94

Seeking to re-create the human scale that he knew as a boy growing up in a small town on the Eastern Shore of Maryland, Rouse conceived of his new city as a group of distinct villages, each with about twelve thousand residents and a village center consisting of stores, schools, and community facilities. The villages would be organized around an urban downtown, the common focus of this new city. Village life would supposedly nurture an intimacy and fellowship lacking in the sprawling, unplanned metropolis. “A broader range of friendships and relationships occurs in a village or small town than in a city,” Rouse believed; “there is a greater sense of responsibility for one’s neighbor and a greater sense of support of one’s fellow-man.” But Columbia was to have the advantages of a city as well as those of a small town. Rouse hoped to attract thirty thousand jobs to the community, thus offering residents a full range of employment opportunities. Moreover, one-fifth of Columbia’s area would be devoted to open space, with woodland trails, golf courses, riding stables, lakes, tennis courts, and swimming pools providing unequaled recreational options.95 Columbia was not to be a residential suburb dependent on a larger city for employment and recreation but a self-sufficient community where one could live, work, and play.

Like Reston, it was also intended to provide a home for all races and social classes. Rouse boasted that the new community would have housing for everybody from the company president to the janitor. Each village was intended to include upper-middle-class, single-family homes as well as subsidized rental apartments for those who could not afford market rates. “Like any real city of 100,000 Columbia will be economically diverse, polycultural, multi-faith and interracial,” Rouse declared in 1967.96 At a time when class and racial differences were ripping apart existing cities, both Columbia and Reston would provide an alternative; they would supposedly breach the social and cultural chasms in the nation and provide models for tolerant, harmonious living.

On the West Coast, the new town of Irvine, California, was also capturing the imagination of Americans who hoped to create a better way of metropolitan life. By the late 1950s, southern California’s metropolitan frontier was fast approaching the 120,000-acre Irvine Ranch in Orange County. Rather than opt for piecemeal development of its vast property, the Irvine Company decided to create a planned new community of 100,000 residents on the ranch site. Approved by Orange County authorities in 1964, the master plan for the new town, like those of Reston and Columbia, provided for a city composed of a series of distinct villages. A particular style of architecture and landscaping would distinguish each village. One would be California ranch style, another Mission style, and still another would have a Cape Cod motif. And each would have a village shopping center. As in the eastern new towns, there would be ample recreational facilities for leisure-conscious residents. Planned to be a self-sufficient community, Irvine was to be a hub of manufacturing, retailing, and office employment as well as the site of a new branch of the University of California. Less idealistic than Rouse and Simon, the Irvine Company originally was not committed to accommodating lower-income residents in subsidized housing.97 Yet like its eastern counterparts, the new town of Irvine was a clear departure from the unplanned sprawl that had traditionally characterized American metropolitan development. It was a carefully designed city, not just one more housing subdivision.

As the first structures arose in the new towns during the mid-1960s, it was clear that the dreams of their developers were not identical. Simon was more interested in innovative modernist architecture than was Rouse or the Irvine Company; a vision of small-town fellowship particularly motivated Rouse; and the Irvine Company was more concerned with the ideal, and ultimately lucrative, development of its property than with bridging the nation’s racial and class divisions. Yet together, the new towns seemed to promise a better metropolitan future. Through enlightened planning, Americans could perhaps create new cities that righted the wrongs so obvious in Los Angeles, Chicago, and New York.

Three towns alone, however, could not change America. To spread the gospel of Simon and Rouse throughout the nation, new-town advocates turned to the federal government, lobbying Congress for a program that would foster additional communities. The result was a series of measures, the most notable being Title VII of the Housing and Urban Development Act of 1970. This legislation authorized the federal government to guarantee bonds issued by developers to finance new towns that, like Reston and Columbia, were to offer housing for blacks as well as whites and lower-income Americans as well as the middle class. In other words, if the developers defaulted on their bonds, the federal government was obligated to assume the debt payments. The law further permitted the federal government to make direct loans and grants for planning and public facilities to new-town developers. Basically, Title VII was intended to aid private developers willing to take the risk of investing in large-scale, innovative schemes. The bond guarantee would encourage lenders to advance the money necessary for the construction of entire new cities, and the government loans and grants were similarly intended to ease the burden of initiating the ambitious projects.

The legislation stirred considerable interest among private developers, with the Department of Housing and Urban Development receiving almost one hundred applications and preapplications for Title VII assistance from 1970 through 1974. Thirteen projects actually received bond guarantees and embarked on the federally sponsored pursuit of new and better communities.98 One of the projects, for example, was Jonathan, twenty-five miles southwest of Minneapolis. The dream of a wealthy Minnesota businessman and environmentalist, Jonathan was planned to be a city of fifty thousand residents. Its developers sought to preserve the site’s meadows, wooded ravines, and wetlands while also promising “exciting innovations in housing designs and systems.” The community experimented with modular “stack units,” and one newspaper account described Jonathan as “dotted with striking, unusual houses” (figure 4.7).99 With both subsidized and market-rate housing, the Minnesota new town attracted what a local pastor called “an amalgamation of a great assortment of people—conservatives, militants, escape artists, chronic transients. It’s the closest thing to a cross-section, representative community that I’ve found, a microcosm of a major city.” He claimed, “We range from Ph.D.’s to ex-cons.”100

Other federally aided new towns also seemed to offer exciting alternatives to the status quo. Shenandoah, thirty-five miles southwest of Atlanta, was designed to be a solar-powered community housing seventy thousand residents. Among its first structures were a community center heated and cooled by solar power and a solar-energy model home.101 Thirty miles north of Houston, Woodlands was a Title VII new town with a projected population of 150,000 and a special sensitivity to the natural environment. Its development plan promised that “procedures and techniques for land planning, land development, and construction will be established to preserve and enhance the natural surroundings and create a healthful, ecologically sound, aesthetically pleasing community.”102 The Texas new town developed an innovative natural drainage system, winning plaudits from environmentalists. One leading ecologist said of Woodlands: “Builders found that they could love money and trees at the same time.”103


FIGURE 4.7 Modernistic housing design in Jonathan, Minnesota, 1975. (Steve Plattner, Minnesota Historical Society)

Initial enthusiasm for the federal new-towns program, however, quickly changed to despair. By the close of 1974, the federal new towns were in serious financial trouble, and in January 1975 the Department of Housing and Urban Development announced that it would not process any further applications for new-town development. Instead, it would devote its resources to salvaging the existing projects. In 1976 the federal government decided to purchase seven of the thirteen faltering new towns and terminated two of the seven as no longer feasible.104 One of those terminated was Newfields, outside Dayton. Developers had projected that by 1975 there would be 919 housing units in the Ohio new town; in November of that year, only 65 units were actually standing and a mere 25 occupied.105 A 1976 government report on the new communities said of Newfields that “a market for a new town does not exist” in the Dayton area and noted that the federal government planned to turn the failed town into a residential development of no more than five hundred acres. Similarly, the core area of the defunct new town of Gananda, east of Rochester, New York, was to be developed as “a conventional subdivision.”106

The data from one community after another was dismal. By late 1976, two and a half years after development began, Shenandoah was far from its goal of 70,000 residents, having a total population of only 7. At the close of 1982, it had risen to 750.107 Four years after groundbreaking, Maumelle, outside Little Rock, Arkansas, consisted of sixty-five dwelling units, forty-two of which were occupied.108 In late 1974, Jonathan defaulted on a $468,000 interest payment on its bonds, forcing the federal government to intervene and come up with the sum. At the beginning of 1975, the Minnesota new town was up for sale.109 No longer were there optimistic projections of a city of 50,000. In late 1976, the Department of Housing and Urban Development was predicting that Jonathan’s population “could increase to 5,000 in 5 years and to 15,000 in 20 years.” It failed to reach these figures; six years later, at the end of 1982, it had an estimated 3,125 residents.110

Even the highly vaunted new towns of Reston and Columbia were not proving to be ideal investments. As early as 1967, Gulf Oil Company, Reston’s chief financial backer, took control of the community away from Robert Simon because of fears that the new-town project was approaching bankruptcy. Although the Virginia community’s financial prospects improved during the late 1960s and early 1970s, in 1975 the Washington Post reported that Reston was “still losing money daily.”111 In January 1975, James Rouse was forced to negotiate emergency refinancing of Columbia owing to what he described as “the precipitous collapse of the real estate industry.” Columbia land sales plummeted from an average of $24 million a year in 1971 through 1973 to only $6.5 million in 1974.112 Compelled to cut expenses, Rouse also laid off approximately half his employees. Through the 1970s, Reston and Columbia continued to grow in population, but not at the rate originally projected.

Unfavorable economic conditions were partially responsible for the crisis in new-town development. During the years 1973 to 1975, a combination of economic recession, record-high interest rates, and double-digit inflation meant that few Americans were in the market for houses and that the price of borrowing and building was soaring. In other words, developers’ expenses were rising at the same time their revenues were diminishing. Yet in its analysis of the troubled Title VII program, the Department of Housing and Urban Development contended that the economy was not wholly to blame for the new-town fiasco. “While adverse economic conditions exacerbated the program’s difficulties,” the department noted, “these difficulties would have occurred in any event” owing to defects in Title VII itself and to the administration of the new community initiative. Because of the federal bond guarantee mechanism, developers incurred excessive initial debt; moreover, they “did not receive Federal grants of the types and amounts necessary to offset” this burden.113 In addition, the Department of Housing and Urban Development did not have adequate staffing or the necessary expertise to implement a successful program. Repeated complaints by developers about administrative delays and red tape seemed to support the latter conclusion. Overall, the federal government, as well as state and local governments, needed to commit more fully to promoting the innovative developments if Americans were actually to enjoy the benefits of new-town living. America could not create model cities on the cheap.

Yet there was another problem that new-town advocates were reluctant to confront. Perhaps Americans were simply not eager to embrace life in a new town. Slow sales not only reflected the shortcomings of the federal government and the developers but also testified to a lack of interest among home buyers. Commenting on Jonathan’s unimpressive growth, a former marketing director noted that architectural innovation had little sales appeal in a metropolitan area where “95 per cent of the people want traditional housing.”114 A University of North Carolina study of resident attitudes in new towns concluded that “people may have moved to Jonathan in spite of its housing, since it was not viewed by many residents as a key aspect of the community’s appeal” and postulated that “the character of housing” in the new town might have accounted “for the very slow sales pace.”115 Similarly, the idea of economic and racial diversity did not appeal to many prospective residents. Concluding that Shenandoah “suffered from a marketing problem,” a study found that “county officials and residents … perceive[d] it to be a ‘low-income project,’ since almost one-half of its units constitute some form of assisted housing.” In 1979 a marketing report on Park Forest South, a federal new town outside Chicago, found that the community had a poor reputation among white, middle-class home seekers in part because of “the involvement of the federal government” and “a growing minority presence among the project’s residents.”116 The Irvine Company had originally not included low-income housing in its plans, and later attempts to accommodate the less affluent were not appreciated by many of the new town’s residents. In 1974 the mayor of Irvine explained her constituents’ aversion to low- and moderate-income housing: “The quality of life in our community began as a concern for the environment, for saving the eucalyptus trees. Now it means exclusion of blacks and chicanos.”117

In other words, the dreams of Robert Simon and James Rouse did not necessarily coincide with those of the average home buyer. Planners and professors might inveigh against sprawl, strip malls, and the ubiquitous golden arches of McDonald’s, but sales figures demonstrated that a big yard, a big parking lot, and a Big Mac appealed to millions of Americans. Conventional subdivisions proved quite satisfactory to most Americans; consequently, there was no particular reason for them to flock to new towns. Moreover, many middle-class whites were averse to risking their investment in a new home by settling in a community that included low-income blacks and public housing. Economic and racial integration might be a socially desirable goal, but when a family was making the largest investment of its life, safety seemed more important than social reform. Similarly, an investment in a solid neocolonial manse with conventional heating and cooling seemed much wiser than sinking one’s money into stack modules or solar-energy experiments. Idealistic multimillionaire developers and the federal government might be able to afford philanthropic reforms and daring experiments; the average home owner could not. Woodlands, the one Title VII community ultimately deemed a success, turned itself around not through ecological experiments or social class mixing but by attracting the Houston Open golf tournament to its freshly sodded links.118 In the America of the 1970s, golf could win more home buyers than could good deeds.

By 1975, then, the old central cities appeared to be going down the drain, and the idealistic option of creating new cities was being jettisoned. The traditional single-focus metropolis with its powerful downtown hub had largely disappeared, yet Americans did not seem too eager to accept the carefully planned communities of James Rouse or Robert Simon as a substitute. Instead, an amorphous suburbia was prevailing. This sprawling, unfocused mass, exempted by the Supreme Court from responsibility for the nation’s racial ills, was becoming the American norm.