“We are on the threshold of a crucial era of change in the urban way of life,” wrote the respected architect-planner Henry S. Churchill in 1945. “Vast disintegrating and destructive forces are loose on the world,” he observed, causing Americans to seek “new physical urban settings” as well as “new social and economic patterns.”1 Although his vision of the future was not perfect, Churchill’s sense of impending radical change proved prescient. The American city was indeed on the brink of a revolution that would transform the metropolis and the lifestyle of the nation’s residents. During the following half century, traditional notions of the city would become obsolete, and concepts standard to the understanding of urban areas would grow increasingly outmoded. By 2000 changes in metropolitan life would draw into question the meaning of such terms as “urban” and “suburban,” as language and notions appropriate to the world of Henry Churchill seemed to fall short of explaining the new reality.
In 1945 the United States was an urban nation, dominated by clearly defined urban places with an anatomy familiar and comprehensible to experts like Churchill as well as city dwellers in general. The metropolis was a place with readily discernible edges, its lifestyle sharply distinguished from that of the rural “rubes” and “hicks,” many of whom had obtained the benefits of electricity only a decade before. Cities were in the nation’s vanguard, enjoying the latest technology and defining the cutting edge in fashion and culture. They were the centers of commerce, manufacturing, entertainment, and intellect where the luckiest Americans made and spent their fortunes. Manhattan and Chicago were magnets attracting the ambitious and adventurous, those who sought to get ahead and enjoy the best in life. The vast expanses beyond metropolitan America were the “sticks,” the home of those who remained behind.
At the core of each of these urban places was a single central business district, the undisputed focus of the metropolitan area. Although segregated by socioeconomic class in different residential zones, all metropolitan Americans recognized the downtown as the center of urban life. It was the unquestioned hub of finance, retailing, office employment, government, and transportation, and Americans viewed the metropolis as radiating from this single preeminent center. Each metropolis had one dominant heart, marked by bustling crowds and soaring skyscrapers, that was perceived as essential to the urban area's continued existence.
Metropolitan Americans not only perceived a single dominant focus for urban life, but also shared common space. The realities of urban existence forced the diverse elements of the populace to come into contact; it was difficult to escape the various fragments of the metropolitan mosaic. Because of rationing of gasoline and tires during World War II and because few families had more than one automobile, residents relied heavily on public transit. Middle-class men commuted to work on buses or streetcars that passed from middle-class neighborhoods through blue-collar districts, taking on working-class passengers, to the downtown area, a destination for residents from throughout the metropolis. Likewise, middle-class women shoppers traveled to downtown department stores by means of public transit, moving slowly through the various social zones of the city. On reaching downtown, they shared the sidewalks with businessmen, panhandlers, and working-class shoppers.
Metropolitan Americans not only shared common space, but had a common vested interest in urban governmental institutions. Although there were upper-middle-class suburban municipalities, the largest central cities still comprised a full range of neighborhoods from skid row to elite. The central-city government and central-city school administrators had to accommodate a socially and culturally diverse constituency, one that included all elements of the metropolitan social mix. Even residents of independent suburban municipalities generally worked and shopped in the central city, spending much of their lives within its boundaries. Their safety while shopping or working depended on central-city police and firefighting forces; the viability of their businesses depended on central-city tax rates and regulations. Despite the existence of suburban municipalities, then, the metropolis was to a great extent one city politically. Just as everyone recognized one common hub and shared common space, the overwhelming majority of metropolitan residents realized that the government of the central city affected their lives and mattered to the welfare of all.
Thus in 1945 the great mass of metropolitan Americans still lived an urban existence. Although divided socially, they inhabited a shared metropolis and could not avoid day-to-day contact with one another. They were separated by ethnicity and class, but they lived in the same city. They were different elements of a shared urban world.
Over the course of the following half century, however, the single-focus metropolis disappeared and was replaced by an amorphous sprawl of population without a unifying hub or culture. By the close of the twentieth century, most metropolitan Americans commuted in private automobiles by themselves, or with co-workers of a similar social background, from their homes in one suburb to their jobs in another suburb. If they ever entered the central city, they generally did so along depressed expressways, their vision shielded from the dreary neighborhoods on either side of them. They shopped in enclosed suburban malls that excluded panhandlers and other “undesirables” and insulated them from the social and climatic hardships of the metropolis. The malls catered to the consumption patterns of their social class, and this generally ensured that they would be mixing with people like themselves. An increasing number of Americans were living in gated communities, insulated from those they did not want to see, walled off from the bothersome or threatening elements of the population. Moreover, as residents of suburban municipalities, they did not share a common city government with the less affluent of the central city. And their businesses, jobs, and favorite stores were not in the central city. What happened in central-city government or schools did not personally affect them.
The middle-class Americans who chose to avoid the suburban lifestyle and live in the central city were most often those least dependent on central-city government services. The back-to-the-city movement appealed to childless young professionals who did not suffer personally from the poor quality of inner-city public schools. Central cities attracted these young adults as well as gays and others who did not want to share the American “norm” along the suburban fringe. In other words, by the close of the twentieth century, American metropolitan areas had become spatially and culturally fragmented, with enclaves for the middle-class nuclear family of father, mother, and two children; with special communities for senior citizens, where those over sixty could be isolated from the more youthful; with gentrifying communities for young singles and gays; and with incipient hubs of gentrification inhabited by artists and others who liked to deem themselves bohemian. Because of their poverty, still others were relegated to the areas no one else wanted. Moreover, these disparate groups did not need to mix on a day-to-day basis. Middle-class suburbanites remained in the outlying areas twenty-four hours a day, removed from the other elements of the metropolitan mix.
By the close of the twentieth century, then, the single-focus metropolises had disappeared. Sprawling metropolitan regions, which defied traditional notions of a city, had supplanted them. In Sunbelt Florida, a metropolitan region mushroomed along the Atlantic coast, an unbroken stretch of dense human habitation sprawling over one hundred miles from south of Miami to north of West Palm Beach with two parallel superhighways serving as the regional main streets. Metropolitan development spread over thousands of square miles in southern California, and only a small portion of the residents of the Atlanta metropolitan area actually lived or worked in the city of Atlanta. Across America, new hubs of production and consumption developed around freeway interchanges, and multiple commercial centers dotted metropolitan regions. The central city was no longer central; most Americans lived in regions, not cities.
During the second half of the twentieth century, a revolution in ethnic composition, perception, and politics also transformed metropolitan America. In 1945 and during the following three decades, Americans inhabited black-and-white metropolises. The great ethnic divide that strongly influenced social and political development was between European Americans and African Americans. Other ethnic groups existed, but when Americans discussed the race problem, they meant the troubled relations between blacks and whites. Race was black and white. The black–white division underlay settlement patterns, political debate, and attitudes on schooling and policing. In the black–white city of the post–World War II era, the dilemma of race relations between European Americans and African Americans was an ever-present reality that could not be ignored.
From the 1970s on, however, a new wave of immigration, especially from Latin America and Asia, increasingly complicated the racial picture and transformed the ethnic profile of metropolitan America. By the 1990s, Hispanics outnumbered blacks in many cities, and Asians were a growing presence. In some cities, the percentage of the population that was black actually decreased as newcomers seemed poised to displace African Americans as the preeminent minority group. In Miami, Cuban Americans clashed with African Americans; in Los Angeles, Korean immigrants battled local blacks. Well-to-do Chinese newcomers “invaded” suburban areas, and in sharp contrast to traditional notions of ethnic invasion, they did not depress property values but brought new prosperity to their suburban communities. Asian American physicians, scientists, and engineers joined their European American counterparts in prestigious outlying subdivisions, creating an ethnic diversity at odds with longstanding stereotypes of suburbia.
By 2000, then, Americans inhabited a radically different world from that of 1945. Metropolitan areas sprawled over hundreds of square miles without a distinguishable common center or clear-cut edges. The black–white world had given way to a metropolitan population of every shade, an ethnic world more complex and less sharply defined than in 1945. It was a world that Henry Churchill and his colleagues from 1945 would not have understood, a world that did not conform to their preconceptions of the city.
Although many commentators of the late twentieth century deplored the decentralization of American life while others were wary of the new wave of immigrants, America’s metropolitan revolution reflected the felt desires of millions of people who enjoyed unprecedented freedom and mobility. The automobile liberated Americans from dependence on centripetal public transit; federal mortgage guarantees permitted millions of young white couples to escape from tenements or their in-laws’ spare room and purchase a house and yard of their own; Social Security and pension plans freed senior citizens from the necessity of living with their children and allowed them to opt for gated communities tailored to their interests; sexual liberation permitted homosexuals to come out of the shadows and openly create enclaves for themselves; and a gender revolution liberated young women from expectations of early marriage and substituted the possibility of a single life in the central city. Meanwhile, liberalized immigration laws unlocked the nation’s doors to millions of newcomers from Latin America and Asia, whereas heightened ethnic tolerance and civil rights legislation lowered the barriers to suburbanization for diverse ethnic groups.
Not everyone shared equally in the benefits of prosperity and mobility. Many had no choice but to take the bus and inhabit run-down apartments in crime-ridden neighborhoods. To an unprecedented degree, however, changing technology and increasing wealth enabled metropolitan Americans to pursue different lifestyles and carve spatial niches tailored to their individual preferences. Decentralization and fragmentation undermined prospects for a united metropolitan community. Yet the amorphous pattern of 2000 seemed to reflect the amorphous nature of American life. Metropolitan Americans chose to disperse rather than cluster.
The result was a world that even scholars and journalists of the late twentieth century had a difficult time comprehending. As business moved to the metropolitan edge, these observers sought to label the new inside-out world in which the center was on the rim and the hub was increasingly peripheral. Some wrote of urban villages, others of edge cities, still others of technoburbs, and some settled for the generic post-suburbia. They knew that they were living in a world in which the traditional labels of urban and suburban no longer exactly fit, but they struggled to conceptualize the strange new environment around them. It simply did not make sense to people unable to escape the concepts of the past. Where did the so-called Philadelphia metropolitan area begin and the New York metropolitan area end? Was Princeton, New Jersey, a satellite orbiting around Philadelphia or New York City? Or was it an independent body, revolving around neither of the historic central cities? Was the larger city of Virginia Beach a suburb of the older city of Norfolk, and what about the populous adjoining municipalities of Chesapeake, Newport News, and Hampton? Was Mesa, Arizona, a city larger than Pittsburgh or Cincinnati, a suburb of Phoenix, and, if so, what made Mesa suburban and Phoenix urban? And what about the adjacent cities of Scottsdale and Glendale, both of which had over 200,000 residents?
Similarly, the experts floundered in their attempts to categorize the new ethnic world. In 1945 the Census Bureau had divided the population into white and nonwhite, the latter consisting primarily of African Americans and a relatively small population of East Asians, American Indians, and Pacific Islanders. A Hispanic category did not exist; Mexican Americans were whites. In the racial world of 1945, the categories of white and nonwhite worked. One was either white or not, and that was all that mattered. In 2000, however, the new ethnic world had destroyed the simplicity of past census dichotomies. The 2000 census gave Americans the option of choosing “one or more races” to describe their racial identities. About 6.8 million respondents selected this option, most often identifying themselves as white and some other race. Moreover, more than twice that number passed over the white, black, Asian, American Indian, and Pacific Islander categories and identified themselves simply as “some other race,” although it was unclear what other races there were. Thus millions of Americans did not conceive of themselves as fitting into the traditional convenient categories of race. In 1945 metropolitan Americans knew quite clearly whether they were black or white, and if anyone had any doubts about racial category, Jim Crow laws in the South and less formal social restraints in the North would make clear their racial place in society. At the beginning of the twenty-first century, the racial picture had become murky, and the validity of racial categorization seemed in question.
The metropolitan revolution of the second half of the twentieth century thus swept away the spatial and racial certainties of the past. The black–white, single-focus metropolis with clearly identifiable central cities and dependent suburbs yielded to a strange new world that traditional thinkers could barely comprehend. This was the world in which Americans of the early twenty-first century would have to live. This was the new scenario with which they would have to come to terms and whose problems they would have to confront.