Francophone populations, though usually neglected in recounting the history and development of the South, have nonetheless played a prominent role, exemplifying ethnic-racial complexities. The first settlers in North America seeking refuge for religious reasons were not the Pilgrims at Plymouth Rock in 1620, but French Huguenots in 1562 at coastal Charlefort (sometimes also called Charlesfort), located in what is now South Carolina. Because of privation and Spanish aggression, the settlement at Charlefort did not last long, but toward the end of the 17th century hundreds more Huguenots became part of the founding population of South Carolina, especially in Charleston. Many of their progeny grew wealthy, and indeed for a time Charleston rivaled every other colonial city on the Atlantic Coast for affluence, because of the extensive development of slave-based rice plantations. Despite sporadic testimonies of the subsequent survival of French, it appears that these prominent Huguenots (the most famous among them being “Swamp Fox” Francis Marion) assimilated rather quickly to the English language that predominated in the surrounding colonial setting. Much more for the sake of tradition than for linguistic necessity, the Huguenot Church at the corner of Church and Queen streets in Charleston still conducts an occasional service in French according to the 18th-century liturgy of les Eglises de la Principauté de Neuchâtel et Valangin.
Though Charleston is unrivaled in Old South tradition, its French Quarter cannot contend with the French Quarter of New Orleans, La., as the leading French-related cultural icon of the South. Indeed, the best-known and longest-surviving Francophone population in the South—and, until recently, the largest—has been located in Louisiana and its environs since the early 18th century. Robert Cavelier de La Salle descended the Mississippi from New France to its mouth in 1682 and claimed the entire Mississippi Valley and its tributaries for Louis XIV of France (hence, la Louisiane). Soon after, French settlements, or posts, were founded at Biloxi (1699), Mobile (1701), Natchitoches (1714), and La Nouvelle Orléans (1718). The varieties of French from the founding of Louisiana up to the present time form a complex picture, much of which is still speculative. Records clearly attest that French was the language of the colonial administrators, and some form of “popular French” was certainly in wide use. However, French was probably not the first language of many early colonists, who would have spoken a regional patois of France or who were often recruited from Germany or Switzerland. Moreover, communication with early indigenous peoples took place not through French but through a regional lingua franca, Mobilian Jargon, which served as the language of trade and diplomacy. It is noteworthy that very few French borrowings penetrated into the indigenous languages of the Southeast, such as the language of the Choctaw, with whom the French had a strong and long-standing military alliance.
The largest single population of French speakers to arrive in Louisiana during the colonial period did not come directly from France. French colonists who left the west central provinces of France and arrived in the early 17th century in Acadia (modern Nova Scotia and New Brunswick) were expelled from there by the British in 1755. Between 1765 and 1785, approximately 3,000 of them migrated to Louisiana (under Spanish administration from 1763) and occupied arable land principally along the bayous of southeastern Louisiana and the prairies of south-central Louisiana. This prolific population was destined to become the standard-bearer for maintaining French in Louisiana to the present day.
During the Spanish administration, the plantation economy of Louisiana began to blossom, dramatically affecting the French language there. On the one hand, the massive importation of slaves, coming directly from West Africa for the most part, apparently led to the formation of a French-based Creole language. Theories vary as to the genesis of Louisiana Creole, as it is usually referred to by scholars. The debate surrounding the origin of creole languages is complex, but, at the risk of oversimplifying, the two poles of opposition can be summed up here. Some scholars contend that creole languages were spontaneously generated on (large) plantations where slaves were linguistically heterogeneous and did not share a common tongue. According to this view, the structural parallels among creole languages are attributable either to linguistic universals or to the interaction of a particular set of African and European languages. Others contend that most creole languages are simply daughter dialects of a pidgin associated with the slave trade, and though the lexicon can vary from one site to another owing to different vocabulary replacement, their basic structure remains the same. Regardless, Louisiana Creole became the native mode of communication within the slave population of Louisiana. Very frequently it was also the first language of the slave masters’ children, who were typically raised by domestic bondservants.
In tandem with this, the wealth of what was then known as the Creole society grew. Creole society is not to be confused with the entire population of Louisiana Creole speakers but is composed rather of the affluent European-origin planter class and also of mostly biracial Creoles of Color who held considerable social standing during the Spanish administration, sometimes being plantation owners themselves. Their considerable resources allowed for widespread schooling among Creoles and the resultant acquisition of the evolving prestige French of France, by virtue of boarding schools in France and Louisiana or by private tutoring. Referred to as Plantation Society French in recent scholarship, this brand of French has all but disappeared, in part because of the ruin of Creole society in the aftermath of the Civil War and the resultant severing of ties with France and in part because of the rather swift acquisition of English (even before the Civil War) by members of Creole society. English became economically and socially important under the new American administration, which brought a massive influx of Anglophones, both free and slave, into Louisiana after the Louisiana Purchase of 1803. Not all newcomers after 1803 were English speakers, however. Prior to the Civil War, the affluence of Louisiana attracted additional Francophone immigrants, often educated, whose ranks were swelled by defeated Bonapartistes banished from France and by planters (with their creolophone slaves in tow) fleeing the successful slave revolt in Saint-Domingue (modern Haiti). Not all of the 19th-century newcomers settled in Louisiana: a group of Bonapartiste exiles founded Demopolis (in Alabama) in 1817, just prior to statehood.
Dependent on wealth generated by the plantation system and vulnerable to competition with English for maintaining socioeconomic standing, once-prestigious Plantation Society French disappeared from use rather quickly. The vast majority of the agrarian, lower-class Acadian (or “Cajun”) population worked small farms and did not operate plantations (though some Acadians did gentrify and merge with Creole society). Their autonomy and relative isolation led to a greater longevity for Cajun French. Significant numbers of immigrants (German, Irish, Italian, for example) in some places even assimilated to the use of Cajun French for a time. Meanwhile, Louisiana Creole was undergoing its demise, ultimately because of the breakup of the plantation system itself, but even before that because of the immense influx of Anglophone slaves during the 19th century, when the importation of foreign slaves was prohibited, leading to massive importation of slaves from states farther east to meet the demand in booming Louisiana. Nevertheless, the social isolation of its speakers led Louisiana Creole to fare better than Plantation Society French, and today approximately 4,500 mostly elderly speakers remain, with the largest concentrations in Pointe Coupee and St. Martin parishes.
Acadiana, the 22-parish area where most Cajun French speakers still live, can be roughly described as a triangle whose apex is in Avoyelles Parish in the center of the state and whose base extends along the coast from eastern Texas to the Mississippi border. The Cajun French spoken there did not begin its demise until the 20th century, with the advent of compulsory schooling in English in 1916. This factor, coupled with better state infrastructure and accelerated exposure to mass media, eroded the social isolation of the Cajun population and led inexorably to assimilation to the English standard by younger ethnic Cajuns. Despite an important resurgence in Cajun pride since the late 1960s (the “Cajun Renaissance”) and attempts to bolster French in Louisiana, only the smallest fraction of children are now acquiring fluency in any variety of French. The few thousand who do so are not acquiring it in natural linguistic communities but primarily in French immersion programs in the public schools (introduced in 1968 when the state created the Council for the Development of French in Louisiana, CODOFIL, to reclaim French and promote bilingualism). Nevertheless, the Cajun Renaissance has spawned preservationist tendencies in some households, has resulted in a modicum of literary production by Cajun poets, playwrights, and storytellers, and is linked to the resurgence in popularity of Cajun music. Authenticity demands that Cajun musicians perform pieces in French and that even young songsmiths compose a portion of their newest lyrics in Cajun French.
Census figures for 2000 indicate a Francophone population of 194,100 for Louisiana. Even adding the creolophone population (4,685 in 2000) to that figure, the total fell far short of the combined French-speaking and creolophone populations of Florida in 2000 (125,650 + 211,950 = 337,600). Though European immigrants must also be taken into consideration, Florida is the new front-runner in the South, primarily because of its large number of French Canadian “snowbirds” and recent Haitian refugees. The Canadians, though elderly, represent a sustainable population as long as Florida remains an attractive retirement location, whereas the mostly elderly Francophone population of Louisiana is not self-replacing, except in a very small minority of cases where grandchildren are being reintroduced to French in a conscious attempt to preserve a linguistic legacy.
A description of the salient features of Cajun French and Louisiana Creole must take into account archaisms harking back to the French of the colonizers as well as innovations resulting from isolation and contact with other languages, especially English. For example, in various vocabulary items such as haut ‘high’ and happer ‘to seize’, both Cajun French and Louisiana Creole preserve the archaic pronunciation of the initial h, whereas the initial h has fallen silent in the contemporary French of France. At the grammatical level, both Cajun French and Louisiana Creole preserve the progressive modal après (sometimes apé or ap) of western regional France, denoting an ongoing action—for example, je sus après jongler ‘I’m thinking’ (Cajun French) or m’apé fatigué ‘I’m getting tired’ (Louisiana Creole). This usage is entirely absent in the standard French of France. Concerning innovations, historic contact with indigenous languages has enriched the vocabularies of French dialects in Louisiana with words such as chaoui ‘raccoon’ and bayou (subsequently borrowed into English). Contact with dominant English has had a profound impact on both Cajun French and Louisiana Creole. Assimilated borrowings are easy to find (récorder ‘to record’), but because of near-universal bilingualism among Cajun French speakers and creolophones in Louisiana, it is even more common to hear the insertion of English vocabulary items into French or creole conversation: j’ai RIDE dessus le BIKE ‘I rode the bike.’ Imitative calques of English phrasing are also common, as in the case of a Cajun radio announcer reciting the standard expression apporté à vous-aut’ par ‘brought to you by.’
Though overlapping vocabularies are extensive, Louisiana Creole is distinct from Cajun French in a variety of ways. Of particular note in Louisiana Creole are a different system of pronouns, more frequent use of nouns that have permanently incorporated part or all of what were once preceding French articles, placement of the definite article after its noun, absence of linking verbs, absence of inflection on main verbs, and use of various particles to indicate tense. Compare the Louisiana Creole sentence yé té lave zonyon-yé ‘they washed the onions’ with ils ont lavé les oignons in Cajun French.
English in Louisiana dethroned Plantation Society French as the most highly valued idiom and may have temporarily protected nonstandard Louisiana Creole and Cajun French from absorption by prestigious standard French (such was the demise of patois and regional French in France). However, today English has become a formidable competitor, with the imminent prospect of supplanting all traditional varieties of Louisiana French and Louisiana Creole in the region where both thrived for over two-and-a-half centuries. Yet despite the demise of French as a first language in Louisiana, its influence remains noticeable in the spoken English of the region. Even in New Orleans, where the transition from French to English as the language of everyday communication has long been complete, vestiges of French can be found in colloquial calques such as get down ‘get out’ (of a vehicle) and borrowings such as banquette ‘sidewalk,’ parrain ‘godfather,’ and beignet (type of fried dough). This phenomenon is even more common in rural Cajun communities, where one hears French borrowings in remarks such as we were so honte (that is, ‘embarrassed’) and I have the envie (that is, ‘the desire’) for rice and gravy. Less common but still used are structural calques from French: Your hair’s too long. You need to cut ’em (hair as plural, corresponding in use to les cheveux) or That makes forty years we married (calqued from ça fait quarante ans qu’on est marié).
English is not the only force arrayed against the survival of French in Louisiana. In Plaquemines Parish, Hurricane Katrina, in 2005, decimated one of the few remaining non-Cajun Francophone enclaves, and ecological degradation, such as that which augmented the devastating effects of the storm, is contributing to the breakup of some of the more isolated Francophone communities (who mostly self-identify as Houma Indians) in Terrebonne Parish, where French has been best preserved among younger speakers.
MICHAEL D. PICONE
University of Alabama
AMANDA LAFLEUR
Louisiana State University
Barry Jean Ancelet, Cajun and Creole Folktales: The French Oral Tradition of South Louisiana (1994); Carl A. Brasseaux, French, Cajun, Creole, Houma: A Primer on Francophone Louisiana (2005); Marilyn J. Conwell and Alphonse Juilland, Louisiana French Grammar (1963); Thomas A. Klingler, If I Could Turn My Tongue Like That: The Creole Language of Pointe Coupee, Louisiana (2003); Kevin J. Rottet, Language Shift in the Coastal Marshes of Louisiana (2001); Albert Valdman, ed., French and Creole in Louisiana (1997).
(ca. 1901–1960) WRITER AND FOLKLORIST.
Born in either 1891 or 1901—the latter is normally given as the date of birth but recent studies suggest an earlier date—in the all-black town of Eatonville, Fla., Zora Neale Hurston became a distinguished novelist, folklorist, and anthropologist. She was next to the youngest of eight children, born the daughter of a Baptist minister who was mayor of Eatonville. Her mother died when Hurston was 9, and she left home at 14 to join a traveling show. She later attended Howard University, where she studied under Alain Locke and Lorenzo Dow Turner, and she earned an A.B. degree from Barnard College in 1928, working with Franz Boas. She became a well-known figure among the New York intellectuals of the Harlem Renaissance in the mid-1920s and then devoted the years 1927 to 1932 to field research in Florida, Alabama, Louisiana, and the Bahamas. Mules and Men (1935) was a collection of black music, games, oral lore, and religious practices. Tell My Horse (1938) was a similar collection of folklore from Jamaica and Haiti.
Hurston published four novels—Jonah’s Gourd Vine (1934), Their Eyes Were Watching God (1937), Moses, Man of the Mountain (1939), and Seraph on the Suwanee (1948). Her autobiography, Dust Tracks on a Road, appeared in 1942. Married and divorced twice, she worked for the WPA Federal Theatre Project in New York (1935–36) and for the Federal Writers’ Project in Florida (1938). She taught briefly at Bethune-Cookman College in Daytona Beach, Fla. (1934), and at North Carolina College in Durham (1939), and she received Rosenwald and Guggenheim fellowships (1934, 1936–37).
In her essay “The Pet Negro System,” Hurston assured her readers that not all black southerners fit the illiterate sharecropper stereotype fostered by the northern media. She pointed to the seldom-noted black professionals who, like herself, remained in the South because they liked some things about it. Most educated blacks, Hurston insisted, preferred not to live up North because they came to realize that there was “segregation and discrimination up there, too, with none of the human touches of the South.” One of the “human touches” to which Hurston referred was the “pet Negro system” itself, a southern practice that afforded special privileges to blacks who met standards set by their white benefactors. The system survived, she said, because it reinforced the white southerner’s sense of superiority. Clearly, it was not a desirable substitute for social, economic, and political equality, but Hurston’s portrayal of the system indicated her affirmative attitude toward the region, despite its dubious customs.
Hurston had faith in individual initiative, confidence in the strength of black culture, and strong trust in the ultimate goodwill of southern white people, all of which influenced her perceptions of significant racial issues. When she saw blacks suffering hardships, she refused to acknowledge that racism was a major contributing factor, probably because she never let racism stop her. Hurston’s biographer, Robert E. Hemenway, notes that “in her later life she came to interpret all attempts to emphasize black suffering . . . as the politics of deprivation, implying a tragedy of color in Afro-American life.”
After working for years as a maid in Miami, Hurston suffered a stroke in early 1959 and, alone and indigent, died in the Saint Lucie County Welfare Home, Fort Pierce, Fla., on 28 January 1960. Alice Walker led a “rediscovery” of Hurston, whose works have become an inspiration for black women writers.
ELVIN HOLT
University of Kentucky
Valerie Boyd, Wrapped in Rainbows: The Life of Zora Neale Hurston (2002); Robert E. Hemenway, Zora Neale Hurston: A Literary Biography (1977); Zora Neale Hurston, I Love Myself, ed. Alice Walker (1979); Carla Kaplan, ed., Zora Neale Hurston: A Life in Letters (2002); Alice Walker, In Search of Our Mothers’ Gardens (1983).
Following decades of racial prejudice, combined with economic envy and sexual anxiety over miscegenation and nationality, 120,000 Japanese Americans during World War II were indiscriminately subjected to eviction from their homes, expulsion from the West Coast, detention in 16 makeshift compounds, and then concentration into 10 longer-term camps. Empowered by President Franklin Roosevelt’s Executive Order 9066 of 19 February 1942, military and civilian officials carried out these constitutionally dubious procedures with guns and euphemism. The forced migrations were called, in turn, “evacuation,” implying rescue; “removal,” suggesting it was voluntary; “assembly,” a purported freedom; and “relocation,” a mere transfer.
However, after a grueling four-day train journey, blinds down, Americans of Japanese descent arriving at the two camps in southeast Arkansas expected no relief. Disembarking, they saw guard towers, barbed wire fencing, and—at both camps—nearly a square mile of barracks. Previously referred to by their dehumanizing family numbers, attached to clothing and suitcases, they now would be known by their three-part address: block and building number and the letter of their one-room apartment, none more than 20 by 24 feet. Ever after, Japanese Americans would remember where they or their ancestors had been incarcerated during the war. Here, near the banks of the Mississippi, in rural Chicot and Desha counties, respectively, it was the Jerome and Rohwer camps.
With peak populations of 8,500 each, Jerome and Rohwer brought together captives from southern California and the Central Valley, along with white staffers, mostly Arkansans. They worked side by side in similar jobs, as teachers and legal professionals, for example, though the Japanese Americans were paid only a fraction of what the white staffers earned. Still, Japanese American women, less tied to onerous domestic duties, worked at the same rates as Japanese American men, enjoying unexpected autonomy. Most jobs were voluntary, though boys and men were forced into hazardous logging and woodcutting operations for winter fuel, resulting in dozens of injuries and at least three deaths. Labor unrest was characterized by strikes and harsh administration crackdowns, especially at Jerome. Inmates were targeted for indoctrination, through so-called Americanization campaigns in the schools and adult classes. The largely Buddhist population endured aggressive Protestant revivals.
English-speaking Christians gained special privileges, including greater freedom of movement. Leave was selectively granted for temporary work contracts outside the camps and for college education. Japanese Americans enrolled at otherwise segregated southern campuses, including elite private universities such as Emory, Rice, Tulane, and Vanderbilt and flagship state universities in Florida, North Carolina, and Texas, among others. From all 10 camps and from the Territory of Hawaii, thousands volunteered—more were later drafted—for military service in the 442nd Regimental Combat Team, which trained at Camp Shelby in Mississippi. Though segregated into their own units and barracks, the 442nd competed in “white” rather than “colored” sports leagues, at Shelby and across the South. Generally under local biracial structures of Jim Crow, authorities encouraged soldiers, students, contract laborers, and day-pass holders to use white facilities and to frequent white establishments. Many of these establishments turned them away.
At the same time, officials conspired to delimit Japanese American interaction with whites and with blacks. Interracial dating and marriage were particularly proscribed, and given the segregation of USO clubs around Camp Shelby, Japanese American women at Jerome and Rohwer were regularly bussed 250 miles for dances with the 442nd. Leave was denied to inmates with job offers from black schools and businesses. When black agricultural workers took collective action against local planters, these labor hotspots were declared off-limits to Japanese Americans. Though sometimes joining African Americans, as a matter of principle, in the back of the bus, Japanese Americans more commonly sought and occasionally secured the benefits of higher status, even blacking up with their white captors for camp minstrel shows.
When government agencies finally attempted to ascertain individual allegiance, rather than assuming collective guilt, Japanese American discontent hardened. Across all 10 camps in the West and the South, an ill-conceived, ill-administered loyalty questionnaire both confounded and enraged prisoners. And at higher percentages than at any other camp, Jerome protesters were reassigned to the Tule Lake camp in California, reconfigured as an isolation center. Among them was Tokio Yamane, who in sworn testimony to Congress described officers torturing him and others there. Yamane and hundreds more renounced their American citizenship and moved to Japan. No Japanese American was ever convicted of espionage.
Though resilient and productive, growing almost all their own food and sourcing other essentials through an innovative system of cooperative stores, Japanese Americans in Arkansas experienced yet another wrenching upheaval. Called “resettlement” by authorities, the closing of the camps involved a calculated dispersal of Japanese Americans in order to break up prewar ethnic enclaves: a massive population redistribution from west to east. Arkansas inmates proposed converting Jerome and Rohwer into agrarian cooperative colonies, given their marked improvements to local infrastructure. Instead, the government sold it all and kicked them out. Just as early in the incarceration a despairing John Yoshida committed suicide at the railroad tracks near Jerome—haunting photos of his decapitated body lingering in the archives—so too in the wake of Hiroshima, ancestral prefecture to the largest number of Japanese Americans, did Julia Dakuzaku take her life after release from Arkansas.
The most cynical euphemism of all, persisting to this day, “internment” is a concept recognized in international law for the wartime detention of “enemy aliens,” citizens of combatant nations. The vast majority—over 70 percent—of the 120,000 people imprisoned in American concentration camps were birthright U.S. citizens; the rest were decades-long residents who were forbidden naturalization. Though these Americans of Japanese descent advocated for and eventually won restitution as partial recompense for loss of income and property, the United States has never accepted responsibility for loss of life.
JOHN HOWARD
King’s College London
Roger Daniels, in Nikkei in the Pacific Northwest: Japanese Americans and Japanese Canadians in the Twentieth Century, ed. Louis Fiset and Gail M. Nomura (2005); John Howard, Concentration Camps on the Home Front: Japanese Americans in the House of Jim Crow (2008); Emily Roxworthy and Amit Chourasia, Drama in the Delta: Digitally Reenacting Civil Rights Performances at Arkansas’ Wartime Camps for Japanese Americans, www.dramainthedelta.com; Jason Morgan Ward, Journal of Southern History (February 2007).
Jazz can be defined by its musical elements, such as improvisation, syncopation, blue notes, cyclical forms, and rhythmic contrasts, but also through its place in the social, economic, and cultural history of the United States. The subject of its racial identity has proved particularly thorny for musicians, critics, and audiences, for despite the diversity of jazz styles and performers, it is rooted in African American musical traditions at the turn of the 20th century. The tension between its aesthetic breadth and racial specificity has long animated debates regarding ownership, politics, and history. Rather than simplifying matters, its New Orleans origins demonstrate that the birth of jazz was enmeshed in the historical development of modern African American identity and culture.
Among the most prosperous and populous cities in the antebellum South—central to both international commerce and the domestic slave trade—19th-century New Orleans was musically and culturally rich. Contemporaries noted with wonder the proliferation of opera houses, orchestras, parades, and carnivals, as well as the slaves and free blacks who gathered at Congo Square on Sundays to dance and perform. With the end of American slavery, waves of freedpeople left the rural South for cities like New Orleans. They brought with them African-influenced musical traditions, including spirituals and field songs as well as instrumental bands and arrangements formed in the lower Mississippi Delta. By the end of the 19th century, the city was an amalgam of English, French, and Spanish cultures, but its structure was soon transformed in the wake of Jim Crow racial codes spreading throughout the South. Caribbean Creoles, or gens de couleur, had previously held significant political and economic power as an intermediate group between white Europeans and blacks. However, with the Supreme Court’s 1896 Plessy decision upholding the doctrine of separate but equal, the situation changed. Longtime Creole residents and recent black migrants, who continued to enter the city in ever-greater numbers seeking employment, found themselves on the same side of a stricter color line.
While Jim Crow was racially redefining many of its pioneers and performers, the musical components of jazz came together in New Orleans. It is difficult to know what early jazz sounded like without recorded documentation, but scholars believe that fundamental transformations in the form and instrumentation of popular music occurred between 1900 and 1910. Hoping to fill a dance hall, attract a crowd, or enliven a march, Uptown blacks and Downtown Creole musicians crossed previous social and geographic divides. They found new techniques and instrumentations in the various quarters of New Orleans and transformed well-known arrangements and popular songs by syncopating instrumental parts, improvising melodies over marchlike rhythms, and adding melodic strains and blue notes. Musicians drew on a diverse repertory, which included ragtime, marches, blues, popular songs, dance music, Spanish Caribbean music, and spirituals. They played in brass bands and in smaller ensembles made up of a rhythm section—drums, piano, banjo, bass, tuba—and a lead section, which could include a clarinet, trumpet, cornet, or trombone. While this new sound was associated with saloons in the red-light district of Storyville, groups like the Golden Rule and the Eagle played at store openings, parades, picnics, funerals, and balls. Bandleaders such as Buddy Bolden, Manuel Perez, Nick LaRocca, and Kid Ory cultivated a dense and competitive musical environment that circulated new rhythmic and melodic styles. Many of the soloists known for the New Orleans jazz style, like Joe “King” Oliver, Bunk Johnson, Louis Armstrong, Jelly Roll Morton, and Sidney Bechet, trained within an all-male environment that valued individual technique as well as ensemble arrangements and fraternal support.
Through itinerant musicians and expanding transportation networks, many of the identifiable musical precedents of jazz continued to develop on their own beyond New Orleans. In St. Louis, ragtime and blues formed the root of a distinctive boogie-woogie piano-playing style. Meanwhile, the Kansas City style achieved national recognition through musicians like Count Basie and Mary Lou Williams. The development of jazz was further aided by the large-scale migration of African Americans that began at the turn of the century. Like other black Americans who left the South, jazz musicians sought work in Chicago, New York, and California, where they found new audiences but maintained southern professional networks. The migration was accompanied by major advances in radio and recording technology that in turn made jazz a national and international music. White New Orleans musicians in the Original Dixieland Jazz Band made the first jazz recordings, but soon black southerners, including Oliver and Armstrong, were putting out records and touring Europe. As jazz gained global recognition, the American South remained an important cultural symbol. Jazz enthusiasts in the 1920s flocked to New York’s Cotton Club and Chicago’s Plantation Club, where racial stereotypes of the South were reproduced for urban audiences in the North. In the 1930s, the revival of the Dixieland style represented authenticity in the face of the rising popularity of big band arrangements. The South also remained politically important for many musicians. A wide range of performers supported the southern civil rights movement by fund-raising for the Congress of Racial Equality (CORE), speaking out against civil rights abuses, and creating music, like Max Roach’s 1960 We Insist! The Freedom Now Suite. From its origins to contemporary efforts at preservation, the South has been a site of opportunity and struggle for jazz musicians, shaping the roots of this racially defined yet culturally cosmopolitan music.
CELESTE DAY MOORE
University of Chicago
David Ake, Jazz Cultures (2002); Sidney Bechet, Treat It Gentle (1960); Charles B. Hersch, Subversive Sounds: Race and the Birth of Jazz in New Orleans (2007); William H. Kenney, Chicago Jazz: A Cultural History, 1904–1930 (1993); Ingrid Monson, Freedom Sounds: Civil Rights Call Out to Jazz and Africa (2007); Jelly Roll Morton, Jelly Roll Morton: The Complete Library of Congress Recordings by Alan Lomax (1938); Burton Peretti, The Creation of Jazz: Music, Race, and Culture in Urban America (1992); Eric Porter, What Is This Thing Called Jazz? African American Musicians as Artists, Critics, and Activists (2002).
(1929–1968) CIVIL RIGHTS LEADER.
Born into a middle-class black family in Atlanta, Ga., on 15 January 1929, Martin Luther King Jr. emerged as the key figure in the civil rights crusade that transformed the American South in the 1950s and 1960s. As a student at Atlanta’s Morehouse College (1944–48), he majored in sociology and developed an intense interest in the behavior of social groups and the economic and cultural arrangements of southern society. King’s education continued at Crozer Theological Seminary (1948–51) and Boston University (1951–55), where he studied trends in liberal Christian theology, philosophy, and ethics, while also engaging in an intellectual quest for a method to eliminate social evil. With a seminary degree and a Ph.D. from Boston, King lived remarkably free of material concerns and personified the intellectual-activist type that constituted the principal model for W. E. B. Du Bois’s talented-tenth leadership theory.
Although mindful of how poverty and economic injustice victimized both races in the South in his time, King understood the social stratification of the region largely in terms of the basic distinctions between powerful whites and powerless Negroes. Framing the struggle as essentially a clash between loveless power and powerless love, King rose to prominence in the Montgomery Bus Boycott in 1955–56, and he and his Southern Christian Leadership Conference (SCLC) later led nonviolent direct-action campaigns for equal rights and social justice in Albany, Birmingham, St. Augustine, Selma, and other southern towns. King’s celebrated “I Have a Dream” speech during the March on Washington on 28 August 1963 firmly established him as the most powerful leader of the black freedom struggle.
After receiving the Nobel Peace Prize in 1964, King moved toward a more enlightened and explicit globalism. Convinced that the struggle for basic civil and/or constitutional rights had been won with the Civil Rights Act of 1964 and the Voting Rights Act of 1965, he turned more consciously toward economic justice and international peace issues. He saw the interconnectedness of racial oppression, class exploitation, and militarism and moved beyond integrated buses, lunch counters, and schools for blacks to highlight the need for basic structural changes within the capitalistic system. He recognized that economic justice was a more complex and costlier matter than civil rights and that poverty and economic powerlessness afflicted both people of color and whites. He prophetically critiqued the wealth and power of the white American elites and chided the black middle class for its neglect of and indifference toward what he labeled “the least of these.” King also fought for the elimination of slum conditions in Chicago in 1965–66, launched a Poor People’s Campaign in 1967, and participated in the Memphis Sanitation Workers’ Strike in early 1968. His attacks on capitalism, his call for a radical redistribution of economic power, his assault on poverty and economic injustice in the so-called Third World, and his cry against his nation’s misadventure in Vietnam were all aimed at the same structures of systemic social evil. King framed his vision in terms of the metaphors of “New South,” “American Dream,” and “World House,” all of which embodied what he considered the highest human and ethical ideal, namely, the beloved community, or a completely integrated society and world based on love, justice, human dignity, and peace.
King’s broadened social vision can be understood in terms of democratic socialism and the tactics of massive civil disobedience and nonviolent sabotage that he thought would be required to achieve this ideal. While traveling to Oslo, Norway, to receive the Nobel Prize, he saw democratic socialism at work in the Scandinavian countries. In King’s estimation, democratic socialism, which he considered more consistent with the Christian ethic than either capitalism or communism, would allow for the nationalization of basic industries, massive federal expenditures to enhance city centers and to provide employment for residents, a guaranteed income for every adult citizen, and universal education and health care, thus amounting to the kind of sweeping economic and structural changes essential for the creation of a more just, inclusive, and peaceful society.
King was assassinated in Memphis, Tenn., on 4 April 1968, weeks before his planned Poor People’s Campaign was launched. Economic justice and international peace remain as the core issues in his unfinished holy crusade. In the half century since his death, some conservative forces have increasingly sought to use him as a kind of sacred aura for their own political ends, particularly in their attacks on affirmative action, immigration, reparations, and government spending for social programs.
LEWIS V. BALDWIN
Vanderbilt University
Lewis V. Baldwin, The Voice of Conscience: The Church in the Mind of Martin Luther King, Jr. (2010); Clayborne Carson, ed., The Autobiography of Martin Luther King, Jr. (1998); Kenneth L. Smith, Journal of Ecumenical Studies (Spring 1989); William D. Watley, Roots of Resistance: The Nonviolent Ethic of Martin Luther King, Jr. (1985).
The Ku Klux Klan (KKK) is the oldest documented white supremacist group in the United States. Historically, the KKK precipitated, engaged in, and supported numerous acts of intimidation and violence in the South. Bombings, murders, assaults, and other violent acts were sanctioned by the social norms of southern culture during a time in which KKK members were also employed in positions of power (for example, as sheriffs and judges). Their place in society meant that whites who antagonized and victimized blacks and others in the South often escaped arrest, prosecution, and sentencing. Although the first two waves of KKK members benefited from a cohesive organizational structure, members of the third wave arose from dozens of independent groups that utilized the KKK moniker during the 1960s in resistance to the civil rights movement.
The rise of black freedom struggles in the 1950s provoked a massive resistance on the part of southern whites. The KKK reemerged as the most violent expression of this resistance. KKK members were implicated in a series of incidents, including the 1963 church bombing in Birmingham, Ala., that killed four young girls, the 1963 assassination of NAACP organizer Medgar Evers, and the 1964 murder of three civil rights workers in Neshoba County, Miss. The nationwide media coverage of the aftermath of these violent incidents contributed to the KKK’s increasingly unfavorable image outside the South.
During the 1970s and 1980s, racially motivated acts of violence perpetrated by KKK members did not cease entirely. For instance, in 1979, in what came to be known as the Greensboro Massacre, KKK members (in collaboration with Nazi Party members) murdered five protesters at an anti-Klan rally in Greensboro, N.C. In 1980, four older black women were shot after a KKK initiation rally in Chattanooga, Tenn. In 1981, the murder of Michael Donald in Mobile became the last documented lynching in Alabama. Unlike earlier incidents in which cases were dismissed or offenders were acquitted by all-white juries, the perpetrators of these acts were criminally prosecuted for their crimes. In some instances, KKK organizations faced civil suits, resulting in their financial collapse (for example, United Klans of America and Imperial Klans of America). The Southern Poverty Law Center’s founder, Morris Dees, led civil cases against these groups, and the U.S. government increased its oversight. These factors made it increasingly unacceptable for the KKK to resort to violence as a means to further its political agenda.
Today the Southern Poverty Law Center estimates that thousands of KKK members are split among at least 186 KKK chapters. These fragmented factions have been weakened by “internal conflicts, court cases, and government infiltration.” However, they still disseminate hate against blacks, Jews, Latinos, immigrants, homosexuals, and Catholics. Instead of violence, some of today’s KKK organizations focus on collective political action by participating in and restructuring the government. Others focus on marketing strategies in order to appeal to mainstream America, with the intention of increasing recruitment and disseminating their ideology to a wider audience. Although violent acts, like the 2008 murder of a woman in Louisiana after a failed KKK initiation, do still randomly occur, violence is no longer considered a socially accepted means to achieving white hegemony in the South.
STACIA GILLIARD-MATTHEWS
West Virginia University
Josh Adams and Vincent Roscigno, Social Forces (December 2005); Chip Berlet and Stanislav Vysotsky, Journal of Political and Military Sociology (Summer 2006); David Chalmers, Backfire: How the Ku Klux Klan Helped the Civil Rights Movement (2005), Hooded Americanism: The History of the Ku Klux Klan (1987); David Holthouse, The Year in Hate (2009); Diane McWhorter, Carry Me Home: Birmingham, Alabama: The Climactic Battle of the Civil Rights Revolution (2001); Pete Simi and Robert Futrell, American Swastika: Inside the White Power Movement’s Hidden Spaces of Hate (2010).
The Ku Klux Klan was the name popularly given to hundreds of loosely connected vigilante groups that emerged in the early Reconstruction era in locations throughout the South. These groups used violence and threats, primarily against freedpeople, local white Republicans, immigrants from the North, and agents of the federal government, to gain political, social, cultural, and economic benefits in the wake of the war. Although some prominent figures attempted to organize the Klan and use it as political tool, the Klan was never effectively centralized. Klan groups proliferated rapidly in 1868 and saw a second peak in 1870–71. The Klan movement was in decline by late 1871 and had almost disappeared by the end of 1872.
The first group to call itself the Ku Klux Klan began in Pulaski, Tenn., probably in summer 1866. The six original members were young, small-town professionals and Confederate veterans. This group was at first fundamentally a social club. Members performed music and organized entertainments. Significantly, they also introduced a particularly elaborate version of the rituals and costumes common to fraternal associations.
As the Pulaski Klan spread, local elites became interested in its potential as a political organization in opposition to the government of Gov. William Gannaway Brownlow. In an April 1867 meeting in nearby Nashville, they produced a governing document called the Prescript. The Prescript described the Klan as a political organization opposed to black enfranchisement and in favor of southern autonomy and the strengthening of white political power. It also detailed a complex and rigidly hierarchical organization. At this time, Nathan Bedford Forrest was probably chosen as the Klan’s first Grand Wizard. Other prominent men like Albert Pike, Matthew Galloway, and John B. Gordon joined around this time and used their influence to spread the organization.
The tightly organized, politically focused regional Klan envisioned by the Prescript never materialized. Each state faced substantially different political situations, making coordination difficult; Klan groups had few effective ways to organize or communicate; and the federal government soon became aware of the Klan and worked to suppress it. The Tennessee leaders disbanded the group in 1869. Klan activity persisted, and even increased, after this disbandment, but the disbandment spelled the end of the attempt to centralize the Klan.
The Klan, instead, became an amorphous movement that included a range of clandestine groups in many parts of the South that exploited postwar political, social, and economic disorganization for various ends. Each group had its own composition, goals, and tactics. Some had political goals, such as intimidating Republican voters, politicians, and local government officials. Others hoped to prevent the establishment of schools for freedpeople. Some styled themselves after western lynch mobs and portrayed themselves as protecting the weak and punishing crime and immorality. Some were conventional criminal gangs using a Klan identity to escape detection or punishment for theft, illegal distilling, rape, or other violent and sadistic acts. Others apparently had economic goals, such as driving away freedpeople competing with them as laborers or tenants, terrifying workers into compliance, or forcing tenants to abandon their crops, animals, or improvements. Still others engaged in Klan violence to settle personal disputes involving land use, social status, feuds, or sexual competition.
Perpetrators of Klan violence varied from place to place. Some Klan groups consisted largely of privileged, though temporarily dispossessed, southern elites, who rode horses and wore extravagant costumes. Other, probably most, Klan groups consisted of poor whites. Many Klan groups, for instance, too poor to own horses, committed their attacks while riding mules or going on foot and either did not disguise themselves or simply covered their faces with cheap materials like painted burlap sacks or squirrel skins.
Klan tactics differed as much as did membership and apparent motives. Some Klan groups were largely performative, parading through the streets and leaving cryptic messages about town. Most, however, brought intimidation and/or violence against specific targets. Even when they were pursuing goals that were not primarily political, their victims were almost always Republicans and were usually freedpeople. By targeting these groups, Klansmen frequently gained broad support among local Democratic whites. The most common form that intimidation and violence took was the nighttime visit, in which a group of Klansmen would descend upon the home of their victim and either force their way in or demand that their victim come outside. Klan visits frequently involved property theft from victims, whether the Klansmen were “confiscating” firearms or simply stealing money, food, or household goods. Some Klansmen threatened their targets, requiring that they renounce a political party, leave town, or otherwise change their behavior. Other Klan groups whipped their victims. A number of Klan attacks were sexual in nature: Klansmen raped victims, whipped them while naked, forced them to perform humiliating sexual acts, or castrated them.
Klansmen sometimes killed their victims. Because of the weak and disorganized nature of local government at the time and because of the difficulty in defining which attacks should count as Klan attacks, it is impossible to get reliable numbers on how many people the Klan killed, but the number is at least several hundred. In most cases, Klansmen killed victims execution style, by either shooting or hanging them. Klansmen shot others while they were attempting to escape. Klan groups killed some victims, particularly those who were politically connected, through ambush. Additionally, Klan groups committed some larger collective murders, such as the abduction and killing of ten freedmen in Union County, S.C., in spring 1871.
Freedpeople and white Republicans often attempted, sometimes with success, to prevent or resist Klan threats and violence. Those anticipating attack fled to nearby cities for safety or “laid out,” spending the night out of doors in their fields. Others gathered friends and family, or, in South Carolina, black militiamen, to stand guard for them. At the same time, Republican leaders and local agents of the federal government gathered information about Klan activity and plans and sent it urgently to state and federal officials, in the hope of gaining protection. Klan survivors and witnesses often agreed to testify to state or federal committees, even at grave personal risk. In the face of threatened violence at election time, Republicans tried various strategies, such as approaching the polls in groups. Faced with an attack, some who had managed to arm themselves met approaching Klansmen with gunfire. Unarmed victims sometimes used household implements as weapons. Others attempted to reason or plead with their captors; frequently, they recognized some of their attackers and appealed to them directly for protection.
The federal government, convinced that local and state efforts were ineffective, took several steps to suppress the Klan but could intervene only when Klan violence had a political nature. Congress passed a series of bills popularly referred to as the Enforcement Acts, intended to enforce the voting rights granted to freedmen under the Fifteenth Amendment. The first, passed on 31 May 1870, then strengthened and supplemented by another act passed on 28 February 1871, made it a federal crime for individuals to conspire or wear disguise to deprive citizens of their constitutional rights and set up federal mechanisms for the arrest, prosecution, and trials of accused offenders. The most controversial, popularly called the Ku Klux Force Act, which was passed on 20 April 1871, gave the president the authority to suspend the writ of habeas corpus and to send federal troops to areas incapable of controlling Klan violence, even without the invitation of a governor. It also made punishable by federal law several common forms of political Klan behavior and forbade Klansmen from serving on juries. President Ulysses S. Grant took limited advantage of this legislation, sending small numbers of troops to some of the hardest-hit areas. South Carolina became the focus of federal Klan enforcement: Grant suspended habeas corpus briefly in nine counties, federal marshals and troops made hundreds of arrests, and the federal district court began a high-profile series of trials of accused Klan leaders in fall 1871.
The Ku Klux Klan was significant in federal politics, particularly during the Johnson impeachment and in the federal elections of 1868 and 1872. The Klan first emerged to national notice during the impeachment trials, as Johnson’s opponents attempted to associate him with the Klan. In the election of 1868, supporters of Ulysses S. Grant, the Republican candidate, labeled supporters of Democrat Horatio Seymour as the “Ku-Klux Democracy.” Though the election of 1872 occurred after the Klan’s decline, the Klan was even more central to it than to the election of 1868. Grant’s supporters attempted to tie Horace Greeley, the Liberal Republican and Democratic candidate, to the Klan, claiming that a vote for Greeley was a vote for Klan resurgence. Greeley’s supporters claimed that Grant was using the Klan as a “bugbear” and that Klan suppression was a pretext for unconstitutionally increasing the reach of federal power.
In the months after winning reelection, Grant stopped federal Klan arrests and trials and quietly released those dozens of men who had been committed to federal prison as Klansmen. Besides some interest surrounding the publication of Albion Tourgée’s 1879 Klan-themed novel, A Fool’s Errand, the Ku Klux Klan would not be significant in American social and political life, or even in cultural representation, until its 20th-century revival.
ELAINE FRANTZ PARSONS
Duquesne University
Steven Hahn, A Nation under Our Feet: Black Political Struggles in the Rural South from Slavery to the Great Migration (2003); Kwando Kinshasa, Black Resistance to the Ku Klux Klan in the Wake of the Civil War (2006); Scott Reynolds Nelson, Iron Confederacies: Southern Railways, Klan Violence, and Reconstruction (1999); Mitchell Snay, Fenians, Freedmen, and Southern Whites (2007); Allen Trelease, White Terror: The Ku Klux Klan Conspiracy and Southern Reconstruction (1971); Xi Wang, The Trial of Democracy: Black Suffrage and Northern Republicans, 1860–1910 (1997); Lou Faulkner Williams, The Great South Carolina Ku Klux Klan Trials, 1871–1872 (1996).
The Ku Klux Klan was never more powerful than it was in the 1920s. At that time, it thrived as a nativist and racist organization, championing the rights and superiority of white Protestant Americans. Unlike the first Klan of the Reconstruction era, the second Klan was a nationwide movement. At its height, it boasted 5 million members in 4,000 local chapters across the country, although some historians contend that it never had more than 1.5 million active members at any one time. The Klan’s main appeal was its promise to restore what it deemed traditional values in the face of the transformations of modern society, and it was most popular in communities where it acted in support of moral reform. Klansmen opposed the social and political advancement of blacks, Jews, and Catholics, but they also virulently attacked bootleggers, drinkers, gamblers, adulterers, fornicators, and others who they believed had flouted Protestant moral codes.
Although the Klan was strongest in the Midwest, in states like Indiana and Illinois, where it peddled its slogan of “100% Americanism” to great effect, it was still in many ways a distinctly southern organization. William Simmons, a former Methodist preacher from Alabama, reestablished the Klan in an elaborate ceremony atop Stone Mountain, just outside of Atlanta, Ga., on Thanksgiving Day in 1915. Even as the organization spread across the country, its leadership and base of operations remained in Atlanta. Moreover, Klansmen regularly engaged in rituals and rhetoric that drew upon southern traditions from the Reconstruction Klan and Lost Cause mythology to enact their nationalistic agenda.
Klans throughout the country held public rallies, staged parades, and engaged in various political activities, mostly attempting to influence political leaders to adopt Klan positions. Klansmen tended to be solid, churchgoing middle- and working-class men who were concerned about the loss of traditional white, patriarchal power in the face of urbanization, immigration, black migration, feminism, and the cracking of Victorian morality. As much as they expressed contempt toward those at the bottom of the social ladder, they railed against the excesses of Wall Street and Hollywood, leading one historian to characterize their politics as a kind of “reactionary populism.”
Wanting to present itself as a mainstream movement that stood for law and order, the Klan, as an organization, prohibited and disavowed acts of violence. That did not stop individual Klansmen, in full Klan regalia, from committing numerous acts of terror and violence, especially in southern states. Klansmen whipped and tortured blacks who transgressed Jim Crow racial codes, but they also targeted whites who had violated moral codes. They engaged in threats, beatings, and tar-and-featherings to humiliate their victims. During the 1920s, probably over 1,000 violent assaults took place in Texas and Oklahoma alone, and over 100 assaults each in Florida, Georgia, and Alabama. In 1921, the New York World published a three-week serial exposé on the Klan, highlighting its moneymaking scams, its radical propaganda, and its violence. The articles led to a congressional hearing on the organization, which ended abruptly with no conclusion.
Although the Klan supported traditional gender roles, white women received their own recognition in the formation of the Women’s Ku Klux Klan in 1923. Established as a separate organization, the Women’s Ku Klux Klan bound itself to the Klan’s ideals but remained independent of the men’s organization. As Klanswomen, members marched in parades, organized community events, and recruited new Klan members—primarily children. As it grew in numbers and visibility, the Klan expanded to include its youth. In 1923, the Klan voted to create two auxiliaries, the Junior Ku Klux Klan for adolescent boys and the Tri-K-Klub for teenage girls. The Junior Klan sought to promote the principles of the Ku Klux Klan in preparation for adult male membership. The Tri-K-Klub, under the umbrella of the Women’s Ku Klux Klan, taught girls the ideals the Klan desired in wives and mothers, such as racial purity, cheerfulness, and determination.
In the late 1920s, the Klan’s power began to wane and its membership declined. After the 1921 hearings, mainstream newspapers and the black press increased their reportage of Klan violence, and the NAACP began its own documentation of Klan terror. In addition, a number of prominent Klan leaders were caught in embarrassing scandals, exposing the hypocrisy of the organization. Finally, the Klan’s insistence that its movement was democratic and patriotic began to appear contradictory. For some, the Klan in America began to resemble the rising fascism in Europe, a perception only furthered by the increasing radicalism of Klan leaders. By 1930, the national Klan movement had gradually retreated into the South, where the economic crises of the Great Depression further weakened the organization. In that year, it claimed barely 50,000 members. In 1944, the Internal Revenue Service presented the Second Ku Klux Klan with a bill for $685,000 in unpaid taxes. Unable to pay, the Imperial Wizard on 23 April 1944 revoked the charters and disbanded all Klaverns of the Klan.
KRIS DUROCHER
Morehead State University
AMY LOUISE WOOD
Illinois State University
Charles C. Alexander, The Ku Klux Klan in the Southwest (1965); Kathleen M. Blee, Women of the Klan: Racism and Gender in the 1920s (1991); David Chalmers, Backfire: How the Ku Klux Klan Helped the Civil Rights Movement (2003); Kenneth T. Jackson, The Ku Klux Klan in the City, 1915–1930 (1967); Nancy K. MacLean, Behind the Mask of Chivalry: The Making of the Second Ku Klux Klan (1995); Wyn Craig Wade, The Fiery Cross: The Ku Klux Klan in America (1997).
The Black Mardi Gras Indians are African Americans (some of whom claim American Indian ancestry) who perform a colorful, elaborate, and symbol-laden ritual drama on the streets of New Orleans. Their dynamic street performances feature characters that play specific roles, polyrhythmic percussion and creolized music texts, and artistic suit assemblages that reflect ritual influences from both Indian America and West Africa. With roots stretching back to the 18th century, this unique tradition has given rise to a rich array of customs and artistic forms and continues to testify to the historical affinity of the region’s Indians and African Americans.
New Orleans hosts two Mardi Gras celebrations every year. One is the highly commercialized celebration staged by the aristocratic krewes, with their carnival balls and float parades. The other is the walking and masking festival that includes Baby Dolls, Skeleton Men, and the Mardi Gras Indians. Each celebration features unique traditions that are deeply rooted in a distinctive culture and environment.
The Black Mardi Gras Indians draw upon American Indian, West African, and Caribbean motifs and theatrics to create a unique creolized folk ritual. In Louisiana, these three cultural groups came together during the French and Spanish colonial periods. Indians were African Americans’ first allies in resisting European enslavement and general labor oppression, and they too were often enslaved. The shared experience of bondage led to many intermarriages between West Africans and Indians, yielding a legacy of mixed ancestry that is still evident among many Mardi Gras Indians hundreds of years later. Such ancestry, however, has never been a criterion for joining the Indian “tribes” or “gangs.” Wearing and performing the Indian mask has instead long served as a means of escape and a way to resist and protest the white hegemony that so long defined Jim Crow New Orleans.
No one knows exactly when the Mardi Gras Indian tradition started, but it was first documented in the late 1700s. Its early years were marked by fierce rivalries between African American “tribes” from New Orleans’s Uptown and Downtown districts, with the masked marchers carrying weapons and attacking their “tribal” adversaries. In recent decades, the resolution of these territorial rivalries has shifted from a physical to an aesthetic plane.
In today’s New Orleans, neighborhood tribes—dressed in elaborately beaded and feather-laden costumes—display their dazzling artistry every year on Mardi Gras Day, St. Joseph’s Day (March 19), and Super Sunday (the third Sunday in March). The colorfully costumed Indians parade from house to house and bar to bar, singing call-and-response songs and boasting chants to the exuberant accompaniment of drums, tambourines, and ad hoc instruments. Alongside and behind the procession, “second liners”—relatives, friends, and neighborhood supporters—strut, dance, and sing along. When the group meets an opposing tribe, the street is filled with dancing and general “showing off,” as the costumed participants proudly display their masques, hand signals, and tribal gestures, exhibiting a shared pride in “suiting up as Indian.” This street theater—with its percussive rhythms, creolized song texts, boasting chants, and colorful feather and bead explosions—reflects Indianness through an African-based lens of celebration and ritual.
The most obviously “Indian” element of these performances is the full-body masque worn by tribe members, replete with a feather crown and detailed beadwork that typically depicts Native American themes. Though clearly a tribute to Native American culture, these elaborate costumes also show an overt connection to West African assemblage styles and beading techniques. Most of the patchwork scenes depicting Native Americans are flat and show warriors in battle or other stereotypically “Indian” scenes. Among Downtown tribes, however, the beaded images are more varied and often break into sculptural relief. Rather than conveying Native American themes, they offer Japanese pagodas, aquatic scenes, Egyptian regalia, and whatever other images their creators can imagine and then craft. Constructed anew each year, each suit testifies to countless painstaking hours on the part of the Indian who wears it in Mardi Gras.
The Mardi Gras Indians’ lavish outfits and spirited performances also reveal a distinct social hierarchy within each tribe, with different positions—chief, spy boy, flag boy, and wild man, in descending order of status—presenting themselves differently and filling particular roles in the unfolding performance. Many tribes now also crown their own queens, reprising and reinvigorating a role that was less prominent in the early, more violent years.
The unique tradition of the Mardi Gras Indians has given rise to an array of distinctive artistic forms and shared customs. At the same time, this tradition displays strong ancestral ties to West Africa and testifies to the historical affinity of Indians and Africans, two groups that played leading roles in the creolization of New Orleans. In essence, the Mardi Gras Indians’ ritual performances speak to the need to celebrate life and death in all of their splendor and to address power and enact resistance through masking and dramatic street theater.
JOYCE MARIE JACKSON
Louisiana State University
Joyce Marie Jackson and Fehintola Mosadomi, in Orisha: Yoruba Gods and Spiritual Identity, ed. Toyin Falola (2005); Maurice M. Martinez and James E. Hinton, The Black Indians of New Orleans (film, 1976); Michael P. Smith, Mardi Gras Indians (1994).
Celebrated on the third Monday in January, the Martin Luther King Jr. federal holiday honors the civil rights leader and has special meaning in the South where he was born and where his triumphs and the tragedy of his assassination took place. Michigan congressman John Conyers introduced the legislation to support the holiday shortly after King’s death, but Congress did not pass it until over a decade later, after a national promotional campaign led by Atlanta’s King Center, which overcame opposition led by North Carolina senators John East and Jesse Helms. President Ronald Reagan signed the King Holiday law in 1983. President Bill Clinton signed the King Holiday and Service Act in 1994, which honored King’s legacy by encouraging public service work on his holiday. Some southern states did not officially recognize the holiday as a paid one for state employees at first, and some combined it with commemoration of Confederate heroes, especially Robert E. Lee, whose 19 January birthday was near King’s actual birthday of 15 January. Some southern whites still use the holiday ironically to honor Civil War heroes, but in 2000 South Carolina became the last southern state to officially recognize the King Holiday as a paid state holiday for employees, giving it official legitimacy.
The King Holiday has had spiritual, political, and commercial significance. Matthew Dennis notes that “this holiday reverentially recalls ‘St. Martin Luther King.’” King led a social movement, with profound political impact in the South, but he was a religious figure as well. While leading the Montgomery Bus Boycott in the mid-1950s, he became known as Alabama’s Modern Moses, and his assassination in April 1968, shortly after Palm Sunday, evoked the religious language of martyrdom. King Holiday commemorations often occur in black churches, with homilies, prayers, and religious music making the day one on the South’s sacred calendar.
The holiday has also had political meanings. On 17 January 1998, in the 30th year after King’s death, 50 Indiana Klansmen staged a rally in Memphis, the site of King’s assassination. A crowd of 12,000 black and white civil rights supporters gathered in response, with a few young gang members resorting to violence, tarnishing King’s nonviolent legacy. More important, peaceful rallies occurred throughout the city, affirming King’s contributions. In January 2000, advocates of the Confederate battle flag atop the South Carolina state capitol made the King Holiday a focus of their cause, but counterdemonstrators rallied for the flag’s removal, which state legislators authorized later that year.
The King Holiday is widely honored now, partly as a day of rest and commercialization. King Holiday sales market plenty of American merchandise far removed from civil rights, yet that commercialization also signals the day’s normalization and wide acceptance in typical American fashion. Critics suggest that such commercialization trivializes the holiday’s meaning. The day continues, though, to include projects celebrating social justice, racial reconciliation, tolerance, nonviolence, and the special place of African Americans in the nation’s democratic heritage. Southern churches, community centers, arts centers, public schools, town halls, and other public facilities host these activities, recognizing the centrality of African Americans to the region’s historical memory. On 19 January 2009, the King Holiday included more than 13,000 service projects.
CHARLES REAGAN WILSON
University of Mississippi
Matthew Dennis, Red, White, and Blue Letter Days: An American Calendar (2002).
“An artificial line . . . and yet more unalterable than if nature had made it for it limits the sovereignty of four states, each of whom is tenacious of its particular systems of law as of its soil. It is the boundary of empire.”
Writing his history of the Mason-Dixon Line in 1857, James Veech reflected the well-founded anxieties of the day—the fear that the horizontal fault between slave and free territory was about to become an open breach. Although the Mason-Dixon Line was long associated with the division between free and slave states, slavery existed on both its sides when it was first drawn. To settle a long-standing boundary dispute arising from ambiguous colonial charters, the Calvert and Penn families chose English astronomers Charles Mason and Jeremiah Dixon to survey the territory. After four years of work (1763–67), they fixed the common boundary of Maryland and Pennsylvania at 39°43’17.6” north latitude, marking their line at every fifth mile with stones bearing the arms of the Penn family on one side and the Calvert crest on the other. Halted in their westward survey by the presence of hostile Indians, Mason and Dixon left the line unfinished; their work was concluded in 1784 by a new team, which included David Rittenhouse, Andrew Ellicott, and Benjamin Banneker.
In 1820, the Missouri Compromise temporarily readjusted the fragile tacit balance between slave and free territory and extended the symbolic dividing line westward along the 36°30’ parallel. By that date, all states north of the Mason-Dixon Line had abolished slavery, and the acceptance of the line as the symbolic division, both politically and socially, between North and South was firmly established.
The Mason-Dixon Line has been a source of many idiomatic expressions and popular images. Slogans (“Hang your wash to dry on the Mason-Dixon Line”) originated with early antislavery agitation; variations on the theme (Smith and Wesson line) and novel applications (the logo for a cross-country trucking firm) are contemporary phenomena. A popular shorthand for a sometimes mythic, sometimes very real regional distinction, the term “Mason-Dixon Line” continues to be used, and its meaning is immediately comprehended.
ELIZABETH M. MAKOWSKI
University of Mississippi
Journals of Charles Mason and Jeremiah Dixon (1969); John H. B. Latrobe, History of Mason and Dixon’s Line (1855); James Veech, ed., Mason and Dixon’s Line: A History (1857).
The subject of two award-winning documentary films—At the River I Stand and I Am a Man: From Memphis, a Lesson in Life—at least one play, and several books, the 1968 Memphis Sanitation Workers’ Strike has become emblematic of the universal struggle for dignity and respect by downtrodden people. Its iconic slogan—“I AM a Man!”—has been appropriated by labor struggles in the United States and internationally. And its memory is renewed every year on 4 April—the eve of a planned march in support of the sanitation workers and the date of Dr. Martin Luther King Jr.’s assassination.
In February 1968, nearly 1,300 sanitation workers walked off their jobs in a strike for collective bargaining rights that would ultimately represent a pivotal moment in which the labor movement, the antipoverty movement, and the black freedom movement coalesced. The quest for union recognition could hardly have been more dramatic. African American workers with wages so low that their families qualified for food stamps, with neither sick pay nor disability insurance, whose families lived in the very poorest neighborhoods in the city, confronted a segregationist mayor and city administration determined to deny them union recognition. The garbage men had been attempting to win union recognition since 1960, but their dramatic walkout on 12 February was precipitated by the deaths of two coworkers who were crushed inside a garbage truck while waiting out a rainstorm (an electrical malfunction tripped the mashing mechanism). The striking sanitation workers, who had joined Local 1733 of the American Federation of State, County, and Municipal Employees, risked instant dismissal; federal law did not accord municipal employees the protections it extended to private-sector workers.
What began as a strike transformed into a mass community movement two weeks later after city police officers sprayed the workers and their supporters, including African American ministers, with mace. Even the most well-heeled among them determined that the struggle for dignity and respect was not only for the poorest and most maligned but for all African Americans. Although the gendered slogan may seem relevant only to men, women—especially the factory workers and welfare rights activists who became the backbone of the support movement—saw in it a struggle against not only the racist indignities suffered by the men but also those confronted by black women.
When King arrived on 18 March to address a mass meeting, he was stunned at the turnout of 15,000. “Now, you are doing something else here!” he declared. “You are highlighting the economic issue. You are going beyond purely civil rights to the question of human rights.” For King, the struggle for human rights was about power: “Let it be known everywhere that along with wages and all of the other securities that you are struggling for, you’re also struggling for the right to organize and be recognized. This is the way to gain power—power is the ability to achieve purpose. Power is the ability to effect change.”
That struggle for power, born out of a quest for justice among black workers earning starvation wages and facing racist indignities on a daily basis, continues to have meaning today.
LAURIE B. GREEN
University of Texas at Austin
Laurie B. Green, Battling the Plantation Mentality: Memphis and the Black Freedom Struggle (2007); Martin Luther King Jr., in All Labor Has Dignity, ed. Michael K. Honey (2011).
Migrant workers for agriculture, forestry, and fisheries emerged as distinct social classes during the period following the Civil War, when migrant crews seasonally supplemented the work of sharecroppers, tenant farmers, and debt peons. During the first decades of the 20th century, the demand for migrant workers grew with the increase in fruit and vegetable production along the Eastern Seaboard to supply urban markets, resulting in the development of southern- and Caribbean-based crews of African Americans, Mexican Americans, and Puerto Ricans. African Americans and Puerto Ricans, based primarily in Florida and Puerto Rico, supplied labor to farms, forests, and seafood plants as far north as Maine, while Mexican Americans, based in south Texas, supplied labor across the Midwest and Great Plains. World War II drew many of these migrant workers out of agriculture, forestry, and fisheries and into the defense industry, stimulating the U.S. federal government to develop a class of migrant workers that could supply wartime food needs. By constructing labor camps and creating guest-worker programs to access foreign labor, federal officials assisted with recruiting and transporting migrant labor. Following the war, the U.S. government relinquished control of the migrant labor supply to grower associations and labor contractors.
Southern migrant labor began shifting from primarily domestic to primarily international supply regions during the 1960s and 1970s, creating an underclass of largely undocumented migrant workers from Mexico and Central America, which continues today. Within the migrant labor force, upward mobility is limited to workers who can become labor contractors or supervisors, and the majority remain confined to class positions that provide relatively low annual incomes—30 percent of all farmworkers have family incomes below federally established poverty levels. When undocumented, paid by the piece rather than hourly, and working for labor contractors rather than directly for companies, migrant labor’s relationship to capital has been stripped of worker protections in the form of guaranteed minimum wages, unemployment insurance, and health and safety standards. These conditions lead to high annual labor turnover rates, with 16 percent of all those surveyed in the National Agricultural Worker Survey reporting that they plan to work in agriculture for fewer than two or three years. High labor turnover has also led many southern employers of seasonal workers to embrace guest-worker programs. From 1943 to 1992, Florida sugar producers brought over 8,000 workers from the Caribbean annually, and today mid-Atlantic tobacco growers, forestry companies, and seafood processors utilize several thousand guest workers from Mexico under temporary contracts.
In response to conditions of economic hardship facing migrant workers, several federal programs and networks of advocacy organizations have developed to provide migrant workers with legal services, food and medical assistance, education, job training, and other services. For many years, these organizations acted on behalf of migrant workers in lieu of collective bargaining. In North Carolina, the Farm Labor Organizing Committee, after a prolonged boycott of Mt. Olive Pickles, signed a union agreement with the North Carolina Growers’ Association, while the Coalition of Immokalee Workers forced a piece-rate increase in Florida’s tomato fields by organizing farmworkers and boycotting Taco Bell and other large buyers of Florida tomatoes. Similar collective bargaining successes have not been achieved by migrant forestry or seafood workers, most of whom are temporary foreign guest workers carrying H-2B visas.
Migrant workers have occupied the lowest strata of the southern working class since the end of the Civil War. Every year, as migrant-crew buses arrive throughout the South and as workers crowd into low-quality labor camps and other temporary housing, local newspapers print exposés about conditions in the fields, factories, forests, and labor camps and life on the road. Popular and scholarly books and documentaries, focusing on the plights of migrant workers, often lead to congressional investigations. Presidential commissions on migrant labor have been established to hold public hearings, fund research, and present information about the lives of migrant workers. None of these efforts, unfortunately, have improved the lot of the migrant.
DAVID GRIFFITH
East Carolina University
Pete Daniel, In the Shadow of Slavery: Debt Peonage in the U.S. South (1972); David Griffith, American Guestworkers: Jamaicans and Mexicans in the U.S. Labor Market (2006); David Griffith, Ed Kissam, et al., Working Poor: Farmworkers in the United States (1995); Cindy Hahamovitch, The Fruits of Their Labor: Atlantic Coast Farmworkers and the Making of Migrant Poverty, 1870–1945 (1997); U.S. Department of Labor, National Agricultural Worker Survey, Research Report 9 (2005).
The quad cities of Florence, Tuscumbia, Sheffield, and Muscle Shoals, Ala., known collectively as Muscle Shoals, are located in the state’s northwest corner nearly halfway between Nashville and Memphis. Muscle Shoals assimilated the sounds of country music, blues, and rhythm and blues, becoming influential in the creation and popularity of soul music. “The Shoals” developed as an important center of popular music, becoming the self-titled Hit Recording Capital of the World during the 1970s and fostering musical amalgamation across rigid racial lines in the decades after the end of Jim Crow segregation.
All four cities sit along the Tennessee River. Before the arrival of white settlers in the 1700s, Euchee, Cherokee, Chickasaw, and Shawnee Indian tribes inhabited the area. Some cite the legend of “The Singing River”—the river sounding like a female chorus—as the source of the region’s modern musical prosperity.
Florence was the birthplace and childhood home of W. C. Handy and Sam Phillips. The sounds of secular and sacred music by the area’s black population influenced their careers. The modern music industry of the Shoals began in 1951 when local musician Dexter Johnson opened the first recording studio. Florence businessmen and musicians James Joiner and Kelso Herston produced the area’s first commercial recording: Bobby Denton’s 1956 regional hit “A Fallen Star.” In 1958, partners Rick Hall, Billy Sherrill, and Tom Stafford formed Florence Alabama Musical Enterprises (FAME). Spar Records, their recording studio, attracted Alabama musicians, including Donnie Fritts, Spooner Oldham, David Briggs, Norbert Putnam, Terry Thompson, Earl “Peanut” Montgomery, Jerry Carrigan, and Dan Penn. The partnership dissolved, and Hall retained the rights to FAME, opening his own studio. Working with local white musicians, Hall recorded Sheffield native Arthur Alexander’s R&B track “You Better Move On” in 1961; Alexander, a black teenager, had befriended the aspiring musicians. Dan Penn recalled, “We didn’t really consider Arthur as black.”
The crossroads where black culture and white culture meet is often a musical one. Alexander remembered, “We were trying to bridge the gap. We wanted it all, the country, the R&B, the pop. We only had this one thing in common: we all liked all types of music.” Alexander’s success brought FAME industry recognition and gave Hall the impetus to record Tommy Roe, the Tams, Jimmy Hughes, and Joe Tex. Quin Ivy, a DJ at WLAY in Muscle Shoals, opened Norala studios in Sheffield. Ivy recorded “When a Man Loves a Woman” by Sheffield native Percy Sledge, which reached number one on the R&B and pop charts in 1966.
Atlantic Records producer Jerry Wexler brought Wilson Pickett to FAME in 1966. Upon arriving in Muscle Shoals, Pickett recalled, “I couldn’t believe it. I looked out the plane window, and there’s these [black] people picking cotton.” Pickett’s first sessions, utilizing FAME’s house band, produced “Mustang Sally” and “Land of 1000 Dances.” Aretha Franklin recorded “I Never Loved a Man (The Way I Love You)” at FAME in 1967. This recording, also made with white studio musicians, launched Franklin’s reign as the Queen of Soul. Franklin’s subsequent soul hits utilized Muscle Shoals musical talent, although the recordings took place in New York.
The FAME rhythm section—Jimmy Johnson, Roger Hawkins, David Hood, and Barry Beckett—backed King Curtis, Arthur Conley, Etta James, Clarence Carter, and others. This rhythm section left Hall in 1969, christened themselves the Muscle Shoals Rhythm Section (MSRS), later dubbed “the Swampers,” and opened Muscle Shoals Sound (MSS) in Sheffield. Many R&B artists recorded at MSS. According to Hood, the Staple Singers, who recorded “I’ll Take You There” and “Respect Yourself” at MSS, could not believe that the MSRS musicians were white. In the early 1970s, “The Muscle Shoals Sound,” heard on recordings by black artists, attracted interest from white pop/rock artists such as Paul Simon, Traffic, and Rod Stewart, because these artists thought that the MSRS musicians were black.
During the 1970s, FAME produced recordings for diverse artists, including Candi Staton, the Osmonds, Mac Davis, and Paul Anka. Many country acts, including Shenandoah, scored hits recording at FAME in the 1980s.
The annual W. C. Handy Music Festival began in 1982 and highlighted the region’s diverse music culture. In 1985, Malaco Records bought MSS, producing R&B, gospel, and blues hits from the studio.
Even as violence between whites and blacks often broke out in areas of Alabama during the civil rights struggle, the Shoals remained peaceful. The music produced in the Shoals by black and white musicians remains one of the greatest cultural testaments to racial equality.
CHRISTOPHER M. REALI
University of North Carolina at Chapel Hill
Matt Dobkin, I Never Loved a Man the Way I Love You (2004); Christopher S. Fuqua, Music Fell on Alabama: The Muscle Shoals Sound That Shook the World (2006); Peter Guralnick, Sweet Soul Music: Rhythm and Blues and the Southern Dream of Freedom (1986; rev. ed., 1999); Vron Ware and Les Back, Out of Whiteness (2002); Richard Younger, Get a Shot of Rhythm and Blues: The Arthur Alexander Story (2000).
Although some of the most painful images of the civil rights movement involve police dogs attacking black teenagers in Birmingham and sheriff’s deputies assaulting black marchers in Selma, police brutality is not commonly associated with black life in the South. However, in the postwar migration out of the rural South into the region’s urban centers, African Americans came into contact with the most visible arm of the state: the police. Throughout the South, they confronted repressive police departments that felt threatened by black demands for equality after World War II.
In the urban South, police departments enforced a strict policy of segregation and the ideology of white supremacy that governed black Americans’ relations with whites. While white supremacist organizations such as the Ku Klux Klan and the White Citizens’ Council used racial violence to maintain white control in rural areas, in the cities it was the local police department, with the support of politicians, segregationists, district attorneys, and judges, that carried out this form of domestic terrorism. Consequently, the term “police brutality” was all-encompassing to African Americans. It included police homicides; unlawful arrests; assaults; threatening and abusive language; the use of racial slurs; sexual exploitation of black women; the beating of prisoners in police custody; racial profiling; police complicity in drug dealing, prostitution, burglaries, protection schemes, and gun smuggling; and the lack of justice available to black defendants in the courts.
A cursory examination of black newspapers reveals, almost on a weekly basis, graphic descriptions of police brutality and misconduct. Likewise, the archives of local and national civil rights organizations are filled with thousands of affidavits and letters relaying first-person experiences of police brutality. But African Americans in the urban South did not quietly tolerate wanton police misconduct. They utilized a variety of tactics in their response: sit-ins, boycotts, picketing, close supervision of police activity, armed confrontations, and, at times, killing and assaulting police officers. Further, they demanded major police reforms, including more black officers, more black officers in management positions, integrated patrols, a civilian review board, only-black-police-in-black-neighborhoods policies, and federal intervention.
Ironically, police brutality actually served as a tool to unify the community, since it was an issue that transcended class divisions, and in many cities an incident of police brutality was the catalyst for larger civil rights protest.
LEONARD N. MOORE
University of Texas at Austin
Glenn Eskew, But for Birmingham: The Local and National Movements in the Civil Rights Struggle (1997); Laurie Green, Battling the Plantation Mentality: Memphis and the Black Freedom Struggle (2007); Leonard Moore, Black Rage in New Orleans: Police Brutality and African American Activism from World War II to Hurricane Katrina (2010).
JIMMIE RODGERS (1897–1933) COUNTRY MUSIC SINGER.
Generally acknowledged as the “father of country music,” James Charles “Jimmie” Rodgers, who was born on 8 September 1897 in Meridian, Miss., was a major influence on the emerging “hillbilly” recording industry almost from the time of his first records in 1927. Although Rodgers initially conceived of himself in broader terms, singing Tin Pan Alley hits and popular standards, his intrinsic musical talent was deeply rooted in the rural southern environment out of which he came, as seen in the titles of many of his songs: “My Carolina Sunshine Girl,” “My Little Old Home Down in New Orleans,” “Dear Old Sunny South by the Sea,” “Mississippi River Blues,” “Peach Pickin’ Time Down in Georgia,” “Memphis Yodel,” “In the Hills of Tennessee,” the original “Blue Yodel” (“T for Texas”), and others.
In adapting the black country blues of his native South to the nascent patterns of commercial hillbilly music of the day, Rodgers created a unique new form—the famous “blue yodel”—which led the way to further innovations in style and subject matter and exerted a lasting influence on country music as both art form and industry. Through the force of his magnetic personality and showmanship, Rodgers almost single-handedly established the role of the singing star, influencing such later performers as Gene Autry, Hank Williams, Ernest Tubb, George Jones, and Willie Nelson.
The son of a track foreman for the Mobile & Ohio Railroad, Rodgers in his twenties worked as a brakeman for many railroads in the South and West. Stricken by tuberculosis in 1924, he left the rails soon after to pursue his childhood dream of becoming a professional entertainer. After several years of hard knocks and failure, he gained an audition with Ralph Peer, an independent producer who had set up a temporary recording studio in Bristol, Tenn., for the Victor Talking Machine Company (later RCA Victor). There, on 4 August 1927, Rodgers made his first recordings. Within a year, he reached national popularity and received billing as “The Singing Brakeman” and “America’s Blue Yodeler.” In 1929, he built a home in the resort town of Kerrville, Tex., and moved there in an effort to restore his failing health. The onset of the Depression and increasing illness further slowed the progress of his career, but throughout the early 1930s he continued to record and perform with touring stage shows. By the time of his death in New York City at age 35 in May 1933, he had recorded 110 titles, representing a diverse repertoire that included almost every type of song now identified with country music: love ballads, honky-tonk tunes, railroad and hobo songs, cowboy songs, novelty numbers, and the series of 13 blue yodels. In November 1961, Rodgers became the first performer elected to Nashville’s Country Music Hall of Fame, immortalized as “the man who started it all.” The Jimmie Rodgers Memorial Museum in Meridian, Miss., hosts the annual Jimmie Rodgers Memorial Festival, which began in May 1953.
NOLAN PORTERFIELD
Cape Girardeau, Missouri
Bill C. Malone, Country Music, U.S.A.: A Fifty-Year History (1968); Nolan Porterfield, Jimmie Rodgers: The Life and Times of America’s Blue Yodeler (1979); Mrs. Jimmie Rodgers, My Husband, Jimmie Rodgers (1975).
Nothing made more clear the lie of segregation in the late 19th-century South than train travel. Now that southern people were moving—taking the train from small towns to other towns and cities, walking even from a farm to a crossing, flagging down the engine, and ending up far away—strangers became more common even in the smallest places. Trains took traveling pockets of anonymous urban social relations wherever the tracks went. Traveling forced passengers to deal with a world in which other people were not known, in which their identity could only be determined from their outward appearance, and in which lines of division and order, other than who could afford first class and who had to ride coach, became hopelessly confused.
From the 1880s to the 1950s, in magazine articles, essays, court cases, and novels, southerners referred again and again to this figure, the middle-class black, made visible through clothing, educated speech, and often a lightness of skin color, and made increasingly visible too beyond their numbers by this new ability to travel. In fact, the 1896 Supreme Court decision that upheld the constitutionality of segregation and made “separate-but-equal” the law of the land, Plessy v. Ferguson, turned on a man who perfectly embodied this confusion, Homer Adolph Plessy. Light-skinned and racially mixed, Plessy made a planned challenge to Louisiana’s 1890 law requiring segregated railroad cars. His lawyer, Albion Tourgée, a northern white Reconstruction official and popular novelist, argued that the government did not have the right to determine the racial identities of its citizens.
Who but Plessy himself should say where the almost-white Plessy belonged? The Court, of course, disagreed, reasoning that racial differences lay before and outside the law, in human nature itself. Plessy would have to be placed on one side or the other, would have to be either black or white. The Plessy decision fully denied what African American writer Albert Murray later called the “incontestable mulatto” nature of American culture and set this lie at the very center of modern society. The Court simply added its voice to the increasingly racialist and white supremacist thinking that permeated late 19th-century American society, an attempt in part to ground Plessy-like people’s mutable identities in a concreteness of blood, bodies, science, and the law.
The middle-class, racially ambiguous person on the train made visible white southerners’ fear of making a mistake in identifying strangers. In 1889, a Tennessee newspaper turned this anxiety into humor. When “a bright and good-looking colored girl (or rather an almost white colored girl)” got on board a train in Nashville, a “flashily dressed white gentleman,” usually known as the “car masher,” began a flirtation. Wooing his “lady friend” with lunch and witty conversation, he did not realize his mistake until after she got off the train, and the other ladies still aboard laughed at him. The joke (and the incident would not have been at all funny to whites if the genders had been reversed) served as a warning about the dangers inherent in the first-class train car’s world of anonymous yet intimate social relations. For whites, the middle-class African American, now able and willing to travel—not the ragged riders in second-class cars—made segregation a necessity.
For African Americans, however, people who identified themselves or were identified by others as black, nothing demonstrated the lie of segregation’s premise of absolute racial difference, of white supremacy and black inferiority, like the figure on the train. Activists and writers Mary Church Terrell and Anna Julia Cooper described their own encounters with travel in the South, what Cooper called America’s “out-of-the-way jungles of barbarism,” where young black girls and dignified colored ladies were routinely ejected from first-class cars by tobacco-stained, stinking white men in the years before Plessy. In 1885, white southern writer George Washington Cable took the figure to Century Magazine, describing a middle-class mother and child trapped in a car with chained convicts in his passionate plea for African American civil rights. In his 1901 novel The Marrow of Tradition, Charles Chesnutt brought his readers along on a train ride from New York to North Carolina, exploring the experience of being “branded and tagged and set apart from the rest of mankind upon the public highways like an unclean thing.”
Writer, intellectual, and activist W. E. B. Du Bois referred often to the figure on the train and to his own travels: “I am in the hot, crowded, and dirty Jim Crow car where I belong. . . . I am not comfortable.” But he also went further, shaping the figure into an image that defined racial identity even as it pointed to the very impossibility of any “natural” racial categories. Asked how blacks could be both superior and the salvation of humanity if race was unreal, segregation a lie, he answered: “I recognize it [racial identity] easily and with full legal sanction: the black man is a person who must ride ‘Jim Crow’ in Georgia.” For African Americans and a few dissident whites, the middle-class person on the train made it clear that segregation created the very racial categories it was supposedly enacted to uphold.
GRACE ELIZABETH HALE
University of Virginia
Edward L. Ayers, The Promise of the New South: Life after Reconstruction (1992); Grace Elizabeth Hale, Making Whiteness: The Culture of Segregation in the South, 1890–1940 (1998); David Levering Lewis, ed., W. E. B. Du Bois: A Reader (1995); Eric Sundquist, To Wake the Nations: Race in the Making of American Literature (1993).
The cultural division between R&B and pop music manifested itself most directly in the association of soul music with the Black Power Movement in the late 1960s. While the term “soul” always invoked connotations of black power and cultural pride, musical developments in the later 1960s helped to bond the Black Power Movement and black popular music more closely. The 1967 release of “Respect” by Aretha Franklin (originally written and recorded by Otis Redding) and “Say It Loud (I’m Black and I’m Proud)” by James Brown in 1968 signaled to the American public a heightened sense of political and social awareness in black popular music and among some of its most popular purveyors. The term “soul,” then, came to represent a musical style, and, further, for many black Americans soul embodied an African mystique that became increasingly widespread as the 1960s progressed.
The modern elements of soul music coalesced in the mid-1950s, yet the fundamentals of this black American popular music style are traceable to the antebellum American South and West African traditions. The term “soul” often refers to a style and subgenre of secular music commonly associated with black Americans, particularly from the South. Musicians and audiences had used the word in connection with gospel music since the late 1920s. The term gained wider recognition for its musical associations when a famous black gospel quartet, the Soul Stirrers, used it in their name. The crossover appeal of the group’s lead singer, Sam Cooke, revealed a musical and aesthetic connection between the group and the genesis of the soul music style of the 1960s.
A shift away from integrationist rhetoric and the growing popularity of the Nation of Islam and Malcolm X helped give rise to the Black Arts Movement and its efforts to create an Afrocentric aesthetic. Starting with inner-city uprisings in 1964, “soul” became a household word in black communities. Black businessmen posted the phrase “soul brother” in store windows to prevent destruction and looting. In the mid-1960s, black DJs identified their stations as “soul radio.” As Portia Maultsby noted, “Soul became associated with all forms of black cultural production.”
The mass media and music trade publications embraced the term around 1967. Time magazine and the Saturday Evening Post featured stories about soul music, which brought the term to a broader audience while revealing its cultural association with black America. Time featured Franklin on the cover in June 1968. The article quoted LeRoi Jones as saying, “Soul music is music coming out of the black spirit.” The 1969 Post article stated, “Across the country, ‘soul’ has become synonymous with ‘black’—as in ‘soul brother.’”
In June 1967, Billboard magazine issued its first annual feature titled “The World of Soul,” documenting “the impact of blues and R&B upon our musical culture.” On 23 August 1969, Billboard replaced the term “rhythm and blues” with “soul” to chart sales of music recorded by black artists. Writing in 1983, Maultsby commented that Billboard’s decision was “motivated by the fact that the term soul more properly embraces the broad range of song and instrumental material which derives from the musical genius of the black American.” Billboard used the term until 1982, when “black music” replaced it.
Authors contemporaneous with soul music’s national ascent reinforced the association between soul music and black America. In 1969, Phyl Garland wrote, “Soul music in all of its forms is the aesthetic property of a race of people who were brought to this country against their will.” Arnold Shaw, writing in 1971, stated, “Soul is black, not blue, sass, anger, and rage. Soul is black nationalism in pop.”
The fact that soul’s three main production centers—Stax (Memphis), FAME (Muscle Shoals, Ala.), and Motown (Detroit)—utilized racially integrated groups of musicians in the creation of the music seems to contradict the idea that soul music expressed the views of black nationalism. Unaware of the accompanying musicians’ racial diversity, the listening public and those in the media directly associated the soul singer, who was black, with the song and its lyrics. White vocalists employing techniques similar to those used by black soul artists in the 1960s were categorized as “blue-eyed soul.” These artists included the Righteous Brothers, Bobbie Gentry, Tom Jones, Dusty Springfield, and others.
Soul Train, a weekly television program, began broadcasting nationwide in October 1971, giving many black soul artists broad exposure. During the 1970s, several major concert events, both nationally and internationally, furthered the globalization of soul music. In March 1971, many black American musicians performed at the Soul to Soul concert in Accra, Ghana, which directly linked African American and African culture. Stax Records produced Wattstax in 1972 as a “community-based” event commemorating the anniversary of the Watts riots. James Brown, B. B. King, Bill Withers, and others performed at the Zaire ’74 festival. Film crews documented the concerts, and the resulting movies were released publicly.
From the early 1960s through the early 1970s, independent labels such as Atlantic, Stax, Motown, King, and others led the soul music market. With the broadening appeal of soul music in the mid-1970s, major labels signed black artists as an attempt to cross over to a white market. According to Nelson George, “In the [crossover] process, much of what made the R&B world work was lost, perhaps some of it forever.” By the end of the 1970s, soul artists from the 1960s had disappeared from the charts, and disco replaced the soul aesthetic.
CHRISTOPHER M. REALI
University of North Carolina at Chapel Hill
Stanley Booth, Saturday Evening Post, 8 February 1969; Mellonee V. Burnim and Portia Maultsby, African American Music (2006); Samuel A. Floyd, The Power of Black Music: Interpreting Its History from Africa to the United States (1995); Phyl Garland, The Sound of Soul (1969); Nelson George, The Death of Rhythm and Blues (1991); Portia K. Maultsby, Journal of Popular Culture (Fall 1983); Arnold Shaw, The World of Soul (1971).
Founded in 1944, the Southern Regional Council (SRC) sought to ameliorate racial injustice in the American South through expanded dialogue and cooperation between the races. Unlike the NAACP, the SRC did not pursue litigation aimed at dismantling segregation, nor did it organize grassroots protest in favor of civil rights. Rather, it concentrated on bringing white leaders and black leaders together across the South to address issues that could reduce racial injustice without fundamentally challenging the segregated social structure of southern society. Such an approach hinged less on “human relations,” noted Leslie Dunbar, than on “human resources,” meaning that the SRC focused less on explicitly racial issues like integration than it did on structural issues contributing to racial injustice, issues like jobs, social services, and education.
Because of its early focus on leadership elites, the SRC began as a distinctly upper-class institution, aimed at attracting predominantly white members of “commanding regional stature and authority.” However, the organization lost a sizable segment of its membership in 1951, when it took a public stance against racial segregation. From that point onward, the SRC assumed a slightly less elite, but decidedly more inclusive, academic, and arguably middle-class status, including many more black members than originally envisioned. For much of the 1940s and 1950s, the SRC became the voice of many southern liberals, particularly white liberals who sought to improve race relations in the region without enlisting the help of the federal government.
Foremost among the SRC’s declared goals were research, information, and interracial dialogue, a project that the council pursued by sponsoring a series of studies on the South, including George McMillan’s Racial Violence and Law Enforcement (1960), Howard Zinn’s Albany: A Study in National Responsibility, and James McBride Dabbs’s Who Speaks for the South? (1964). In 1962, the SRC mounted the massive Voter Education Project, aimed at facilitating the registration of black voters across the South—a project that dovetailed nicely with voter registration campaigns mounted by more grassroots civil rights organizations like the Student Nonviolent Coordinating Committee in places like the Mississippi Delta in 1964 and Selma, Ala., in 1965.
ANDERS WALKER
Saint Louis University
Leslie W. Dunbar, Annals of the American Academy of Political and Social Science (January 1965).
During the civil rights years of the 1950s and 1960s, a number of administrative commissions and legislative committees were created by southern state governments to resist the civil rights crusades in the region. Though their names varied, these official agencies were bound together by common interests and purposes—defending the region’s cherished “segregated southern way of life,” devising both legal and extralegal means to circumvent the U.S. Supreme Court’s desegregation rulings, propagandizing in vindication of states’ rights ideology and racial separation, and silencing any dissenters and deviators from the South’s racial norms.
In the early 1950s, reflective of the Dixiecrat revolt during the 1948 presidential election and in anticipation of the Supreme Court’s impending school desegregation ruling, some Deep South states began to create legislative committees to protect their public schools from racial integration. The first such strategy-mapping segregationist committee was created by South Carolina in April 1951 when the state legislature established the South Carolina School Committee, or the Gressette Committee—named after its chair, state senator L. Marion Gressette. Georgia then followed its neighboring state to organize the Georgia Commission on Education in December 1953.
Meanwhile, during the 1952 regular session, the Mississippi state legislature appointed a study committee officially called the Mississippi Legislative Recess Education Committee. The committee’s sole responsibility was to equalize the physical standards for black pupils with those for whites in the state’s elementary and secondary educational systems in the hope that this equalization movement would influence the Supreme Court’s expected school desegregation decision and would, if possible, circumvent any federal court rulings unfavorable to the continuation of racial segregation in Mississippi’s public schools. While reorganizing this study committee as the new State Education Finance Commission, charged with supervising the construction of new public schools for black pupils and the consolidation of school districts in the state, the Mississippi legislature established the Legal Educational Advisory Committee in April 1954. In the name of preserving and promoting the best interests of both black and white Mississippians, the advisory committee, in substance, was vested with authority to draft segregationist laws to maintain racially separate schools in the state. The Legal Educational Advisory Committee would soon be converted into a tax-supported “permanent authority for maintenance of racial segregation” in Mississippi.
In the immediate aftermath of the Supreme Court’s Brown v. Board of Education ruling in May 1954, Louisiana joined other Deep South states by creating the Louisiana Joint Legislative Committee on Segregation, better known as the Rainach Committee—after an all-powerful state senator, Willie M. Rainach. A year later, the Supreme Court announced the implementation order of Brown, propelling southern states to organize “massive resistance”—an all-out resistance movement to what they termed “judicial tyranny” and to the ever-intensifying civil rights movement in the region. An overwhelming mood of defiance to the federal government dominated southern states’ legislative sessions in 1956. Virginia, Alabama, South Carolina, and Mississippi adopted the so-called interposition resolutions by the end of February, in which these states’ legislatures expressed their strongest determination to defend the South against the “illegal encroachment” of the federal government.
In Mississippi, where defiance of the federal government was at its most extreme, lawmakers inspired by the issuance of the state’s own interposition resolution turned to creating a tax-supported agency to implement the resolves expressed in that resolution. On 29 March 1956, with the blessing of Gov. James P. Coleman, Mississippi—the citadel of racial injustice in the South—created the Mississippi State Sovereignty Commission as part of the executive branch of its government, “to do and perform any and all acts . . . to protect the sovereignty of the State of Mississippi . . . from encroachment thereon by the Federal Government.” Though the State Sovereignty Commission in Mississippi was soon to be identified as the state’s “segregation watchdog agency,” neither the word “segregation” nor the word “integration” appeared in the carefully crafted bill that created the new agency. To be sure, however, federal “encroachment” was a periphrasis implying “forced racial integration,” and “to protect the sovereignty” of Mississippi from that “encroachment” was a sophisticated roundabout expression of the state’s resolve “to preserve and protect racial segregation” in Mississippi.
Mississippi thus became the very first southern state to reinstate the word “sovereignty” of states’ rights ideology in naming its anti-Brown and anti–civil rights state agency. With the aura of sophistication and respectability emanating from the word, the State Sovereignty Commission, for all practical purposes, was expected to maintain segregation at all costs and to wreck the NAACP and other civil rights organizations in both Mississippi and her southern sister states. From its inception in 1956 to its practical demise in the late 1960s, the Mississippi State Sovereignty Commission, maintaining both public relations and investigative departments, was the most emphatic prosegregation and pro–states’ rights governmental agency in the South. The Mississippi commission’s heyday came under the chairmanship of Gov. Ross R. Barnett.
Though the September 1957 school desegregation crisis in Little Rock, Ark., brought wretched consequences to southern segregationists, “massive resistance” attained its height in its aftermath. Virginia created the Virginia Commission on Constitutional Government in 1958 as an official vehicle to carry on the state’s “respectable” resistance to the civil rights movement. Then in June 1960, Louisiana established its own state sovereignty commission and the Louisiana Joint Legislative Committee on Un-American Activities. While the grand missions of the Louisiana State Sovereignty Commission were to paint the state’s race relations in a rosy color and to alert the rest of the nation to the gradual encroachment on states’ rights by the centralized federal government, the Un-American Activities Committee, as its name indicates, took up a broadly defined “subversive hunt” in Louisiana. After all, the civil rights leaders, activists, and their sympathizers in the South could all be categorized as “subversives” in the sense that they willfully defied the region’s white establishment and its long-cherished “segregated way of life.”
In September 1962, the entire South witnessed the first dramatic and physical confrontation between a Deep South state and the federal government over the University of Mississippi desegregation crisis. Soon thereafter, at the behest of Gov. George C. Wallace, Alabama belatedly organized its state sovereignty commission as well as the Alabama Legislative Commission to Preserve the Peace, in 1963. As in the case of Louisiana, the Alabama State Sovereignty Commission devoted its time and energy mainly to public relations schemes, and the “Peace Commission” spied on civil rights activists.
Abominable racial incidents and the irresponsible actions of die-hard segregationists during the first half of the 1960s, together with the passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965, dealt fatal blows to the white South’s resistance movement, robbing “massive resistance” of both its legality and its respectability. Having virtually outlived its usefulness in defending the state’s racial status quo by 1968, the Mississippi State Sovereignty Commission, in its dying days, spent its resources on investigating anti–Vietnam War demonstrators, black nationalists, and campus radicals in the state, reflecting the transformation of the nation’s political and social trends in the late 1960s. For the purpose of cracking down on these “new subversives,” the state sovereignty commissions of Mississippi, Louisiana, and Alabama formed the Interstate Sovereignty Association in May 1968, but this cooperative scheme did not enjoy any longevity.
In June 1969, Louisiana’s state sovereignty commission and its Un-American Activities Committee were terminated. Outliving their Louisiana counterparts, the Mississippi State Sovereignty Commission and the Alabama State Sovereignty Commission faded away in 1973. By the time its death knell rang, Mississippi’s “segregation watchdog agency” had spent more than $1,542,000 to “protect the sovereignty” of the state.
YASUHIRO KATAGIRI
Tokai University (Japan)
Numan V. Bartley, The Rise of Massive Resistance: Race and Politics in the South during the 1950s (1969); Dan T. Carter, The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics (1995); Adam Fairclough, Race and Democracy: The Civil Rights Struggle in Louisiana, 1915–1972 (1995); Erle Johnston, Mississippi’s Defiant Years, 1953–1973: An Interpretive Documentary with Personal Experiences (1990); Yasuhiro Katagiri, The Mississippi State Sovereignty Commission: Civil Rights and States’ Rights (2001); Steven F. Lawson, in An Uncertain Tradition: Constitutionalism and the History of the South, ed. Kermit L. Hall and James W. Ely Jr. (1989); Neil R. McMillen, The Citizens’ Council: Organized Resistance to the Second Reconstruction, 1954–64 (1971); Jeff Roche, Restructured Resistance: The Sibley Commission and the Politics of Desegregation in Georgia (1995); William M. Stowe Jr., “Willie Rainach and the Defense of Segregation in Louisiana, 1954–1959” (Ph.D. dissertation, Southern Methodist University, 1989).
From a meager and tentative start in 1959 as Satellite Records, by the mid-1960s Stax had become the record label most responsible for defining the commercially successful and widely influential “Memphis Sound.” Memphis, as one of the most significant cultural crossroads in the South, had supported a number of vibrant live music scenes throughout the earlier part of the 20th century, and the city’s hybrid sounds had received their first major national exposure when artists initially recorded by Sam Phillips left Memphis for greener pastures. But it was Stax and its alumnus Chips Moman and his American Sound Studio that sent Memphis-recorded music to the top of the national pop charts. Stax, with its gritty, muscular “southern soul” or “deep soul” sound, provided an uncompromising rootsy counterpoint to northern urban black pop crossover attempts by Motown and soul crooners such as Sam Cooke and Ben E. King. Its sound and success also shaped the trajectory of Muscle Shoals–area recorded music, an equally successful and influential southern sound. Booker T. & the MG’s, Sam and Dave, Otis Redding, and Wilson Pickett became and have remained household names, and Stax’s 1960s sound was famously commemorated for later generations in the 1980 movie The Blues Brothers and its sequel Blues Brothers 2000.
Stax struggled to stay afloat in racially torn Memphis after Martin Luther King Jr.’s assassination and finally folded in 1976, but not before hitting a second stylistic and commercial peak with psychedelic soul and funk. The premier exponent of that sound, Isaac Hayes, won multiple Grammys and an Oscar in 1972 for his score for the classic blaxploitation movie Shaft and set the tone for black popular music for at least the next decade.
In 1957, Jim Stewart, a banker and erstwhile country fiddler, started the Satellite record label in his wife’s uncle’s garage in Memphis. After convincing his elder sister, Estelle Axton, to mortgage her house and invest in the recording venture, he bought an Ampex monaural recorder and moved the operation to a deserted grocery store in Brunswick, 30 miles east of Memphis. Finding little talent in the small community, Satellite rented the closed Capitol movie theater at 924 E. McLemore Ave. in Memphis and rechristened it Soulsville U.S.A. White guitarist, songwriter, and producer Lincoln Wayne “Chips” Moman, from LaGrange, Ga., helped secure that deal and proved integral to the fledgling operation before leaving to work at FAME studios in Muscle Shoals and later at his own American Studios back in Memphis.
Satellite had no particular focus on black music until the company’s first break came when WDIA DJ Rufus Thomas recorded “’Cause I Love You” with his daughter Carla. The regional popularity of the record led New York’s Atlantic Records head Jerry Wexler to offer Satellite a purportedly distribution-only deal. Carla Thomas next scored the studio’s first national hit with her self-composed “Gee Whiz.” Wexler signed the teenaged Thomas to his own Atlantic label, on which he also released the song, thus starting an arrangement through which important Atlantic-signed artists, many from the North, would record at Stax and later also at Muscle Shoals–area studios and at Chips Moman’s Memphis-based American Sound Studio. Such arrangements became a significant route for southern sounds to enter the popular mainstream.
Satellite’s next hit came in July 1961 with the Mar-Keys’ instrumental “Last Night.” Released initially on Satellite, the record shot to number two on the national pop charts. Threats of litigation by another company with the same name prompted the renaming, to Stax, a title that borrowed the initial letters from Stewart and Axton’s last names. Controversy surrounds the details of which musicians were featured on the spliced-together record, but the band that toured in the wake of its popularity was all white and featured seven members, two of whom, guitarist Steve Cropper and bassist Donald “Duck” Dunn, would soon constitute half of the most famous studio house band in American popular music, Booker T. & the MG’s. Booker T. Jones and Al Jackson Jr., organist and drummer, respectively, were the earliest African American musicians to play sessions at Stax, and they formed the other half of that legendary combo. The Mar-Keys’ Wayne Jackson partnered with Andrew Love to form the Memphis Horns. Over the next six years, both in the studio and on the road, Booker T. & the MG’s and the Memphis Horns—and often the reconstituted Mar-Keys and the Bar-Kays—backed numerous Stax and Atlantic acts that featured on regional and national pop charts with increasing frequency. Among these were Carla Thomas, Rufus Thomas, William Bell, Otis Redding, Wilson Pickett, Sam and Dave, Eddie Floyd, Johnnie Taylor, and Albert King. Additionally, the exceptional success of the MG’s as an individual recording act inspired many studio backup bands, including the Bar-Kays, War, and MFSB (Mother Father Sister Brother), to launch individual careers and encouraged studio musicians to form “supergroups,” such as Stuff and Fourplay. Keyboardist, composer, and arranger Isaac Hayes and lyricist David Porter constituted a major songwriting and production team at Stax during this period, especially for Sam and Dave.
The year 1968 proved to be a landmark in Stax’s history. In February, Otis Redding’s “Sittin’ on the Dock of the Bay,” overdubbed and released posthumously after Redding’s plane crash in December 1967, became Stax’s first pop chart topper. The racial tensions resulting from the April assassination of Martin Luther King Jr. in Memphis and Atlantic Records’ decision to end its partnership with Stax, taking with it all the recorded masters, sent the company into a tailspin. Stax was eventually sold for over 2 million dollars to the Paramount Pictures subsidiary of Gulf & Western. Estelle Axton exited the picture, and African American executive vice president Al Bell, who had been hired in 1965, took increasing charge of the reins.
Bell and Stewart borrowed money from Deutsche Grammophon to buy back Stax in 1970 and negotiated a distribution deal with Columbia. While the 1964 to 1967 period is remembered as Stax’s creative zenith, Isaac Hayes and the Staple Singers found artistic and exceptional commercial success with their updated sounds during the post-1968 period. It must be noted, however, that Stax often sent its artists down to Muscle Shoals to record—among others, the Staple Singers’ hit “I’ll Take You There” was recorded at the Muscle Shoals Sound studio for Stax. Other successful Stax acts from this period include Mel and Tim, Little Milton, the Soul Children, the Emotions, the reconstituted Bar-Kays, the Dramatics, Shirley Brown, and gospel singer Rance Allen. In 1972, Al Bell bought Stewart’s share in the company; Stewart continued as chief executive, however. Under Bell, Stax attempted to diversify, investing in a Broadway play, signing black comedian Richard Pryor to the new Partee subsidiary, and even recording an album by Rev. Jesse Jackson on the Respect subsidiary label. One of the company’s most ambitious projects was August 1972’s WattStax concert at the Los Angeles Memorial Coliseum during the Watts Summer Festival; the multiagency benefit event featured all the main artists on Stax’s roster, drew an audience of over 100,000, and prompted comparisons to Woodstock.
Al Bell and Johnny Baylor, a black New York record executive Bell had hired in 1968, had drastically different and not always aboveboard approaches to running the now multimillion-dollar empire, a contrast with the amicable family-business environment of the early years. Even through increasing legal problems and multiple federal investigations beginning in 1973, the label continued recording, albeit with decreasing commercial success; only the Staple Singers charted in the Top 20 in its last three years. Based on petitions filed by creditors, a bankruptcy court shut down the Stax operation in January 1976. Although an environment of rapidly changing musical tastes precluded any subsequent historic landmarks for its remaining alumni—except perhaps for actor Richard Pryor, Steve Cropper, and, later, Isaac Hayes—Stax’s place in American popular music had long been secured.
AJAY KALRA
University of Texas at Austin
Rob Bowman, Soulsville U.S.A.: The Story of Stax Records (1997); James Dickerson, Goin’ Back to Memphis: A Century of Blues, Rock ’n’ Roll, and Glorious Soul (1996); Peter Guralnick, Sweet Soul Music: Rhythm and Blues and the Southern Dream of Freedom (1986; rev. ed., 1999).
(1941–1955) CIVIL RIGHTS MARTYR.
Emmett Till’s brutal murder and the heinous acquittal of his assailants ignited the civil rights movement in America. In mid-August 1955, 14-year-old Till, called “Bo,” traveled from his home in Chicago to the Mississippi Delta to visit relatives. Goaded by cousins to address a white woman flirtatiously, on 24 August Till went inside Bryant’s Grocery and Meat Market in Money, Miss., to buy bubble gum. There, allegedly, he wolf whistled at 21-year-old Carolyn Bryant, the owner’s wife, who accused him of indecent advances. On the evening of 28 August 1955, Roy Bryant, with his half-brother J. W. Milam and possibly others, kidnapped Till from his great-uncle Mose Wright’s house, savagely pistol-whipped him, gouged out one of his eyes, ripped his tongue from his mouth, knocked the back of his head off, and then threw his body into the Tallahatchie River, where he was found three days later, on 31 August, with a 70-pound cotton-gin fan fastened around his neck with barbed wire.
On 10 September, after Till’s mother, Mamie Bradley Till, insisted that her son’s body be returned to Chicago, she made the historic and brave decision to show what had happened to her son by leaving his coffin open. Photos of Till’s mangled face appeared first in the Chicago Defender, then nationally in Jet magazine, and finally around the world, outraging readers, who demanded justice for this horrific crime. But the trial of Till’s murderers, 19–23 September, presided over by Judge Curtis Swango, only added to the injustice. The jury of 12 white men, some of whom had contributed to a fund for the defense, found Bryant and Milam not guilty after only 67 minutes of deliberation. Till’s murder was one of the very early civil rights atrocities to win national media attention. It was also the first time a black man—Mose Wright—testified against a white person in Mississippi. Moreover, by unquestioningly identifying her son’s mutilated, rotting corpse, Mamie Bradley Till challenged the defense’s claim that since the body was unrecognizable there was no corpus delicti to prove Bryant and Milam’s guilt.
Exactly what Till said to Carolyn Bryant may never be known, but Mamie Till declared in court and in speeches over the next 35 years that she had told her son to blow out a whistling sound to stop his stuttering. In January 1956, William Bradford Huie published an interview with Bryant and Milam in Look magazine in which they shockingly admitted that they had murdered Till and reveled in how they had punished the alleged sexual aggressor. An investigation by the FBI would have been the only way to indict the two on federal kidnapping charges, but President Eisenhower, wary of offending southern states, and FBI director J. Edgar Hoover refused to act. However, in November 2004, the Justice Department reopened the case and then turned it over to the district attorney of Leflore County, Miss.
Till’s murder, the photo of his mangled corpse, and the trial are iconic images and events in civil rights history. Symbolically, Till was immortalized as a Christlike sacrificial lamb slaughtered for his people’s freedom. Politically, the NAACP characterized his murder as a lynching and demanded justice, as did notable figures around the world, including Eleanor Roosevelt. Claiming that she was inspired by Till’s murder, Rosa Parks refused to move to the back of a Montgomery, Ala., bus just three months after the trial. The Montgomery Bus Boycott was thus directly tied to the outrage over Till’s murder. Young people around the country went on Freedom Marches for Emmett Till as he became an inspiration for civil rights protests and sit-ins, even as his gruesome death terrified blacks that the same fate might await them. Till’s photo made a powerful impression on a new generation of African Americans who would fight for civil rights. Eldridge Cleaver, Stokely Carmichael, and Julian Bond maintained that Till’s brutal murder, captured in the Jet photo, profoundly influenced their lives. Till was enshrined, along with Martin Luther King Jr. and Medgar Evers, as a martyr in the fight for justice and freedom at the civil rights memorial in Montgomery.
Till’s legacy endures in music, literature, film, and memoirs. Songs about him were written by Aaron Kramer and Clyde Appleton, Bob Dylan, and Joan Baez. Till was the subject of poems by Langston Hughes, Audre Lorde, Gwendolyn Brooks, Sam Cornish, and Shirley Nelson. Lewis Nordan’s Wolf Whistle and Bebe Moore Campbell’s Your Blues Ain’t Like Mine fictionalized Till’s story. His life and death have also been dramatized by Toni Morrison, James Baldwin, and his mother, who collaborated with David Barr III on The Face of Emmett Till (1999). In 2005, Keith A. Beauchamp directed the documentary film The Untold Story of Emmett Till. The title of Mamie Bradley Till-Mobley’s memoir—The Hate Crime That Changed America—aptly expresses her son’s crucial role in the civil rights movement.
PHILIP C. KOLIN
University of Southern Mississippi
Adam Green, Selling the Race: Culture, Community, and Black Chicago, 1940–1955 (2007); Philip Kolin, ed., Southern Quarterly (Summer 2008); Christopher Metress, The Lynching of Emmett Till (2002); Harriet Pollock and Christopher Metress, eds., Emmett Till in Literary Memory and Imagination (2007); Mamie Till-Mobley, The Hate Crime That Changed America (2002); Stephen J. Whitfield, A Death in the Delta: The Story of Emmett Till (1986).
The United States Public Health Service Study of Untreated Syphilis (USPHSS) in the Negro male in Macon County, Ala., is the original name of the longest-running nontherapeutic study conducted in the United States and one that showed how race influenced medical science in the 20th century. This study is more popularly known as the Tuskegee Syphilis Study because the U.S. Public Health Service originally conducted it with cooperation from Tuskegee Institute (now Tuskegee University). The study ran from 1932 to 1972. Since its founding in 1881, Tuskegee Institute had worked with members of the black community in Macon County and the surrounding Black Belt counties to improve the well-being and standard of living of black people who resided in those counties. The institute designed several extension programs centered on education, farming, animal husbandry, housing improvements, nutrition, and health. Prior to the opening of the Macon County Public Health Department in 1946, all public health functions of the state of Alabama for black people in Macon County and several surrounding counties were conducted at the John A. Andrew Memorial Hospital, a private hospital operated by Tuskegee Institute. The study of syphilis was originally proposed and sponsored by the Rosenwald Health Fund as a diagnostic and treatment program in six southern states, including Alabama, to address national concerns about the syphilis epidemic then expanding throughout the country.
The stock market crash forced the Rosenwald Fund to reevaluate its program funding, and the original study was dropped. In an effort to salvage the program, the U.S. Public Health Service proposed a more limited study of untreated syphilis, designed as a comparison to the retrospective study conducted in Oslo, Norway, in 1925. The USPHSS began with approximately 600 men, 400 diagnosed with syphilis and 200 without the disease. When the study began, arsenic and mercury were the standard treatments for the disease, yet the men with syphilis in the study were not treated. A concerted effort by the U.S. Public Health Service, the Alabama State Health Department, and local physicians ensured that even those men without the disease who later contracted syphilis were not dropped from the study but were switched into the infected population.
The USPHSS did not stop in 1943, even after the U.S. Public Health Service determined that penicillin was the most effective method for treating venereal diseases, particularly syphilis and gonorrhea. Nor did the Nuremberg trials and the resulting declaration against human experimentation without the consent of the subject end the study. The scientific community, moreover, raised no concerns about unethical behavior by the research team or possible bioethical violations by the U.S. Public Health Service or the Centers for Disease Control (CDC) until the 1950s. Over the 40-year period (1932–72), several articles about findings from the study were published in the scientific literature. Additionally, information gathered from the study was included in the curriculum of many medical schools throughout the country.
By the late 1960s, several employees at the CDC had expressed grave concerns regarding the study. Nonetheless, the study continued until 1972, when an Associated Press article written by Jean Heller was published in the New York Times. The revelation resulted in a class-action suit filed in July 1973 by noted civil rights attorney Fred D. Gray on behalf of study participants, among them Charlie W. Pollard. A settlement of approximately 10 million dollars was reached in 1974.
On 16 May 1997, President Bill Clinton apologized to the surviving men and their families, the Tuskegee community, and the African American population on behalf of the country for the study. At a White House ceremony in Washington, D.C., Clinton said the government did something that was “wrong, gravely and morally wrong.” Five of the remaining seven study survivors, including Charlie Pollard, were present. As a part of the apology, Clinton mandated that the Department of Health and Human Services support the establishment of the Tuskegee University National Center for Bioethics in Research and Health Care on the campus of Tuskegee University and that a legacy museum be launched. In 1999, the Bioethics Center opened on the campus of Tuskegee University to conduct research, education, and community engagement activities related to bioethics, public health ethics, health disparities, and health equity. After more than a decade, the Bioethics Center continues its mission.
RUEBEN WARREN
Tuskegee University
Fred D. Gray, The Tuskegee Syphilis Study (1998); James H. Jones, Bad Blood: The Tuskegee Syphilis Experiment (1993); Ralph V. Katz et al., American Journal of Public Health (June 2008); Susan M. Reverby, Examining Tuskegee: The Infamous Syphilis Study and Its Legacy (2009); Susan M. Reverby, ed., Tuskegee’s Truths: Rethinking the Tuskegee Syphilis Study (2000).
The U.S. Gulf Coast is home to approximately 200,000 Southeast Asian Americans, those who fled their homelands in the aftermath of the Vietnam War and surrounding conflicts in Cambodia and Laos. Vietnamese Americans represent the vast majority of Southeast Asian Americans (and Asian Americans in general) within the Gulf Coast states of Texas, Louisiana, Mississippi, and Alabama. Their arrival and resettlement in the region began shortly after the surrender of South Vietnam to North Vietnamese Communist forces in late April 1975. In the lead-up to the Communist takeover of the South, the United States orchestrated a massive evacuation of South Vietnamese elites, particularly military officials and others who had served the U.S. occupation. These evacuees are known as “first wave” refugees; they were taken to government bases in Guam, Thailand, Wake Island, Hawaii, and the Philippines as part of Operation New Life and then transferred to emergency relocation centers in the United States: Camp Pendleton in California, Fort Chaffee in Arkansas, Eglin Air Force Base in Florida, and Fort Indiantown Gap in Pennsylvania. From there, refugees were dispersed to various U.S. cities for permanent resettlement, including the Gulf Coast cities of New Orleans and Houston.
Not all first-wave refugees were among the elite. In 1975, tens of thousands of devout Vietnamese Catholics—many of them farmers and fishermen—were also evacuated. Vietnamese Catholics considered themselves particularly susceptible to acts of Communist reprisal as a result of the Catholic Church’s support of the U.S. occupation (and for its soft stance on the previous French colonial rule). The Catholic Church coordinated efforts with U.S. forces to ensure that many of its most devout members were evacuated from South Vietnam. Among them was the Catholic community from the northern Vietnamese villages of Bui Chu and Phat Diem. These parishioners had originally fled together from North Vietnam in 1954, following the final defeat of French colonists by the Communist-led national liberation effort. Together, they resettled in the southern Vietnamese villages of Haing Tau and Phuc Thinh, located southeast of Saigon, where for the time being Communist forces held little sway. But with the complete Communist victory of 1975, there was no longer a safe haven within Vietnam. And so they evacuated and resettled together once more—this time to the United States. Many were taken to Eglin Air Force Base and Fort Chaffee. There they were paid a visit by Archbishop Philip Hannan, then head of the New Orleans archdiocese, who was intrigued by the story of the devout Catholic refugees who had traveled together since 1954. The archbishop extended an open invitation to the refugees to resettle in New Orleans. New Orleans was a Catholic city, and it also possessed a tropical climate and seafaring opportunities, for which the Vietnamese were well suited. In the decades following resettlement, the refugees would indeed find steady work as Gulf Coast shrimpers.
The epicenter of the New Orleans Vietnamese American community is in New Orleans East, in a neighborhood known as Versailles, located approximately 20 miles east of downtown. Lacking the density of other parts of the city and surrounded by wetlands, Versailles has allowed the Vietnamese Americans to reestablish cultural and religious institutions, as well as create what some describe as a distinct Vietnamese American landscape. Over the years, the Vietnamese Americans have emerged as an economically diverse community. Many found steady work as shrimpers, but others languished in working poverty in manufacturing firms, and still others subsisted on welfare across multiple generations. The poverty rate among Vietnamese Americans in New Orleans is 31.7 percent, compared to 27.9 percent for the city as a whole. Those who were able to save and pool resources began purchasing small businesses during the 1980s. As a result, Vietnamese American–owned lunch counters, grocery stores, and butcher shops are now common throughout the city.
After Hurricane Katrina flooded New Orleans in 2005, the residents of Versailles were among the first communities to return and rebuild their neighborhood. Some attribute their remarkable rate of return to a history of resilience that dates back to their original displacement from North Vietnam in 1954; others attribute it to the strong networks developed through the Catholic Church; and still others point to the way in which the Versailles community developed strong multiracial alliances, particularly with their African American neighbors, who represented nearly 80 percent of New Orleans East prior to the arrival of the refugees. Indeed, in the post-Katrina moment, the Vietnamese Americans seemed to defy their image as a quiescent and conservative immigrant community. Their leader excoriated the Bush administration for its mishandling of the rescue and recovery efforts, and residents young and old engaged in a bold act of civil disobedience to close a toxic landfill in Versailles that city officials had opened in order to dump one-third of the Katrina debris.
The establishment of a Vietnamese American community in New Orleans provided a gateway to the settlement of other Southeast Asian American communities along the Gulf Coast, most notably Gulfport-Biloxi, Miss., and Bayou La Batre, Ala. Together with New Orleans, these two cities constitute what some scholars refer to as the manufacturing periphery of the “Deep South triad,” once-booming port cities that in recent decades have been left in the shadow of the New South. While cities such as Houston, Atlanta, and Charlotte became destinations for industries abandoning the Rust Belt, the manufacturing periphery did not attract large corporations that could offer new opportunities for the upwardly mobile while also attracting new waves of immigration from throughout the world. In Louisiana, Mississippi, and Alabama, the foreign-born population has remained at approximately 2 percent or less since the 1990s. As such, the arrival of Vietnamese Americans can be considered the lone “new immigration” event in the Deep South triad over the past three decades. The combined Vietnamese American population in Mississippi and Alabama is smaller than that of New Orleans, at approximately 10,000. It is made up of many second-wave refugees: those who arrived in the United States after the passage of the Refugee Act of 1980, through which Congress authorized the resettlement of tens of thousands of Vietnamese, Cambodians, and Laotians who for years had been surviving in United Nations refugee camps. Second-wave refugees tended to be from poorer backgrounds than their first-wave counterparts, and many of them had witnessed atrocities and experienced harrowing journeys of escape before reaching UN camps. There are significant, if subtle, indicators of second-wave refugee presence in Gulfport-Biloxi and Bayou La Batre, including the arrival of many non-Catholic (mostly Buddhist) Vietnamese Americans, as well as Cambodian and Laotian populations of approximately 1,700. Still, these communities have much in common with the Vietnamese Americans of New Orleans. The fishing industry is central to the livelihoods of Southeast Asian Americans in both Mississippi and Alabama. Vietnamese Americans account for one-third of all commercial seafood workers on the Gulf Coast, and at least 80 percent of the Southeast Asian community is tied to the seafood industry. So too, poverty is a reality for many Southeast Asians; the poverty rate among Vietnamese Americans is 25 percent in Mississippi and 19 percent in Alabama. The average poverty rate among Cambodian Americans in both states is approximately 20 percent.
Hurricane Katrina caused significant damage to the Southeast Asian communities of Gulfport-Biloxi and Bayou La Batre. Some have argued that the damage was actually far worse in these cities than in New Orleans. But unlike the case in New Orleans, the Vietnamese Americans in these cities were not backed by a powerful archdiocese that could coordinate relief efforts and hold accountable those in power. These cities also did not draw the post-Katrina national attention that was showered upon New Orleans.
In 2010, on the eve of the fifth anniversary of Katrina, the Deepwater Horizon oil spill dealt another blow to the Southeast Asian Americans of Louisiana, Mississippi, and Alabama who are employed in the fishing industry. These include commercial fishermen, as well as those who harvest and process shrimp, crabs, and oysters. Not only does the fishing industry provide employment, but the daily catch is also a means of sustenance, as many workers eat what they catch. Vietnamese Americans consume more Gulf seafood than the average southerner, making them even more susceptible to high levels of contaminants. Residents have complained of respiratory and dermatological problems. Moreover, studies have shown that the mental health of oil-spill victims has deteriorated. Relief efforts, including promises of monetary compensation, have been difficult or ineffective because of language barriers that have prevented some Southeast Asians from navigating compensation programs. Meanwhile, finding employment in other industries is difficult because of language barriers and a skills mismatch. Fortunately, many local community groups have organized to demand justice for Southeast Asian Americans’ lost income and altered lives. Efforts include translation services and advocacy for governmental recognition of the full impact of the spill on Southeast Asian communities.
The Houston area also has a significant Southeast Asian American population. But unlike the cities of the Deep South triad, Houston is a major center of commercial activity in the New South, and Vietnamese resettlement in the area has followed a different trajectory. Most Vietnamese who live in Houston and its surrounding suburbs are employed in a variety of industries, mainly manufacturing, retail, and other services. Yet those who live in nearby coastal communities such as Rockport and the towns along Galveston Bay have historically made a living in the fishing industry, much like their counterparts in the Deep South triad. Indeed, Vietnamese American fishermen made national headlines in the early 1980s when they came under attack by white shrimpers who saw the newcomers as a threat to their livelihoods. Before long, the Ku Klux Klan was brought into the fray, initiating a campaign of intimidation that included burning Vietnamese American–owned boats. The Klan’s efforts were ultimately undone by a federal lawsuit filed by the Southern Poverty Law Center, which argued that the Klan’s actions violated not only the civil rights of Vietnamese Americans but also the state’s arcane antitrust laws.
ERIC TANG
University of Texas at Austin
Harvey Arden, National Geographic, September 1981; Jean Shiraki, Asian American Policy Review 21 (2011); U.S. Census Bureau, Census 2000.
Two things above all others have changed the modern South: air-conditioning and the Voting Rights Act. Unfortunately, Americans have a better understanding of how air-conditioning functions than they do of the Voting Rights Act.
Because discriminatory administration of state laws and constitutional amendments undermined federal protection of the rights of minority voters, Congress passed the Voting Rights Act in 1965, changing the landscape of electoral politics in America and overthrowing three generations of disfranchisement. After the Civil War and emancipation, Reconstruction brought to formerly enslaved African Americans freedom, citizenship, and the right to vote under the Thirteenth, Fourteenth, and Fifteenth Amendments. Yet, when Reconstruction ended, these constitutional amendments did not assure a fair and equal vote. Recalcitrant whites, including members of organizations such as the Ku Klux Klan, used terrorist and fraudulent antisuffrage activities to deny African Americans the right to vote. A series of court cases systematically dismantled the civil and voting rights legislation of the first Reconstruction. Legal methods of disfranchising African Americans included gerrymandering, at-large elections, registration and secret ballot laws, the poll tax, literacy tests, and the white primary. By the early 20th century, these methods had effectively disfranchised millions of African Americans. In 1958, the Civil Rights Commission reported that there were 44 counties in the Deep South where there was not a single black voter registered. Many of these counties had large African American populations; some had black American majorities.
The 1965 Voting Rights Act banned literacy tests, facilitated lawsuits to prohibit discriminatory laws or practices, and sent federal voting registrars into intractable areas. In addition, section 5 of the Voting Rights Act required “covered jurisdictions,” all initially in the South, to obtain “preclearance” from the Department of Justice for any change in their electoral procedures. An immediate effect of more minority voters was the replacement of blatant bigotry in electioneering with more subtle racial appeals. A longer-term effect has been the election of minority citizens to almost every level of government.
South Carolina, joined by other southern states, challenged the Voting Rights Act in 1966 in South Carolina v. Katzenbach, claiming that the act violated its right to control and implement elections. After the Supreme Court rejected this challenge, Mississippi and Virginia filed Allen v. Board of Elections (1969), contending, again unsuccessfully, that the act protected only the right to cast a ballot, not the right to have nondiscriminatory election structures, such as district elections. Congress renewed all the provisions of the Voting Rights Act in 1970 and 1975, amending it in 1975 to include, in section 203, provisions to protect language minorities, such as Asian, Hispanic, and Native American voters.
After its initial victories in court, the Voting Rights Act began to suffer defeats. In Beer v. U.S. (1976), the Supreme Court ruled that section 5 of the act did not prevent discriminatory election laws generally but only those that resulted in a “retrogression” of minority influence. For instance, after African Americans were enfranchised by the act, a local jurisdiction could shift district lines in order to ensure a continuation of all-white government, and the Department of Justice had to allow the change to go into effect. Even more significant, the U.S. Supreme Court ruled in Mobile v. Bolden (1980) that no election law violated section 2 of the act or the Fifteenth Amendment to the U.S. Constitution unless it could be shown that the law had been adopted with a racially discriminatory intent. During the First Reconstruction, in 1874, Mobile, Ala., had instituted at-large elections; after the passage of the act in 1965, many other southern localities switched from district to at-large elections. In such elections, because whites, who outnumbered minorities, generally voted for white candidates (that is, racial bloc voting), minorities had a much more difficult time getting elected, and under Bolden, minority plaintiffs had a much more difficult time winning lawsuits.
In 1982, Congress not only renewed the preclearance provision of section 5 for 25 years but also effectively overturned Bolden by making clear that proof of intent was unnecessary to win a section 2 case. Moreover, it weakened Beer by instructing the Justice Department not to preclear state or local laws that were discriminatory in either intent or effect. Ironically, in view of the heated two-year struggle in Congress, this strongest version of the act passed by much more overwhelming congressional majorities than ever before. Even more surprising, within two days of the signing of the renewed act, the Supreme Court in Rogers v. Lodge announced an effect standard for the act that was nearly identical to the one just passed by Congress and that implicitly repudiated the Bolden decision of 1980.
Along with the one-person, one-vote ruling of the Supreme Court in Reynolds v. Sims (1964), the Voting Rights Act has added another dimension to the politics of redistricting following each decadal census. Once a secretive, unchallengeable practice, redistricting is now played out in courtrooms as well as in backrooms, often ending up before the U.S. Supreme Court. The most startling Supreme Court decision was Shaw v. Reno (1993). Disfranchisement had prevented African Americans from electing a single member of Congress from North Carolina from 1898 to 1965; after 1965, the state’s leaders had repeatedly redrawn district boundaries to keep the 11-member delegation all white in a 23 percent black state. But after the 1982 amendments strengthened the Voting Rights Act, a newer generation of North Carolina leaders, under pressure from the Department of Justice, drew two districts in which 54 percent of the voters were African American. In order to preserve the seats of white Democratic incumbents, North Carolina legislators drew new black-majority districts in even stranger shapes than the districts they replaced. Ignoring previous prowhite racial gerrymandering in the state, five members of the Supreme Court denounced the most integrated congressional districts in North Carolina’s history as “segregated” and declared them unconstitutional. White-majority districts could take any shape, the same five justices wrote in a later case from Texas (Bush v. Vera, 1996), but black-majority districts could not look “bizarre” to judges. And in a Georgia case, Miller v. Johnson (1995), the Supreme Court by the same 5–4 vote announced that black-majority districts could not be drawn with a predominantly racial intent and that white-majority districts could not be challenged under this standard. Finally, in two cases from Bossier Parish, La., the five-person Supreme Court majority ruled that the Justice Department under section 5 of the act had to preclear any election law change, unless it made minorities worse off than before the change. Bossier’s school board could thus remain all white.
The Voting Rights Act rid the country of the most outrageous forms of voter disfranchisement. Equal voting rights have meant representation for a large minority of citizens and have brought a tremendous increase in minority elected officials, particularly Native Americans in the West and Hispanics in California and Texas, as well as the election of literally thousands of African Americans to offices across the old Confederacy. The Voting Rights Act is a success story. Designed to increase minority voter registration, it has done so. It has also reduced election-related violence, increased responsiveness and the provision of services to minorities, made the political talents of the minority community, especially African Americans in the South, more available to society as a whole, made it possible for southern solons to support civil rights, made racial politics unfashionable, and opened opportunities for minorities to pursue careers in politics. Despite its significant weakening by a 5–4 majority of the U.S. Supreme Court in the 1990s, the Voting Rights Act continues to have a tremendous influence on American, and especially southern, political life.
ORVILLE VERNON BURTON
University of Illinois at Urbana-Champaign
Chandler Davidson and Bernard Grofman, eds., Quiet Revolution in the South: The Impact of the Voting Rights Act, 1965–1990 (1994); David Garrow, Protest at Selma: Martin Luther King Jr. and the Voting Rights Act of 1965 (1978); Nick Kotz, Judgment Days: Lyndon Baines Johnson, Martin Luther King Jr., and the Laws That Changed America (2005); J. Morgan Kousser, Colorblind Injustice: Minority Voting Rights and the Undoing of the Second Reconstruction (1999); Steven F. Lawson, Black Ballots: Voting Rights in the South, 1944–1969 (1976).
(b. 1944) WRITER.
Alice Walker’s The Color Purple is saturated with the atmosphere of the South, the rural Georgia farmland of her childhood. Walker, who has written more than 29 books of poetry, fiction, biography, and essays, finds strength and inspiration in the land and the people: “You look at old photographs of Southern blacks and you see it—a fearlessness, a real determination and proof of a moral center that is absolutely bedrock to the land. I think there’s hope in the South, not in the North,” she says. In her work, the human spirit conquers racism.
Alice Walker was born in 1944 in Eatonton, Ga., the youngest of eight children. Her parents were poor sharecroppers. As a child, she read what books she could get, kept notebooks, and listened to the stories her relatives told. She attended Spelman College in Atlanta and graduated from Sarah Lawrence College in Bronxville, N.Y., where her writing was discovered by her teacher Muriel Rukeyser, who admired the manuscript that Walker had slipped under her door. Rukeyser sent the poems to her own editor at Harcourt Brace, and this first collection of Walker’s poetry, Once, was published in 1968. From 1966 through 1974, Walker lived in Georgia and Mississippi and devoted herself to voter registration, Project Head Start, and writing. She married Mel Leventhal, a Brooklyn attorney who shared her dedication to civil rights in his work on school desegregation cases. Their daughter, Rebecca, was born in 1969. After they left the South, Walker and Leventhal lived for a while in a Brooklyn brownstone, and then they separated. Alice Walker now lives in rural northern California, which she chose primarily for the silence that would allow her to “hear” her fictional characters.
Alice Walker is the literary heir of Zora Neale Hurston and Flannery O’Connor. Walker has visited O’Connor’s home in Milledgeville, Ga., and Hurston’s grave in Fort Pierce, Fla., to pay homage. Walker’s novels The Third Life of Grange Copeland (1970), Meridian (1976), and The Color Purple (1982) and her short-story collections In Love and Trouble (1973) and You Can’t Keep a Good Woman Down (1980) capture and explore her experiences of the South. She draws on her memories and her family’s tales of Georgia ancestors in creating the portraits of rural black women in The Color Purple. Their speech is pure dialect—colloquial, poetic, and moving. Walker’s poems too are filled with the rich landscape and atmosphere of the South.
Consciousness of the South has always been central to Alice Walker. The flowers and fruits in her California garden recall her mother’s garden back in Georgia, a place so important to Walker that it became the inspiration for her collection of essays entitled In Search of Our Mothers’ Gardens: Womanist Prose (1983). Her mother’s creativity was a compelling example to Alice Walker as well as a constant source of beauty amid the poverty of rural Georgia. Her mother died in 1993 at the age of 80. Her headstone reads “Loving Soul, Great Spirit.”
Among her many accomplishments and honors, Alice Walker has been Fannie Hurst Professor of Literature at Brandeis University and a contributing editor to Ms. magazine. In her writing and teaching she continually stresses the importance of black women writers. She edited a Zora Neale Hurston reader and wrote a biography of Langston Hughes for children. In 1984, Walker launched Wild Trees Press in Navarro, Calif., publishing the work of unknown writers until 1988. The film version of The Color Purple was released in 1985 to much acclaim. In 2004, the musical version of The Color Purple premiered in Atlanta, and it opened on Broadway in 2005. Alice Walker continues to take on vital issues such as female genital mutilation, which is central to her 1992 novel Possessing the Secret of Joy. Alice Walker’s literary awards include the Rosenthal Award of the National Institute of Arts and Letters, the Lillian Smith Award for her second book of poems, Revolutionary Petunias (1972), and the American Book Award and the Pulitzer Prize for fiction for The Color Purple (1983).
ELIZABETH GAFFNEY
Westchester Community College, SUNY
David Bradley, New York Times Magazine, January 1984; Robert Towers, New York Review of Books, 12 August 1982; Alice Walker, Atlanta Constitution, 19 April 1983; Evelyn C. White, Alice Walker: A Life (2004).
(1913–1998) AUTHOR.
Margaret Walker, who was born in Birmingham, Ala., played an active role in American arts and letters for at least seven decades. She was a distinguished poet, respected essayist, groundbreaking novelist, and award-winning educator. Her final collection of poetry, This Is My Century, accurately describes the wide range of themes and issues encompassed in her work, with racial concerns inevitably central. The 20th century became Margaret Walker’s century, as she “saw it grow from darkness into dawn” (“This Is My Century”). Her writings demonstrate vestiges of the Harlem Renaissance of the 1920s and 1930s, traces of the Black Arts Movement of the 1960s and 1970s, and markings of what some might call the Womanist Renaissance of the 1980s.
Walker wrote across literary genres, but she is most accomplished as a poet. She began publishing poetry in local vehicles at the age of 12 and made her first appearance in a national publication at 19, when “I Want to Write” appeared in Crisis, under the editorship of W. E. B. Du Bois. Just a few years later, when she was 22, “For My People” was printed in the November 1937 issue of Poetry: A Magazine of Verse, launching her career as a poet. In 1940, Walker collected 26 poems under the title For My People as her master’s thesis, and the collection was published in 1942. She produced four other significant collections of poetry: The Ballad of the Free (1966), Prophets for a New Day (1970), October Journey (1973), and This Is My Century: New and Collected Poems (1989). Prophets for a New Day celebrates the civil rights movement; October Journey takes its name from a piece written in honor of her husband, whom she met in the month of October and who died in the month of October after 37 years of marriage. This Is My Century, her last collection, presents 100 poems—37 of which had never before appeared in print.
Walker chose to write in only three forms: (1) narratives as stories or ballads, (2) lyrical songs and sonnets, and (3) the long line of free verse punctuated with a short line. Within these three forms, she pays attention to an assortment of issues and themes. At times, she elegizes the South, as in “Southern Songs,” in which she writes that she longs to have her “body bathed again by southern souls” and to “rest unbroken in the fields of southern earth.” In other pieces, she memorializes the acts of cultural heroes, such as Paul Laurence Dunbar, Harriet Tubman, Mary McLeod Bethune, and Owen Dodson, and she struggles to place her own life and work within a collective black American experience in pieces such as the legendary “For My People,” “A Litany of Black History for Black People,” “A Litany from the Dark People,” and “They Have Put Us on Hold.” What remains constant throughout is Walker’s ability to capture everyday experiences of the common and the legendary with effective cadences and striking imagery.
Many of these same characteristics are visible in her one novel, Jubilee (1966). Walker labored over Jubilee from 1934 to 1966, constructing it as a fictional tribute to the life of her maternal grandmother, Margaret Duggans Ware Brown, who was born into enslavement. Readers follow the biracial protagonist, Vyry, through enslavement and Reconstruction and witness her ascent out of the pit of slavery. Jubilee won the 1966 Houghton Mifflin Literary Fellowship Award, and Walker saw the novel go through 40 printings, sell over 2 million copies, be published in 7 foreign countries, and be adapted as an opera. Walker unsuccessfully sued Alex Haley for copyright infringement of Jubilee following his publication of Roots.
All of Walker’s writing is geographically and ideologically grounded in the South. She taught at Jackson State University for 30 years (1949–79), where she established a Black Studies program and retired as professor emerita. She acknowledged the South as a critical part of her artistic aesthetic, claiming that “my adjustment or accommodation to this South—whether real or imagined (mythic and legendary), violent or nonviolent—is the subject and source of all my poetry. It is also my life.” Even though Walker maintained a love for the South throughout her life, she credited Langston Hughes with telling her parents to get her out of the South so that she could develop as a writer. Partially because of Hughes’s urging, Walker’s parents sent her to Northwestern University, where she earned a B.A. in English in 1935.
Margaret Walker was successful in gaining publishing opportunities for diverse forms of writing, and in 1988 she published a psychobiography of Richard Wright, Daemonic Genius, which grew out of their long and tumultuous friendship. In 1990, she published How I Wrote “Jubilee” and Other Essays on Life and Literature. And, in 1997, with the help of Maryemma Graham, she published a final collection of speeches and essays, On Being Female, Black, and Free. In the last decade of her life, she won countless Mississippi and national awards honoring her work, including the National Book Award for Lifetime Achievement (1993).
ETHEL YOUNG-MINOR
University of Mississippi
Amiri Baraka, Nation (4 January 1999); Maryemma Graham, ed., Conversations with Margaret Walker (2002); Margaret Walker, This Is My Century (1989).
(1856–1915) EDUCATOR.
Booker Taliaferro Washington was the foremost black educator of the late 19th and early 20th centuries. He also had a major influence on southern race relations and was the dominant figure in black public affairs from 1895 until his death in 1915. Born a slave on a small farm in the Virginia backcountry, he moved with his family after emancipation to work in the salt furnaces and coal mines of West Virginia. After a secondary education at Hampton Institute, he taught in an ungraded school and experimented briefly with the study of law and the ministry, but a teaching position at Hampton decided his future career. In 1881, he founded Tuskegee Normal and Industrial Institute on the Hampton model in the Black Belt of Alabama.
Though Washington offered little that was innovative in industrial education, which both northern philanthropic foundations and southern leaders were already promoting, he became its chief black exemplar and spokesman. In his advocacy of Tuskegee Institute and its educational method, Washington revealed the political adroitness and accommodationist philosophy that were to characterize his career in the wider arena of race leadership. He convinced southern white employers and governors that Tuskegee offered an education that would keep blacks “down on the farm” and in the trades. To prospective northern donors and particularly the new self-made millionaires such as Rockefeller and Carnegie, he promised the inculcation of the Protestant work ethic. To blacks living within the limited horizons of the post-Reconstruction South, Washington held out industrial education as the means of escape from the web of sharecropping and debt and the achievement of attainable, petit bourgeois goals of self-employment, landownership, and small business. Washington cultivated local white approval and secured a small state appropriation, but it was northern donations that made Tuskegee Institute by 1900 the best-supported black educational institution in the country.
The Atlanta Compromise Address, delivered before the Cotton States Exposition in 1895, widened Washington’s influence into the arena of race relations and black leadership. Washington offered black acquiescence in disfranchisement and social segregation if whites would encourage black progress in economic and educational opportunity. Hailed as a sage by whites of both sections, Washington further consolidated his influence with his widely read autobiography, Up from Slavery (1901), the founding of the National Negro Business League in 1900, his celebrated dinner at the White House in 1901, and control of patronage politics as chief black adviser to presidents Theodore Roosevelt and William Howard Taft.
Washington kept his white following through conservative policies and moderate utterances, but he faced growing black and white liberal opposition in the Niagara Movement (1905–9) and the NAACP (1909–), groups demanding civil rights and encouraging protest in response to white aggressions such as lynchings, disfranchisement, and segregation laws. Washington successfully fended off these critics, often by underhanded means. At the same time, however, he tried to translate his own personal success into black advancement through secret sponsorship of civil rights suits, service on the boards of Fisk and Howard universities, and the direction of philanthropic aid to these and other black colleges. Through his speaking tours and private persuasion he tried to equalize public educational opportunities and to reduce racial violence. These efforts were generally unsuccessful, and the year of Washington’s death marked the beginning of the Great Migration from the rural South to the urban North. Washington’s racial philosophy, pragmatically adjusted to the limiting conditions of his own era, did not survive the change.
LOUIS R. HARLAN
University of Maryland
W. Fitzhugh Brundage, ed., Booker T. Washington and Black Progress: “Up from Slavery” 100 Years Later (2003); Louis R. Harlan, Booker T. Washington, 2 vols. (1972, 1983); David H. Jackson Jr., Booker T. Washington and the Struggle against White Supremacy: The Southern Educational Tours, 1908–1912 (2008); August Meier, Negro Thought in America, 1880–1915 (1963); Robert J. Norrell, Up from History: The Life of Booker T. Washington (2009); Raymond W. Smock, ed., Booker T. Washington in Perspective: Essays of Louis R. Harlan (2006).
In fall 1948, WDIA in Memphis, Tenn., became the first radio station in the South to adopt an all-black programming format. The station was owned by two white businessmen, but the man most responsible for the format change at WDIA was Nat D. Williams, a local black high school history teacher. Williams was brought into the station to do his own show on an experimental basis; it proved to be an overnight sensation. He was the first black radio announcer in the South to play the popular rhythm-and-blues records of the day over the airwaves. His show was so successful that within six months of its debut WDIA had changed from a classical music station to one appealing solely to black listeners and advertisers.
In addition to initiating an entirely new music format, Williams launched a wide variety of programming innovations at WDIA and recruited other talented blacks onto the airwaves. His first recruits were fellow high school teachers A. C. Williams and Maurice Hulbert. Both men went on to have long and distinguished careers in black radio. Nat Williams’s most famous recruit was a youthful B. B. King, who used the exposure on WDIA to launch his career as the country’s premier urban blues artist. Rufus Thomas became one of the station’s most popular on-air disc jockeys. Williams also recruited the South’s first black female announcers to WDIA’s airwaves; two of the best known were Willa Monroe and Starr McKinney, both of whom did programs oriented toward black women.
Gospel music, religious programs, and black news and public affairs shows were also prominent on WDIA. The most acclaimed public affairs program was Brown America Speaks, created and hosted by Nat D. Williams. The program addressed race issues from a black perspective and won an award for excellence from the prestigious Ohio State Institute for Education by Radio in 1949. With the success of WDIA, other radio stations around the country also began to adopt black-oriented formats, and black radio became a fixture in commercial broadcasting nationwide. WDIA still programs for a black audience in Memphis, making it the oldest black-oriented radio station in the country.
BILL BARLOW
Howard University
Bill Barlow, Voice Over: The Making of Black Radio (1999); Louis Cantor, Wheelin’ on Beale: How WDIA-Memphis Became the Nation’s First All-Black Radio Station and Created the Sound That Changed America (1992); Robert Gordon, It Came from Memphis (1995); Margaret McKee and Fred Chisenhall, Beale Black and Blue: Life and Music on Black America’s Main Street (1981); Charles Sawyer, The Arrival of B. B. King: The Authorized Biography (1980).
(1862–1931) JOURNALIST AND SOCIAL ACTIVIST.
For Ida B. Wells-Barnett, “southern culture” was an embattled site of identification. She was a native of Holly Springs, Miss., born a slave in 1862. There she attended Rust College, run by the American Missionary Association, and was strongly influenced by its “Yankee” teachers. Wells-Barnett was baptized in the Methodist Episcopal Church. After her parents’ death in the yellow fever epidemic of 1878, she moved to Memphis, Tenn., around 1880 and lived there until 1892. That year, she published her most important writing, a pamphlet entitled “Southern Horrors: Lynch Law in All Its Phases.” This essay placed southern codes of honor in the context of the horror of the lynching-for-rape scenario, part of a violent, morally hypocritical, crassly economic system of white supremacy. White men justified the murder of “bestial” black men by claiming the role of protectors of “weak” white women; Wells-Barnett proved that, statistically, the rape charge was rarely in play during actual, documented lynchings. Instead, the cry of rape was often a cover to punish black men who in any way challenged the social, political, or economic status quo of the South. She also pointed out that white women sometimes participated in both mob activity and consensual sex with black men. When a death threat appeared in print in 1893 because of Wells-Barnett’s newspaper criticism of lynching and southern honor, the region became off-limits for her, and she left for the North. She returned only once, in 1917, to investigate the plight of 16 Arkansas farmers imprisoned for labor-organizing activity and sentenced to die in Helena; she made the trip in disguise.
Ida B. Wells-Barnett became famous—to opponents, infamous—for her critique of the South, but she accomplished the work largely outside of it. In 1895, she settled in Chicago, married lawyer Ferdinand L. Barnett, and raised four children. She died there in 1931. She arguably achieved greatest prominence outside the United States during the years 1893 and 1894, when she traveled to England and Scotland to mobilize opposition to lynching in the United States. At strategic points, however, she referred to herself as a “southern girl, born and bred,” or by the pen name “Exiled.” Such identifications established her credibility as a native witness to history, especially since a black woman’s moral authority was by definition suspect in U.S. society. After a difficult period of political retrenchment in Chicago and the brutal race riot of July 1919, Wells-Barnett again accented her southern roots and reached out to the progressive elements of the white South in renewed efforts toward interracial understanding in the region, but this offer likely did not even reach the ears that had long since tuned her out.
Ironically, some of the best evidence of Ida B. Wells-Barnett’s sparsely documented personal life dates from the 1880s, when she lived in Memphis and participated in a wide array of activities that mark her as a product of the post-Reconstruction New South. She left a diary dating from December 1885 to September 1887, which provides vivid details of her life during this dynamic period. Entries describe a context not, perhaps, stereotypically “southern” or dominated by folkways. She studied Shakespeare and elocution, attended lectures by national figures like Dwight Moody, and was present at gender- and racially inclusive meetings of the Knights of Labor. The diary further documents her anger at injustice and violence directed at African Americans, some of which touched Wells-Barnett directly, as in her forced removal from a railroad “ladies’” car. She was also the godmother of a child whose father was murdered, along with two business associates, during a conflict in spring 1892. This triple lynching in Memphis was a life-changing event that directed her attention to full-time anti–mob violence protest.
Ida B. Wells-Barnett organized against southern violence outside of the region, resulting in scores of local antilynching committees and the founding of the National Association of Colored Women (1896) and the NAACP (1909). Her efforts successfully positioned antilynching as a legitimate focus of national reform, but based in the urban North. In that context, individuals and groups more securely positioned than she by academic credentials, social status, or political connections in publishing, philanthropy, and government assumed leadership of the issue in the World War I era. Although Ida B. Wells-Barnett’s southernness enabled her powerful voice to emerge in the 1890s, she was eclipsed by the competitive, money-driven, and consolidating trends that came to characterize social reform in the United States over her lifetime.
PATRICIA A. SCHECHTER
Portland State University
Miriam DeCosta-Willis, ed., The Memphis Diary of Ida B. Wells: An Intimate Portrait of the Activist as a Young Woman (1995); Trudier Harris, ed., Selected Works of Ida B. Wells-Barnett (1991); Patricia A. Schechter, Ida B. Wells-Barnett and American Reform, 1880–1930 (2001); Ida B. Wells-Barnett, Crusade for Justice: The Autobiography of Ida B. Wells (1970).
Williams, Hank
(1923–1953) COUNTRY MUSIC SINGER.
Widely acclaimed as country music’s greatest singer and composer, Hiram Hank Williams was born on 17 September 1923 at Mount Olive, near Georgiana, Ala., the son of a sawmill and railroad worker. He was introduced to music in the Baptist Church, to which his mother faithfully took him. According to popular legend, he learned both songs and guitar chords from a black street singer in Georgiana, Rufus Payne (“Teetot”). Williams often recorded country blues and is a prime example of the influence of African American music on country music.
Williams’s evolution as a professional performer and composer began at the age of 14 when he won a talent show in a Montgomery theater singing his own composition, “WPA Blues.” He obtained his first radio job in the same year, 1937, at WSFA in Montgomery. When World War II—that crucible that integrated country music’s disparate regional styles and ultimately nationalized it—came, Williams worked in the Mobile shipyards and sang regularly in the honky-tonks of south Alabama. By the time the war ended, Williams had experienced eight hard years of performing and had built a style that reflected the composite musical influences of his youth: gospel, blues, and old-time country. Professionally, he acknowledged a debt to the Texas honky-tonk singer Ernest Tubb and to the Tennessee mountain singer Roy Acuff, whose styles Williams fused in a way that reflected a similar synthesis in the larger country field during the war and immediate postwar years.
Williams’s ascendance to fame began shortly after the war when he became associated with Fred Rose, the famous Nashville songwriter and publisher. Rose encouraged Williams’s natural songwriting abilities and published his songs; helped him obtain recording contracts with Sterling and MGM Records; persuaded Molly O’Day, one of the greatest singers of the time, to record some of Williams’s compositions; and helped him get a position on KWKH’s Louisiana Hayride in Shreveport. The Hayride, which was then second only to the Grand Ole Opry as a successful country radio show, was the vehicle that launched Williams on the road to performing fame.
Hank Williams’s national ascendancy came in 1949 when he recorded an old pop tune, “Lovesick Blues,” which featured the yodeling he had learned from another Alabama singer, Rex Griffin. Williams soon moved to the Grand Ole Opry, where he became the most popular country singer since Jimmie Rodgers. In the brief span from 1949 to 1953, Williams dominated the country charts with songs that are still considered classics of country music: “I’m So Lonesome I Could Cry,” “Cold, Cold Heart,” “Your Cheatin’ Heart,” “Honky Tonk Blues,” “Jambalaya,” and many others. With his band, the Drifting Cowboys, Williams played a major role in making country music a national phenomenon. With a remarkably expressive voice that moved with equal facility from the strident yodeling of “Long Gone Lonesome Blues” to the gentle lyricism of “I Just Told Mama Goodbye,” Williams communicated with his listeners in a fashion that has only rarely been equaled by other country singers. The word “sincerity” has no doubt been overused in describing the styles of country musicians, but in the case of Williams it means simply that he as a singer convincingly articulated in song a feeling that he and his listeners shared.
As a songwriter—not as a singer—Williams played a most important role in breaking down the fragile barriers between country and pop music. Williams’s singing was quintessentially rural, and his own records never “crossed over” into the lucrative pop market. His songs, though, moved into the larger sphere of American popular music and from there, perhaps, into the permanent consciousness of the American people. Like no earlier country writer’s works, Williams’s songs appeared with great frequency in the repertoires of such pop musicians as Tony Bennett, Frankie Laine, and Mitch Miller. For good or ill, that popularization continues.
Commercial and professional success did not bring peace of mind to the Alabama country boy. A chronic back ailment, a troubled marriage, and a subsequent divorce and remarriage accentuated a penchant for alcohol, which he had acquired when only a small boy. After being fired by the Grand Ole Opry for drunkenness and erratic behavior, he returned to the scene of his first triumphs—the Louisiana Hayride. He died of a heart attack on 1 January 1953, but his legacy lives on in his songs and in the scores of singers, including his immensely talented son, Hank Jr., who still bear his influence.
BILL C. MALONE
Madison, Wisconsin
Colin Escott, Hank Williams: The Biography (1995); Chet Flippo, Your Cheatin’ Heart: A Biography of Hank Williams (1981); Paul Hemphill, Lovesick Blues: The Life of Hank Williams (2005); George William Koon, Hank Williams, So Lonesome (2002); Bill C. Malone, Country Music, U.S.A.: A Fifty-Year History (1968; rev. eds., 1985, 2002); Roger M. Williams, Sing a Sad Song: The Life of Hank Williams (1981).
Wright, Richard
(1908–1960) WRITER.
Born near Natchez, Miss., on 4 September 1908, Richard Wright, like the famous protagonist of his first novel, was a native son in a region obsessed with race. The child of a sharecropper who deserted the family in 1914, young Richard moved with his mother during his early years from one to another of the extended family’s homes in Arkansas and Jackson, Miss., living in Memphis after he completed the ninth grade. Poverty and the fear and hate typifying post-Reconstruction racial relations in the Lower South, more than the sustaining power of black culture or education in segregated schools, prepared him to be an author. If he omitted from his autobiographical record his experience with middle-class values in his mother’s family, or the effect of the motions and rituals of the black world, there was psychological truth in his record of nativity as written in Black Boy (1945). He was surely a product of the older South and of the great black migration to the cities; his distinction lay in his refusal to be simply a product.
In “The Ethics of Living Jim Crow,” first published in the WPA writers’ anthology American Stuff (1937) in the year he moved from Chicago to New York City, Wright revealed the dynamics of his life’s work as an author. Caste, he wrote, prescribed his public behavior; but though he knew its requirements, he would not accede. Terror could not induce him to adopt the pretense that he knew his place. Conflict was unavoidable, and its only resolution was violence.
Uncle Tom’s Children (1938; expanded 1940), the collection of novellas with which Wright won his first literary success, indicates by an irony of its title the goal southern whites had for southern blacks. The stories are united by the theme of collective response to racist terror, as the children of Uncle Tom refuse to accept the popular stereotype.
Lawd Today was the first example of Wright’s extension of southern learning to life in the migrant black communities of the North, but this apprentice novel was not published until 1963. Native Son (1940) first carried his insights to a large and appreciative audience. A Guggenheim Fellowship to complete the novel, its selection by the Book-of-the-Month Club, and its arrival within weeks at the top of the best-seller list attested to the appearance of a major American author. In the compelling character of Bigger Thomas, Wright created a complex symbol of a rising awareness that no risk is too great in order to become master of one’s own life. Through creating sympathy for Bigger’s violent actions, Wright carried the tradition of protest to new lengths.
His insider’s view of Jim Crow earned Wright acclaim for his use of literary naturalism. His projection of violence and rebellion against social conditions led to his emergence as a major literary voice of black America.
Wright’s next book, 12 Million Black Voices (1941), presented a folk history extending from slavery’s middle passage through the development of an Afro-American culture in the South and the hope of a black nation as a result of migration north. On the other hand, Black Boy, an ostensible autobiography representing the birth of the artist, necessarily suppressed the importance of group experience in order to focus on the power of the individual sensibility. Wright forged his identity among his people on southern ground but sought room to write by passage into modern life, symbolized by northern cities. This strategy becomes even clearer in the second part of the autobiography, published as American Hunger in 1977.
In time, Wright found that Jim Crow knew no regional boundaries. Chicago and then New York constrained him as much as had Mississippi. So in 1946 he moved with his wife, Ellen Poplar, whom he had married in 1941, and their daughter to Paris. Suggestions have been made that the self-imposed exile, which was to last until Wright’s death in 1960, sapped his creativity. To be sure, distance prevented intimate knowledge of contemporary changes in his native region and the black migrant communities, yet even in exile he created two more novels concerning American racial relations and politics. The Outsider, presenting an existentialist antihero living in Chicago and New York, appeared in 1953, and The Long Dream, a comprehensive reimagining of coming-of-age in Mississippi, appeared in 1958. Other fiction from the exile years includes Savage Holiday (1954), an experiment in raceless fiction, and the collection of stories, old and new, posthumously published as Eight Men (1961). This record of production hardly suggests flagging creativity.
Even more important to Wright’s career, however, was the energy he found in exile to undertake four studies on a global scale. Black Power (1954) relates observations on his travels in the Gold Coast shortly before it became the nation of Ghana; The Color Curtain (1956) reports on the anticolonial positions developed at the conference in Bandung; Pagan Spain (1957) records a trip into a culture Wright viewed as a survival of premodern Europe; and White Man, Listen! (1957) collects essays on race in America and the European colonies.
Despite the apparent departure from the experience of the American South in these later works, continuity exists between the original treatments of Jim Crow and the commentary on historical change in Africa and Asia. The prevailing subject remains race relations between whites and blacks, but beyond that is the more profound connection Wright saw in the special history of “colored” peoples. To be black in America, he believed, was to be marched forcibly into the pain of the modern world. As a representative black American, Wright already had lived the historical experience that awaited the Third World. By the power of literary imagination, Wright, with unmatched skill, drew forth the significance of his southern education for world citizenship.
JOHN M. REILLY
State University of New York at Albany
Charles T. Davis and Michel Fabre, Richard Wright: A Primary Bibliography (1982); Michel Fabre, The Unfinished Quest of Richard Wright (1973); Eugene E. Miller, Voice of a Native Son: The Poetics of Richard Wright (1990); John M. Reilly, in Black American Writers: Bibliographical Essays, ed. M. Thomas Inge and Maurice Duke (1978); Hazel Rowley, Richard Wright: The Life and Times (2002).