Chapter 5

Modernist Revolts: 1890–1920

John Brown’s raid on Harpers Ferry on October 16, 1859, and the publication of Darwin’s On the Origin of Species a little over a month later, set off a chain of events that would forever transform American intellectual life. That same year, John Dewey, the pragmatist philosopher and progressive reformer, was born in Burlington, Vermont. It was a perfectly unremarkable event at the time—the birth of a baby boy—but this one would grow up to have a rather remarkable career as a philosopher, educational reformer, and social advocate. Dewey’s labors, like Brown’s and Darwin’s before him, would greatly influence many Americans’ notions of free will and determinism, truth and falsehood, and the possibility for a shared morality in a pluralistic universe. Having come of age in the intellectual world that the Civil War and Darwinian ideas forged, Dewey constructed a philosophy that would similarly challenge old ethical certainties, while emphasizing the individual’s ability to effect positive change in an inscrutable cosmos.

Dewey and his contemporaries worked within and against the dramatic fin-de-siècle realignments of American intellectual life. It was then that the possibilities, as well as terrors, of the “blooming, buzzing confusion” of modernity came into fuller view, and with them, what Dewey would call a radically “new intellectual temper,” which he himself exemplified.1

Not only were intellectual tempers and the ideas they wrestled with changing dramatically during this period, but so too were thinkers’ conceptions of themselves and their social functions. Up until this time, “intellectual” had been part of the stock vocabulary of American English, but only as an adjective describing a type of intelligence, mental style, or erudition. But during the heated controversy of the Dreyfus Affair in France, Émile Zola and his fellow “Dreyfusards” produced their Manifeste des intellectuels in 1898, and with it, a new sociological term for considering the thinker’s relationship to a broader public. No matter that Charles Maurras threw back “intellectuels” in his Action française as a term of snickering derision. For a new generation of emerging writers and thinkers in America, witnessing the whole messy affair had one benefit: it gave them a crucial term of self-definition for their roles in modern society.

For many younger thinkers and writers, the academy and the established presses were too implicated in a business culture driven by profit, and so they adopted the term intellectual as an oppositional, anti-institutional badge of honor. They looked to Europe to reveal clues about this new social type, and her or his relationship to the broader culture. There was no shortage of examples: Karl Marx, Friedrich Nietzsche, Oscar Wilde, and even Zola himself. To be sure, Marx was forced to flee Germany and later France, Nietzsche ended up clinically insane, Wilde was imprisoned, and Zola had to flee to England to avoid a prison sentence and heavy fines. But this only heightened the allure of the oppositional intellectual and provided these young Americans with a romantic image of themselves as the conscience of American culture.

Both affiliated and freelance intellectuals shared a number of concerns, however, which they recognized were particular to the role of the professional thinker in the United States. What are the duties and limitations of a democratic intellectual? If class and caste do not confer status, then what are the sources of authority for the intellectual in a democracy? How could the worlds of higher education and mass opinion be bridged without compromising the former and alienating the latter? The term intellectual invested them with a sense of responsibility to help their fellow Americans accept a modernizing world of social change and dissonance while finding new grounds to negotiate their differences and extending that spirit of democratic negotiation to the wider world.

The World’s Columbian Exposition: A Festival of Ideas

The 1893 World’s Columbian Exposition held in Chicago commemorated the four hundredth anniversary of Columbus’s arrival in the New World. Its organizers wanted to pay tribute to America’s imperialistic origins while also announcing the nation’s grand entrance onto the world stage as a global economic power. The fair showcased sixty-five thousand international artifacts and inventions while also celebrating American technological prowess. Visitors could marvel at a “moveable sidewalk”; a dishwashing machine; all the ingredients for pancakes in one box and a friendly new character, “Aunt Jemima,” to advertise them; and the world’s first Ferris wheel, a behemoth of iron and steel, to rival the Eiffel Tower built for the Paris Exposition of 1889. The noted Chicago architect Daniel Burnham served as the fair’s project director and hired Frederick Law Olmsted to work his magic as the fair’s landscape designer. With its “White City” of gleaming neoclassical buildings, “marble” facades and columns (made of white paint over plaster of Paris, glue, and hemp), decorative fountains and sculptures, and carefully groomed and maintained grounds, the fair celebrated, above all, Victorian values of order and refinement. The animating belief of the fair’s organizers and visitors was that material progress and moral progress went hand in hand.

The fair was a celebration of technological wonders and commercial delights, but it was also a festival of ideas, displaying the intellectual commitments and preoccupations of the period. Some of the exhibits were more retrospective than prospective, such as the ethnographic displays built on the evolutionary logic of Victorian-era anthropology, then a budding field of the social sciences. Visitors could walk along the Midway Plaisance and see how different ethnic, racial, and national groups (in the form of live human displays) were organized from the most “primitive” to the most “advanced,” thus providing a scientific apology for civilizing, imperial missions abroad and virulent ethnocentrism and racism at home.

But a number of the events showed a self-consciously “progressive” side of American thought, providing a window onto new intellectual worlds emerging at the time. The World’s Parliament of Religions, a congress held at the fair, became the subject of great fascination and curiosity in the press. It demonstrated the emergence of a cosmopolitan sensibility and an appreciation of religious diversity by introducing fair visitors to the different worldviews and spiritual practices of people from all corners of the globe. Speakers included Buddhist monks from Japan and Sri Lanka, a Zoroastrian priest from India, an archbishop from the Greek Orthodox Church, a “Yankee Mohammadan” (i.e., an American convert to Islam), a bishop of the African Methodist Episcopal Church, a rabbi, and more. Papers on “The Essential Oneness of Ethical Ideas among All Men,” “Points of Contact between Christianity and Mohammedanism,” and “Religious Duty to the Negro” reflected the shared desire among participants to find universals among all religions, and to ensure that those religious universals were brought to bear on a modernizing world. As one Unitarian contributor put it: “One is born a Pagan, another a Jew, a third a Mussulman. The true philosopher sees in each a fellow-seeker after God.”2

The organizers hoped that the Parliament of Religions would be more than a “one and done” gesture of religious ecumenism and solidarity, and they got what they wanted in the form of interfaith relationships that would flower in years to come. It was at the conference that a twenty-seven-year-old student by the name of D. T. Suzuki, who accompanied his Zen Buddhist mentor as an interpreter and secretary, first met Paul Carus, a German American philosopher and publisher, who would be instrumental in exposing Suzuki to Western philosophy and the American religious landscape. A few years after the Parliament, Suzuki moved into Carus’s residence in LaSalle, Illinois, for what turned out to be an eleven-year stay, during which he worked as a translator and household helper. Carus introduced him to William James, whose Varieties of Religious Experience (1902) helped Suzuki formulate his claim years later that experience—not scripture, theology, or ritual—was the cardinal feature of Buddhism. And he exposed Suzuki to the very religious pluralism in America that James had written about. Suzuki then spent the next three decades in Japan, but he came back to the United States at midcentury, where those eleven years of learning the landscape of American culture proved their worth: he returned as the most prominent ambassador of Zen Buddhism in the West, influencing figures as diverse as Martin Heidegger, Carl Gustav Jung, Alan Watts, Karen Horney, Erich Fromm, and the Beat poets, among so many others.


Figure 5.1 One of the most highly anticipated events of the World’s Columbian Exposition in Chicago, the World’s Parliament of Religions of September 1893 provided a forum for international religious leaders to share their faith traditions with other clergy and fairgoers. With more than forty religious traditions represented and a daily attendance of several thousand participants, the parliament helped formally inaugurate modern interfaith dialogue in the United States. Presbyterian Historical Society, Presbyterian Church, Philadelphia

The Chicago Columbian Exposition also provided a platform for the latest in modern historical research. At a meeting of the American Historical Association held in conjunction with the fair, Frederick Jackson Turner, a thirty-one-year-old history professor from the University of Wisconsin, delivered his seminal lecture, “The Significance of the Frontier in American History” (1893), which would transform historical scholarship, as well as ideas about American culture and character. Turner wrote this piece in response to the 1890 federal census, which reported that western settlement had spread so widely that a continuous frontier line could no longer be drawn; the frontier, in effect, had closed. For Turner, this demanded a reckoning with the meaning of the frontier for American history.

Turner asserted that life along the frontier created antipathy toward any organizing efforts or assertions of power by an external, centralized authority. “The result is that to the frontier the American intellect owes its striking characteristics. That . . . practical, inventive turn of mind, quick to find expedients; that masterful grasp of material things, . . . ; that restless, nervous energy; that dominant individualism, working for good and for evil, and withal that buoyancy and exuberance which comes with freedom—these are traits of the frontier, or traits called out elsewhere because of the existence of the frontier.”3 This restless, individualistic impulse set the terms for the nascent forms of modern democracy in the United States and shaped a particular mindset and personality among the people.

Though often remembered as a piece of triumphalist Americana, Turner’s frontier thesis reflected his deep ambivalence about the fate of democracy with the closing of the frontier, and the fate of an America that prized practical knowledge over speculative and introspective thought. And though often considered a document about American democracy, it was also very much a meditation on the American mind. For Turner, the sheer abundance of land helped create an American way of thinking that was fundamentally individualistic, restless, practical, and impious toward traditions of the past. There is much to suggest that these intellectual habits were good for survival on the frontier in the nineteenth century, but little to reassure his audience at the World’s Fair that these were auspicious ways of thinking for progressive nation-building at the dawn of the twentieth century.

Pragmatism: A New Theory of Knowledge and a New Idea of “Truth”

Had Turner been speaking in the new language of turn-of-the-century American philosophy rather than history, he might have used another word to describe the intellectual processes and temperament he was after: pragmatism. Alongside Transcendentalism, pragmatism became the most important and influential philosophical tradition ever produced in America. Indeed, pragmatism can be seen as the philosophical heir of the Transcendentalists’ antiestablishment impulse, with its recognition of the artificiality and potential tyranny of intellectual conventions. But pragmatism took this logic two steps further, developing both a more rigorous epistemology and a more rigorous methodology. It focused on the products of mental activity, as well as on the processes by which they are created. Pragmatism abandoned the search for universal, timeless truth and emphasized instead that a proposition is true if the practical consequences it implies or predicts do in fact follow in experience. A philosophy that welcomed the dynamism of truth, pragmatism reflected the vibrant, contested, and democratic society from which it came, while seeking to advance the better angels of its nature.

It is never easy to pinpoint with certainty all of the intellectual factors in play when a dramatically new way of viewing the world comes into existence. But in the case of pragmatism, Darwin’s account of evolution, and the revolution it wrought in all areas of late nineteenth-century thought, played a crucial role. Darwin had offered a vision of the natural world as ever-changing and argued that randomness and chance, but also utility, are the forces that make for its dynamism. He did not just theorize his view of evolution; he also provided evidence to support his claims. This impulse to lay bare the dynamic workings of the world by use of scientific testing and evidence would go by many names in the late nineteenth-century world: scientism, naturalism, scientific naturalism, positivism, and, of course, Darwinism. But central to all of them was the conviction that in an unstable natural world, the range of inquiry must be limited to truth claims that can be empirically verified.

John Dewey best summed this up in “The Influence of Darwinism on Philosophy” (1909). He held that Darwin’s impact was to make “the principle of transition” the basis of all inquiry—not just biology, botany, and zoology, but epistemology and ethics as well. Darwinism jettisoned “absolute origins and absolute finalities” (as well as a prioris and a telos) from the theory of knowledge, demanding that modern inquirers “explore specific values and the specific conditions that generate them.” Dewey recognized that abandoning all preformulated ideas about where scientific discoveries, claims to truth, and values should lead was a daunting challenge. But he preferred to advocate Darwinism as “philosophy that humbles” inquirers to actually test how their claims “work out in practice.” “In having modesty forced upon it,” Dewey averred, “philosophy also acquires responsibility.”4

While John Dewey was still a high school student in Burlington, Vermont, in 1872, a group of Harvard researchers was beginning to feel those Darwinian influences at work on their ideas. The group, which called itself the Metaphysical Club, was organized by the logician Charles Sanders Peirce, and included the lawyer and future Supreme Court justice Oliver Wendell Holmes Jr., future Harvard professor of psychology and philosophy William James, and Chauncey Wright, a lecturer at Harvard whom the younger men referred to as their intellectual “boxing master.”5 Together they accepted the Darwinian account of the universe as one marked by contingencies and uncertainty and thus were determined to rethink ethics, truth, and meaning accordingly. They shared no strict doctrine on the meaning of truth, but rather a common conception of it as a tool human beings use as they make their way in the world. As a result, they came to regard truth claims and beliefs as nothing more than propositions that needed to be tested. Determined not to let any religious or scientific explanatory schemes sneak through on their credentials, the pragmatists insisted that all ideas must be certified by experience to be classified as true. James referred to this new philosophy as “pragmatism” (while Peirce preferred “pragmaticism” to distinguish his theory of truth from James’s), which looked at truth by way of consequences, not origins, and by its practical results, not theory. The pragmatist philosophers believed that notions of mind and morals could no longer be based on timeless foundations, because, as they learned from Darwin, no such things existed.

William James, the first public face of pragmatism, also made a name for himself as one of the founders of the field of American psychology with his Principles of Psychology (1890). He brought his insights into human psychology to his work as a philosopher, which helped him to see that an individual’s temperament, not just mind, played an important role in the making of her or his philosophical commitments. In his 1896 essay “The Will to Believe,” James referred to this as one’s “passional nature,” which steps in to make decisions when the individual faces a choice between positions that the available evidence cannot resolve.6 He saw how science and religion had become warring ideals that were particularly ferocious when their battleground moved from colleges and seminaries into the innermost reaches of an individual’s conscience. Yet he believed that at the outermost reaches of both explanatory schemes awaited the promise of innovative new directions for modern research as well as creative possibilities for a meaningful life. Radically pluralist in his ethics and his epistemology, he stressed that there was no single account of the universe, only notions of truth that proved useful to the believer. James thus moved the study of religion away from dogmatics and focused instead on what gave his magisterial work of 1902 its title: “the varieties of religious experience.”

James worked out his philosophy while teaching at Harvard and speaking on the American and European lecture circuit, and it was from a series of lectures that he produced his major work of philosophy, Pragmatism: A New Name for Some Old Ways of Thinking (1907). In it, he set out the dual role of pragmatism as both a method for coming to truth and a theory of truth. It was as hard to stick to as it was simple in conception: one is to get rid of abstractions, fixed principles, closed systems, dogmas, and foundations, replacing them with a testing of specific, concrete claims. It was to move from making philosophical assertions to examining how well they line up with actual human beings’ actual experiences. James thus advocated a method to work out the truth “ambulando, and not by any a priori definition.”7

His theory of truth was really just a logical extension of this methodology. If one cannot say what truth is from the outset but must wait to see what proves itself to be true, then truth is just that: an idea that proves itself to be true. The result was a notion of truth that is contingent, perspectival, pluralist, and dynamic, so that truth changes from context to context, person to person, as well as over time. As James put it: “The trail of the human serpent is thus over everything. Truth independent; truth that we find merely; truth no longer malleable to human need; truth incorrigible, in a word; such truth exists indeed superabundantly . . . but then it means only the dead heart of the living tree . . . and may grow stiff with years of veteran service and petrified in men’s regard by sheer antiquity.” James regarded all truths to be “plastic”: they are never absolute, but particular; they are not transcendent, but immanent in the daily workings of the world.8

From this the moral imperatives of pragmatism were clear: no one person, nation, religion, or scientific theory had a lock on truth. “Hands off,” James challenged moderns in 1899, “neither the whole of truth nor the whole of good is revealed to any single observer . . . even prisons and sick-rooms have their special revelations.” Pragmatist thought thus recommended to moderns an ethics so simple and straightforward and yet so difficult: one’s truth can be one’s truth “without presuming to regulate the rest of the vast field.”9

Notwithstanding James’s extraordinary philosophical range, it was John Dewey, the youngest of the first-generation pragmatists, whose ideas had the widest influence on twentieth-century American intellectual life. The sheer compass of Dewey’s pragmatism (or what he called “instrumentalism”) was extraordinary, addressing issues of logic, psychology, epistemology, moral philosophy, and aesthetics, as well as curriculum, educational policy, and social theory. He joined the newly founded University of Chicago in 1894, when the city was the scene of the dramatic social and economic dislocations of an uneven modernization. There he brought his theory of instrumental knowledge to bear on his desire for social reform through education by founding the Laboratory School in 1896. This experience helped launch his career-long effort to test his own philosophy in the everyday world of politics, art, and public life. His 1929 work The Quest for Certainty offered a vigorous challenge to epistemological and ethical foundationalism, advancing the proposition that truths are no more, but also no less, than experimental efforts to enable human beings to purposefully negotiate a variable universe. What more useful proposition could there be for Americans to negotiate their own pluralistic, indeterminate, variable America? Dewey thus lived his own pragmatic gospel by encouraging modern intellectuals to dispense with the problems of philosophy and address themselves instead to the “problems of men.”10

From Pragmatism to Progressivism

While James and Dewey recognized that pragmatism was a product of Darwinian thought, they also believed it could be used to conquer social problems underwritten by its offshoot—a dominant strand of social Darwinism, which accepted those problems as the unalterable facts of existence. Darwin considered the British philosopher Herbert Spencer’s “social Darwinism” anathema and distanced himself from Spencer’s advocacy of the “survival of the fittest” (a phrase Spencer invented and used to justify his laissez-faire economic and political ideas). Observers from all corners of American society could see that industrialization was a messy affair rife with bitter labor conflicts, extreme disparities in access and quality of education for children, poverty at one end of the economic spectrum and huge concentrations of wealth at the other, and cities polluted by unchecked industrial growth. But following Spencer’s assurances, many people came to believe that the process of a modernizing America, tooth and claw as it seemed, was simply following the iron law of social development. They believed that the hardships, smut, and suffering were prices worth paying for progress.

The birth pangs of modernization at the turn of the century drew the attention of increasing numbers of educated, progressive, middle-class critics, who rejected the tenets of laissez-faire and survival of the fittest, as well as the nonexistent or ineffectual governmental oversight they produced. Highly educated in the most up-to-date academic research, progressives sought to harness scientific principles and practices of experimentation, organization, and efficiency for the moral betterment of society. Largely reformist, not radical in temperament, they did not seek to reject capitalism and industrial democracy but rather to root out their excesses and weaknesses. The vast majority of progressive reformers had grown up in racially and religiously homogenous small towns, and they longed for that sense of community and belonging in the bustling, anonymous modern city. Through their writings and their activism, they posed a counterforce to Gilded Age immoderation and fragmentation and provided a new narrative for modernization that could knit together immigrant groups and native-born Americans, the haves and have-nots, a thriving industry and a robust social democracy. An emerging reform-minded journalist and former student of William James at Harvard, Walter Lippmann, wrote the landmark text of progressivism, Drift and Mastery: An Attempt to Diagnose the Current Unrest, in 1914. Showing the close relation between pragmatist impulses and progressive desires, Lippmann stressed, “We can no longer treat life as something that has trickled down to us. We have to deal with it deliberately, devise its social organization, alter its tools, formulate its method, educate and control it.”11

Though the reach of progressive reforms extended to rural communities, America’s industrializing cities became the main focus of reformers’ energies as they turned university campuses, civic institutions, and urban streets into laboratories for social improvement. A most vivid embodiment of pragmatist thought and urban progressive action was one of America’s foremost social reformers, Jane Addams, a pioneering social worker, activist, feminist, and pacifist. In 1889, together with her friend Ellen Gates Starr, she founded Hull House in Chicago, America’s first social settlement. Hull House primarily drew educated, native-born, middle-class women, who came to live and work with Chicago’s poor and immigrant communities. It offered a range of services, including day care for children, a library, employment assistance, classes in English and citizenship, and job training. Hull House reformers thus took seriously their work as urban scientists and the settlement house as a social laboratory.

Much like the Social Gospel and Christian Socialism of religious liberals and radicals of the same period, Addams maintained that progressive interventions would refashion economic, political, and public life to be more equitable, transparent, and just, while also transforming the individuals who worked within them. But Addams’s work was not linked with a church (although she did view it as an expression of what she called the “renaissance of the early Christian humanitarianism”). Nor was it charity or philanthropy, both of which are unilateral acts of assistance, not reciprocal ones, and therefore underscore social and economic imbalances rather than remedy them. She described the impulse as a “coöperative ideal” of mutual assistance, a form of exchange more in line with a democracy of equals. If she sought a collaboration between America’s haves and have-nots, it was only to foster the conditions that could help end this distinction. The path to social improvement was not simply being nice; it was to be a rigorous social scientist analyzing, testing, and implementing strategies that work. “We must learn to trust our democracy, giant-like and threatening as it may appear in its uncouth strength and untried applications,” she wrote.12 Both a good friend of and influence on John Dewey, Addams demonstrated how progressive reforms were an example of pragmatic testing of democratic ideals.

Progressive intellectuals were never content merely to observe social problems at a distance. Even those based at the university made regular ventures into city centers and rural communities to conduct fieldwork and gather data. Some carved out spaces for their work by founding independent “little magazines” and journals of opinion. Others frequented public libraries and labor halls by day and participated in makeshift “salons” and couch surfed in friends’ apartments at night, as they scraped together a living from their writing. But as the proliferation of the extraordinary progressive social criticism of the period demonstrates, all proved to be invaluable sites from which to study the vibrancy of American life and to consider ways to close the gap between democratic theory and social practice.

The Politics of Cultural Pluralism

The untapped possibilities and the perils of American pluralism preoccupied some of the most influential early twentieth-century progressive intellectuals. It was at this time that many white, Protestant, propertied elites disparaged the influx of “unwashed” masses from Southern and Eastern Europe, and satirical magazines such as Puck and The Wasp circulated images of Irish laborers as apes and Chinese laborers as hordes of locusts. Somehow they had forgotten that once upon a time, their people were immigrants to America, too.

Opposing racial and ethnic chauvinism to embrace the “melting pot” (a phrase introduced to Americans in 1908 by Israel Zangwill’s play of that name) would not be easy. It required persuasive arguments and evidence to challenge the social Darwinist and hierarchical thinking supporting the mainstream prejudices of the day. This was especially daunting as some of the most progressive American thinkers themselves harbored such chauvinisms. The pioneer sociologist and progressive advocate of workers’ rights Edward A. Ross was also a flagrant racist who in the most prestigious scientific journals of the day warned against immigration for fear of Anglo-Saxon “race suicide” (a term he coined in 1901).13 It is no wonder that the philosopher and sociologist W. E. B. Du Bois, coming of age when Ross’s ideas were at the height of both academic and popular fashion, maintained that “the problem of the Twentieth Century is the problem of the color-line,” while lamenting the “double-consciousness” of the African American forced to “[look] at one’s self through the eyes of others, of measuring one’s soul by the tape of a world that looks on in amused contempt and pity.”14

The German-born American anthropologist Franz Boas emerged as a crucial force working to undermine the authority of scientific racism. While still in Germany, he had written his dissertation using psychology and physics to understand the perception of the color of water. This may not announce itself as a particularly momentous topic, but it turns out that for Boas, and for the field of anthropology he later entered, it was. In his study, he showed that the perception of water color is dependent on the viewer’s standpoint, assumptions, and experience. In other words, color perception was context dependent, even learned, and was not innate in the viewer or in the properties of water. As he moved into the emerging field of anthropology, then a speculative discipline among its Victorian practitioners, Boas drew on these insights from his training as a physicist and approached human cultures as similarly dynamic and situational. Rather than accept anthropology as a theoretical enterprise producing generalizations about a universal “culture,” Boas turned to a group-specific, site-specific ethnography of “cultures” in an effort to make the “science of humans” a truly empirical enterprise. “Civilization,” Boas stressed repeatedly over his long career, “is not something absolute, but . . . it is relative.”15

While university-sponsored research could help challenge racial prejudice, for a black woman like Ida B. Wells, born into slavery and with limited access to formal schooling, the strategies for the progressive struggle against racism looked quite different. Her medium was not scholarly books on ethnography but rather a pamphlet on the perversion of lynching, A Red Record: Tabulated Statistics and Alleged Causes of Lynching in the United States, 1892-1893-1894 (1895). But like Boas, Wells turned to evidence rather than theories, and examples rather than generalizations, to debunk racist arguments. Hers was the first study to use statistical evidence to demonstrate how lynching had become a race-specific form of vigilantism. After the Civil War, the number of whites lynched fell precipitously, while the number of African Americans lynched increased dramatically, with an extraordinary 1,111 murdered by hanging between 1882 and 1894. Her choice to publish her material in pamphlet form was determined not by a lack of resources to publish it as a book, but by the fact that pamphlets could be produced cheaply and distributed on a wide scale. To change public opinion, she had to first create her public, which needed to include whites and blacks, people of means and those without.

As Boas’s anthropology and Wells’s statistical analyses show, the progressive agenda—to root out white Americans’ squeamishness about diversity and to recognize diversity instead as the very feature that recommends American democracy to the world—came in a variety of forms. Understanding that the fight for diversity needed to open imaginations and encourage thoughtful reflection, the cultural critic Randolph Bourne stepped up to the challenge by crafting some of the most lyrical essays—and powerful arguments—of the progressive movement. Before his untimely death at the age of thirty-two in 1918, Bourne was widely celebrated as the spokesman and prophet of a new generation of young intellectuals. He had grown up despising the austere Presbyterianism and stifling Victorianism of his New Jersey middle-class upbringing. When he arrived in New York City to study at Columbia University in 1909, he felt that for the first time in his life, he “breathe[d] a larger air.”16 Within a few short years, the essays that flowed from his pen covered topics ranging from youth culture and friendship to university education and national and international politics. But it was his 1911 essay, “The Handicapped—By One of Them,” that reveals Bourne steadying himself to take on the challenge of creating a more inclusive American culture, one less fearful of difference, whether cultural, racial, or physical. With a face misshapen by a botched forceps delivery, and a back hunched and stunted by spinal tuberculosis contracted at age four, Bourne, an outsider himself, had some sense of what it must feel like to be a new immigrant or black in America: to be always, invariably, “discounted at the start.”17

During World War I, Bourne turned his sights to the belligerent nationalism and cramped nativism coursing through American society. With “Trans-National America” (1916), he skewered the provincialism of Anglo-Americans, unwilling or unable to recognize not just the moral bankruptcy but also the intellectual slackness of their arguments for immigration restrictions, Americanization campaigns, and Jim Crow. Even the “melting-pot” ideal, Bourne argued, was an outdated form of “forced chauvinism,” for it took Anglo-Saxon culture as the measure against which all others needed to conform. “We are all foreign-born or the descendants of foreign-born, and if distinctions are to be made between us they should rightly be on some other ground than indigenousness.” Bourne reminded his readers that “the early colonists did not come to be assimilated in an American melting-pot. They did not come to adopt the culture of the American Indian. . . . They came to get freedom to live as they wanted to.” Already in 1916 he called for a “dual citizenship” not as an exception but as a basic fact of American identity. He saw this as preparation for a cosmopolitan “international citizenship,” the likes of which would have helped Europe and the United States to avoid the calamity of an international war. Rediscovered as an inspirational, if romantic, blueprint for American multiculturalism in the second half of the century, Bourne’s “Trans-National America” was, in fact, a study in hard-headed realism. “Let us face realistically the America we have around us. Let us work with the forces that are at work. Let us make something of this trans-national spirit instead of outlawing it. Already we are living this cosmopolitan America.”18

Randolph Bourne died a month after the armistice, a victim of the global influenza pandemic of 1918–19. He did not live long enough to see America come down from its war hysteria, but he did live long enough to be devastated by its fury. He had been one of the very few progressive intellectuals who thought that the United States entering the war—even a “great” one—was a gross abandonment of pragmatist practices and progressive ideals. John Dewey, arguably the most prominent and influential intellectual of the period, thought otherwise. And where Dewey went with his support for American entry, so went a nation that just six months earlier had voted President Wilson back into office as a reward for keeping the country out of war.

During 1917, Bourne produced a series of fierce and stinging antiwar articles, among them “War and the Intellectuals” and “The Collapse of American Strategy,” and in “Twilight of Idols” he challenged the wisdom of his former Columbia professor and erstwhile intellectual hero Dewey, along with Dewey’s prowar progressive followers. Bourne indicted Dewey for what he believed was his former mentor’s wielding of pragmatist instrumentalism merely instrumentally, using it to concede power to the is of massive global conflict rather than to advance the ought of fostering global peace and more democratic values at home.

Bourne confessed to feeling “left in the lurch” by a philosophy that had so inspired other Americans to think themselves out of the narrow confines of inherited conventions and into a way of being in America—and the world—that appreciated difference while fostering understanding. It was indeed pragmatism that just a year earlier had enabled him to dream of a pluralistic America that embraced its transnationality. The Bourne of 1917 was in no way proposing to scrap pragmatism, but to try for a pragmatism that recognizes that “vision must constantly outshoot technique.”19

The response from fellow progressives, almost all of whom threw their support behind Wilson, was swift and punishing. Bourne’s colleagues at the New Republic stopped publishing his pieces; the Seven Arts, the little magazine he had helped found, folded because one of its main financial backers was outraged by his antiwar stance; and The Dial, where he had served as editor, kicked him off the masthead. Only one thing rivaled the ferocity of his fellow progressives’ response: the coruscating passion, acumen, and beauty of Bourne’s vision of a pluralistic, tolerant, and peaceful America, a vision whose arguments against war remain as relevant today as they were fearsome in 1917.