9

THE COLLEGIATE REVOLUTION

THE ACADEMIC TRANSFORMATION THAT PRODUCED American universities was accompanied by a parallel transformation of the undergraduate experience. Beyond the triumph of electives and academic disciplines in the classroom, student life outside of class assumed a richness and intensity that imparted new significance to the meaning of college. Any connection between these two developments would appear to be circumstantial. Campuses that clung to the old regime—Dartmouth, Princeton, Yale—in fact were the leaders of the collegiate revolution. Although the timing of the two revolutions was remarkably similar, the transformation of collegiate life had a dynamic all its own. The trend toward greater student autonomy and activities that was evident before the Civil War strengthened markedly in the 1870s. Just as the next decade was the tipping point for the academic revolution, it was the same for collegiate life.1 The years around 1890 marked an indelible change, most noticeable in the burgeoning popularity of intercollegiate football, but encompassing all aspects of college life. Most remarkable was the rapid spread of the chief forms of collegiate activities geographically and institutionally. After 1900, idealization of the collegiate experience was further enhanced—cherished by graduates and celebrated in the middle-class media. Research had become an end in itself at a handful of universities; now collegiate life was for many an end in itself. Whereas research and PhDs were the hallmarks of the academic revolution, football and fraternities were hallmarks of the collegiate revolution.

THE HIGH COLLEGIATE ERA

A generation separated Lyman Bagg (Yale 1869), introduced in chapter 6 as the author of Four Years at Yale, from Henry Seidel Canby (Yale 1899), whose recollections in Alma Mater idealize undergraduate life at the zenith of the high collegiate era. On no campus was college life pursued with more intensity and ostensible joy than at the Yale that each author described, a distinction recognized by contemporaries. A curious Harvard instructor, George Santayana, paid a visit in 1892 to investigate. “Nothing could be more American,” he concluded, than the Yale Spirit, by which “young men are trained and made eager for the keen struggles of American life.” Compared to Harvard, Yale exhibited “more unity, more energy, and greater fitness to our present condition.”2 These qualities were not unique to Yale, of course, but they were concentrated there by insularity, competition, and a pervasive moral code. Canby described the college as “a state within a state…. [A] student body, aware really of only themselves, their own life, their own ideals.” The mainspring of this spirit was competition for achievement, for success and recognition outside the classroom. Seemingly, every student internalized an expectation to “do something for Yale” and to be judged by those contributions, whether “heeling” for the Yale Daily News, volunteering in the YMCA, joining athletic teams, or any of the other myriad organizations that students formed and ran: “the immediate goal was to be regarded as a success by your friends”; the ultimate goal was election to one of the three senior secret societies. A belief in “Yale democracy” was an article of faith—that success resulted from personal effort and character, not wealth or family status. A second article of faith was that education occurred outside the classroom. Canby described Yale’s retrograde curriculum as “a rotted house about to fall in and in parts already fallen” and any ideas or intellectual stirrings as “mere byproducts.” It was collegiate experience that achieved “a moulding [sic] of character and intellect, and a complete shaping of behavior.” And this formation, all believed, was the best preparation for the competitive struggles of American life.3

Several factors helped make Yale’s unity and moral code so powerful. Yale was the most national of colleges in recruitment of students, but geographical diversity produced social homogeneity. Like Canby, whose father was a banker, most students came from well-to-do families and had attended private prep schools. Students from humble backgrounds who became big men on campus, like baseball prodigy Amos Alonzo Stagg, were cited as proof of Yale democracy, but backgrounds like Canby’s were almost de rigueur for possessing the social skills and wardrobe needed to fit in. In Canby’s time each class of 300+ competed for 51 positions in sophomore societies, 120 in junior fraternities, and 45 in the three senior societies. The pressure to achieve and impress was thus continual for those who aspired to these honors. Other colleges had their own forms of distinction. Harvard was dominated socially by Boston Brahmins, and its hierarchy of clubs reflected both social status and merit; members of Princeton’s exclusive eating clubs were chosen largely on social criteria, sometimes before matriculation.4 At most colleges, social standing was refracted through the fraternity system, and such arrangements encouraged tribal competitions instead of Yale’s individual strivings. Colleges varied as well in the proportion of students who identified with the collegiate ethos.5 Here Yale’s structure was unique. During Canby’s time, Yale College comprised just under half of enrollments in Yale University. Even in the college, Canby dismissed some students as “socially impossible,” others as “grinds.” The professional schools were “attended by hard-working meager creatures with the fun drained out of them,” many of whom were not college graduates.6 And the nearly 600 students in the Sheffield Scientific School were literally beyond the pale. They may have enjoyed their college experiences too, despite having to learn math and science, but not in the company of Yale College students.7 Taken together, Yale University may have had a proportion of ardent collegians similar to that at the universities of Michigan or California.

Still, Yale College exaggerated characteristic features of the high collegiate era, for good and for ill. A less sentimental nonalum provided some perspective. Edwin Slosson noted in 1909 that the first requirement for success at Yale was conformity—fulfilling the local definition of a gentleman, being clubbable, and piously observing student traditions and customs. Conformity also extended to opinions; heterodox views were entertained at one’s peril. Combined with the dismissive attitude toward classes, the result was decidedly anti-intellectual. Honors students were underrepresented by almost 50 percent in the senior societies. Slosson reported that senior society members were twice as likely as other graduates to achieve prominence after graduation—but the same was true of honors students. As sons of wealth and privilege increasingly patronized the college, social discrimination increased as well. This was most evident in attitudes toward Jewish students. In the 1880s it could be reported that there were no ill feelings toward Jewish students, but in the next decade anti-Semitism began to be evident. By 1911, the newest senior society unanimously voted “That Jews should be denied recognition at Yale.”8 Such attitudes reflected a national trend in Yale’s social base and apparently did not affect classroom treatment. However, they signaled the hardening of social lines at colleges and universities, especially after 1900. This also distinguished the high collegiate era from the late collegiate era.

The transformation of college life from the 1870s to the 1890s was a reciprocal process. On one side, colleges gradually relaxed the rules and discipline governing student behavior, though in some places this was a protracted and contested affair. On the other side, once allowed greater freedom, student activities and customs spontaneously proliferated. Colleges approved of some of this new behavior and opposed other aspects but over time became less able to affect either. Once students broke free from the anachronistic disciplinary regime, they found themselves in a virtual vacuum of authority, where their organizations needed to conform to few rules. Only in the twentieth century would universities develop administrative offices to reassert some measure of control.

The old regime sought to preserve the discipline of recitations and the piety of compulsory chapel. It persisted into the postbellum years at virtually every college but with a crumbling foundation. Where church ties remained strong, so did the will to uphold the old regime. But colleges drew their students and financial support not from organized churches but from the lay membership that included successful urban alumni. Student piety on campus found its outlet not in compulsory church services but in student chapters of the Young Men’s Christian Association. Faculty had long chafed under the burdens of enforcing student discipline, but postbellum professionals largely avoided this task. Moreover, lectures and the elective system lessened the need for recitation discipline. Presidents, of course, were the ultimate authority, and the colleges long remained small enough for them to exercise this oversight. But growth forced university presidents to delegate. Charles Eliot appointed a dean for students in 1890; in 1901 the University of Illinois created the first official dean of undergraduates, later dean of men. These officials, though charged with enforcing rules and curbing excess, were generally warm supporters of collegiate activities. Finally, colleges and universities had only tenuous control over students because most no longer lived on campus. Henry Tappan initiated this change when he turned Michigan students out of university buildings to find lodgings in the town. In fact, most nineteenth-century colleges were too poor to build dormitories, and growing universities had other priorities. Students living in rooming houses (sanctioned or not) and procuring their own meals had considerable freedom in how they used their time.9

Organizing and choosing their own activities, students wedged open these cracks in college authority. College glee clubs were first established for the pleasure of their members but soon serenaded nearby towns as well as the campus. By the early 1880s, glee clubs regularly embarked on lengthy tours. Belonging to the glee club thus offered the pleasures of singing and camaraderie but also the excitement of such excursions.10 Theater was long forbidden to collegians by the descendants of the Puritans, but such restrictions were overcome in the 1880s. The first plays may have been serious renditions of Sophocles, but Gilbert and Sullivan quickly followed. College journalism had a longer history, but its forms began to multiply as literary journals, yearbooks, newspapers, and humor magazines appeared. Newspapers originally tended to follow an official line, supporting college policies, scholarship, and piety. However, by the 1880s they largely assumed a purely collegiate focus, preoccupied with the full range of student activities but especially promoting athletic chauvinism. The proliferation of such activities had some benign effects for colleges. Students’ preoccupation with their organizations may have distracted them from pranks and other forms of mayhem. Appeals to avoid sullying the honor of the college or a student’s particular organization were often effective. Class loyalties still remained powerful, especially in the East, and class rituals represented an entire category of activities. But collegewide organizations tended to break down the isolation of the classes. On some campuses, various forms of “rushes,” in which freshmen and sophomores fought bloody battles for possession of some symbolic token, were abandoned or suppressed. At Dartmouth, for example, the last “cane rush” was held in 1883.11 New students came to be perceived as potential members of fraternities, the YMCA, or athletic teams. This was only one way in which these organizations placed a distinctive stamp on the high collegiate era.

Organizations of religious students existed at many midcentury colleges, but in 1877 the YMCA organized some forty of them into chapters of a college division. The intercollegiate Y grew like wildfire: by 1900, 559 chapters claimed 32,000 members, 31 percent of all male college students. Members had to belong to an evangelical church, but associate members could also be voted in. The Y found members in all regions and types of institutions. More than half of the men at historically black colleges belonged.12 The Christian leadership of the Y sought to address a dual college problem: first, too many college students—the future leaders of the country—were not observant Christians; second, college men were surrounded by constant temptation. To address the first, the college Y was consciously evangelical, sponsoring periodic revivals and weekly prayer and Bible-study meetings. For the second, Ys competed with less wholesome campus gatherings by sponsoring their own activities. Dedicated social centers were erected at Princeton (1879), Yale (1887), and Cornell (1890). By 1915 thirty-six Y chapters had their own campus buildings.

The college Ys enhanced their presence by providing a multitude of student services, beginning with newly arriving freshmen. The Y chapters printed handbooks, financed by local advertisements, that gave students information on college history, customs, organizations, churches, directories, and much else. They also maintained housing and employment bureaus. Y members would meet arriving students at the railroad station, transport them to campus, and help them find lodging. A substantial portion of college students made use of these services to find rooms or part-time jobs. The vigor of the college Y, despite the secularization of the college curriculum, the declining influence of established churches, and student resistance to compulsory religious observance, represented a melding of traditional religiosity with the collegiate revolution. The Y, above all, sought to be fully engaged with school spirit and campus life but to bolster moral and religious values at the same time. This was a potent combination that buoyed membership. The religious influence of the Y peaked at campuses like Illinois before 1910, although membership grew nationally until circa 1920.13

★ ★ ★

Fraternities were in many ways emblematic of the high collegiate era, but this period was actually one phase in their long evolution. The early fraternities described in chapter 6 were relatively small and tended to congregate in rented rooms. They changed dramatically beginning in the 1870s as they acquired their own residential chapter houses. After 1890, some fraternities entered a more opulent age, characterized by impressive, custom-built houses and larger memberships. To their critics, fraternities were inherently unlikable for their secrecy, exclusiveness, snobbery, hedonism, and disregard of academics. They nonetheless became campus fixtures for the ensuing decades, especially at men’s colleges, where membership sometimes reached 90 percent. The large role of fraternities (and their mirror images, sororities) affected four different facets of college life.

First, secret Greek-letter fraternities were a unique cultural construct for dealing with basic needs for food, shelter, and companionship. Most colleges had abandoned the responsibility of feeding and housing some or all of their students. Fraternities often formed when students in boarding houses created a fraternal organization, with or without national affiliation. Colleges and universities without dormitories recognized the inevitability of the fraternity system and generally considered fraternities superior to the arrangements that students could make independently.

Second, a clear social bias determined which young men hived together. Students from wealthy, urban families sought each other’s company and reinforced their social mores. How this worked depended on the social composition of the student body. As wealthier students flooded campuses at the end of the century, some fraternities mimicked the style and pretensions of urban gentlemen’s clubs. At Cornell, where affluent students set the tone, such houses topped a steep prestige hierarchy of fraternities. At Wesleyan, with a more middle-class student body, a 90 percent rate of fraternity membership undoubtedly diluted elitism. As was the case at Yale, social selection became more rigid after 1900. Jewish students at the larger schools countered by establishing their own fraternities.

Third, fraternity men played a central role in supporting “school spirit” and the entire menu of collegiate activities. Students joined fraternities for entrée into these kinds of activities; fraternities, for their part, sought to dominate such endeavors as part of the ubiquitous competition for campus prestige.

Fourth, and most controversial, were the effects, positive and negative, that fraternities had on their members. On the negative side, fraternity men on average had inferior academic records; they were variously accused of monopolizing competitions for student offices; and their proclivities for drink and dissipation were notorious. Positively, membership brought immediate companionship and psychological support as well as enduring relationships, including useful networks; it clearly served the manifest purpose of socializing members into the refined culture of the haute bourgeoisie, and it was praised by students and college leaders as a fitting preparation for careers in the business world. Although the overall pattern was broadly similar, these factors played out differently across different types of institutions.

At the New England men’s colleges, fraternities dominated both social functions and quotidian life. Where nearly everyone belonged, a certain amount of differentiation existed among fraternities, even given the relative social homogeneity of students. Fraternities were integral to the college ethos, supported and often praised by presidents, who themselves were graduates and brothers. These arrangements amounted to student self-government, with the colleges exerting some leverage through the granting of recognition and campus privileges.14

Conditions differed in the western public universities. Where the social origins of students were more diverse, greater social stratification existed between Greeks and non-Greeks. Fraternities were more controversial under these conditions, with ongoing tensions between Greeks and “Barbarians” over student offices, newspaper positions, and the organization of social events. Wisconsin and Michigan were typical in offering no university housing for men until well into the interwar years, making fraternities attractive simply for room and board. Fraternities intentionally emphasized social distinctions but were not entirely exclusive. A faculty inquiry at Wisconsin reported that one-third of fraternity members were partly or wholly self-supporting. These growing universities were largely middle-class institutions in which Greek membership reflected social aspirations as well as social distinctions. Fraternity membership peaked in the 1920s, when about one-third of male students at western public universities belonged to fraternities, and perhaps 20 to 25 percent of women were in sororities.15

In Southern universities fraternities tended to develop more unevenly, often facing religious or political opposition. At the University of Virginia, where student housing was provided, almost half of students joined for purely social reasons by the end of the century, even though fraternities were overlain by more socially exclusive “ribbon societies” that largely dominated campus offices. After 1900, sumptuous chapter houses were erected, and a majority of students joined fraternities. At Duke, on the other hand, no chapter houses were permitted until 1930. In Mississippi, opposition to fraternal organizations as undemocratic led to their legal proscription until 1926, but students at Mississippi State invested military companies with exclusive selection and initiation rites, much like fraternities.16

Across the landscape of American colleges, conditions for Greek organizations varied enormously. Traditional church-related campuses were likely to remain opposed, and acceptance of fraternities sometimes became a part of the battle over modernization. At Indiana-Asbury, renamed DePauw University in 1884, three-quarters of the men belonged to fraternities by that date. Earlier, in 1870, women who had been turned down for membership organized what is recognized as the first women’s fraternity (as they were then called)—Kappa Alpha Theta.17 At Carleton College, petitions to form fraternities were denied by the faculty in the 1880s. Instead, campus literary societies long thrived. By 1923, the college had nine societies for men and seven for women, and they operated a great deal like … fraternities. They elected members, leaving many excluded, had elaborate initiations, and engaged more in social than literary activities. It could be difficult to avoid the drawbacks of fraternities even in their absence.18 Above all, fraternities exaggerated features that were inherent to the high collegiate era.

★ ★ ★

Athletics, in contrast, added a whole new dimension to the college experience. The antecedents lay in the early collegiate era, when eastern colleges established rowing clubs, an import from England, where the sport had first become popular. In 1852 a railroad seeking to promote the resort on Lake Winnipesaukee, New Hampshire, invited Yale and Harvard crews to stage a boat race. Thus the first intercollegiate athletic contest was a product of commercial sponsorship. But the experience soon inspired a succession of ad hoc races in the 1850s involving Yale, Harvard, and other schools. Over the next two decades, college athletics grew significantly but haphazardly, led by crew and baseball. The euphoric effects of winning—on classmates as much as athletes—inspired teams to improve skills, training, personnel, and strategies. In this process, the several class teams were superseded by a “university club,” thus rallying the support of the entire student body. Faculties were often alarmed by these activities, but the limitations they imposed slowed developments only slightly. A greater impediment was the confused state of competition. Only at the end of the 1870s did it become fairly clear who would play whom under what conditions. For crew, the most popular sport of the 1870s, multischool regattas lost their appeal when Harvard and Yale dropped out. They were principally interested in competing against each other and avoided contests that they could not dominate. In baseball, well-organized teams like Harvard’s played mostly professional teams in the 1870s. At the end of the decade, the American College Base Ball Association was formed to structure competition among Eastern schools. It lasted 7 years before Harvard, Yale, and Princeton broke away to compete against each other. Baseball was the most popular college sport through most of the 1880s, until it was eclipsed by football.19

Although the first intercollegiate football game was played between Princeton and Rutgers in 1869, the game developed only after a team from McGill University taught Harvard to play a rugby-style game in which players ran with the ball. Harvard and Yale first played this game in 1875, but it took a decade of rule adjustments before the basic game of American football stabilized in the mid-1880s. The person most responsible for devising common rules was Walter Camp, who had played for Yale from 1876 to 1882 and served thereafter as an unofficial coach until 1909. Under Camp’s tutelage, Yale dominated eastern football, winning 95 percent of its games over 34 years. Still, the games that really counted were against rivals Princeton and Harvard. The unique stature of the Big Three lent prestige and popularity to the game of football, and football in turn gilded the reputations of the three schools. Annual Thanksgiving Day games at the Polo Grounds in New York City between (usually) Yale and Princeton drew huge crowds of socially prominent spectators—23,000 in 1886 and soon more.20

After 1890 American football spread across the country. The University of Michigan, for example, had experimented earlier, playing the first “western” intercollegiate game against Racine College in 1878. Despite eastern trips to battle the Big Three in 1881 and 1883, athletics did not become formally organized there until 1891. As elsewhere, “missionaries” from Yale and Princeton helped to teach the basics of the game. In 1895, the major public universities of the region organized the Western Conference (later, the Big Nine), bringing a common set of rules and regularly scheduled games. William Rainey Harper may have drawn on his Yale experience in using football to speed the recognition of his new university. Under coach Amos Alonzo Stagg, Chicago became a powerhouse in the Western Conference. Only 2 years after the university opened, Stagg led the football team on a 6,200-mile trip to the West Coast, where Stanford and California had begun play in 1892. In 1902, Michigan played Stanford in the first Rose Bowl on New Year’s Day. Southern universities embraced football at about the same time as the Midwest. The University of Virginia began tentatively in the late 1880s before enthusiastically committing to the sport after 1890. By the end of the century, football had assumed an iconic status throughout American higher education, too important, in fact, to be left to students.

Like other activities in this era, athletic teams were organized and run by students. Team manager was a prestigious position responsible for logistics and finances; the elected captain was in charge of practices and games. While no doubt encouraging student initiative, responsibility, and spirit, student management hampered the establishment of common rules or the formation of stable leagues. In baseball, the perennial problem was professionalism—ringers hired by college teams or collegians playing professionally in the summer. Teams largely refused to accept eligibility rules that worked to their disadvantage. Football, too, was plagued with “tramp athletes,” but the overriding problem was brutality, caused by increasingly aggressive mass plays, like the flying wedge, and blatantly unethical conduct. Football had a rules committee, but it was long dominated by Walter Camp, whose Yale team benefited most from the status quo. This issue rose to a crisis in 1894, but Camp quieted public outrage by publishing a deceptively benign evaluation of the game, Football Facts and Figures (1894). Numerous gridiron deaths caused public concern to rise to another crescendo in 1905, prompting an ineffectual intervention by President Theodore Roosevelt, an admirer of the game. This time, a consensus of universities was able to overcome Camp and the Big Three to organize the National Collegiate Athletic Association and impose more sensible rules. After 1900, in fact, universities gradually asserted authority over athletics, hiring coaches and athletic directors. College athletics thereafter followed a history largely of its own. However, the influence of college athletics, and especially football, went to the heart of American higher education. Achievement on athletic fields overshadowed not just other collegiate activities, but all manner of academic work. Athletics thus validated the anti-intellectual bias of the collegiate revolution and projected a powerful image of college life, especially to alumni.21

★ ★ ★

The high collegiate era stimulated closer involvement of alumni with their colleges and universities. Earlier, alumni groups had formed social clubs in major cities, but now younger alumni sought to stay connected with campus activities. Walter Camp was not the only graduate to maintain a relationship with a college team. More obviously, athletic contests created a sentimental link between graduates and alma mater, as well as an excuse to return to campus. Fraternities also provided a tangible link. Each house would typically have some alumni members who maintained ongoing relations with the chapter. Donations from devoted alumni built the sumptuous chapter houses that burnished pride and prestige for both students and graduates.

Alumni served as agents of change, applying pressure to accelerate collegiate activities and atmosphere, especially where these things were resisted. As in the Young Yale movement, described previously, the younger alumni identified most strongly with the collegiate spirit. On several occasions, alumni sought to force modernization by deposing old-style presidents. Most notorious was the “trial” of Dartmouth president Samuel Colcord Bartlett (1877–1892). A conservative Congregational minister, Bartlett was a throwback to the era of submission and control. He valued only the “old college” and regarded Dartmouth’s schools of science, engineering, and agriculture as “parasites, who were eating her life out.” In 1881 the New York Alumni Association expressed the general unhappiness by asking the trustees to investigate Bartlett’s governance of the college. This quickly escalated into petitions for his removal from alumni, faculty, and graduating seniors. The alumni were led by a writer for The New York Times, so the controversy and the resulting formal hearing received blanket press coverage. Each of the twenty-five charges against Bartlett was, in isolation, somewhat trivial; what his accusers found most objectionable was the entire reactionary regime. Focusing only on the specific charges, the trustees failed to find sufficient grounds for his dismissal. Stubborn by nature, he remained another decade but with little authority to enforce his views. Similar trials occurred at Union (1882) and Hamilton (1884) colleges.22 However, in the long run alumni found more effective ways to exert their influence.

Historian W. Bruce Leslie has depicted how church-related colleges (including Princeton) were weaned from their denominational moorings by financial dependence on their younger alumni. In the last decades of the century, neither church organizations nor older clerical alumni nor the denominational faithful could provide the funds needed to hire more faculty and erect new buildings. Rather, younger, more secular alums with successful business careers were the most likely source for large donations. They naturally advanced a collegiate agenda as the price of their support. William Bucknell, for example, a prosperous Philadelphia Baptist, used his fortune to induce a reorganization of the University of Lewisburg in 1882. The institution then acquired fraternities, intercollegiate athletics, support from Philadelphia alumni, and, in 1886, the name of its benefactor.23 Alumni influence nonetheless had its most consequential impact on Harvard, Yale, and Princeton—the Big Three.

The collegiate revolution was connected with the consolidation of a Protestant upper class at the end of the nineteenth century. The 1880s saw the creation of the Social Register, gentlemen’s clubs, country clubs, and exclusive summer resorts, all of which flaunted cultural superiority by excluding Jews and most other non-Protestants. In education, the most significant innovation was the establishment of seven new elite boarding schools between 1883 and 1906. These schools immediately joined the principal feeders of the Big Three. They also espoused the Victorian notion of “muscular Christianity.” This vague doctrine combined evangelical Christianity with the idea that physical vigor, and particularly sports, had an independent moral value. This viewpoint harmonized with social Darwinism to uphold a Christianity for the strong, rather than the weak, which had an obvious appeal for wealthy Americans. This image supplemented rather than replaced the previous ideal of a cultivated, Christian gentleman. Together, these notions formed an ideology that conjoined Christianity, physical vigor, service, achievement, and character—frequently united under the rubric of “manliness.” These were precisely the virtues claimed for the collegiate ideal.24

Manly virtues had long been claimed for a college education. Such claims had rationalized the cultural value of college since the Yale Reports of 1828. After the Civil War, the New England colleges increasingly invoked masculinity as a defense against the new education, practical or academic. Mark Hopkins in 1868 defined a liberal education as “the cultivation of man as man,” and college presidents increasingly defended their purpose as forming the “whole man.” This sentiment was, by 1900, a rationalization that disguised the bankruptcy of their educational mission, a surrender of intellect to the ascendant values of the high collegiate era, which enthroned the principal features of manliness.25 The Y represented respectable Protestantism without the distraction of theology; fraternities provided manly brotherhood with the exclusion of social and religious deviants; the competition for campus distinction supplied symbolic achievement; and athletics, especially football, demonstrated manly courage. By 1900 this collegiate ideal had spread across most of the country’s colleges and universities, with nuances of interpretation everywhere. However, its earliest and strongest expression occurred at the Big Three, where it bonded with upper-class culture. The boarding schools that set the tone literally prepped students for this role. The schools emphasized strenuous athletic programs, nondenominational Protestantism, Spartan living conditions, and, above all, the development of manly Christian character—all in anticipation of attending Harvard, Yale, or Princeton. By the 1890s, 74 percent of upper-class Boston and 65 percent of upper-class New York sent their sons to one of these three schools.26

Harvard was the country’s most distinguished university, and Yale, the preeminent college, but Princeton’s rise reversed a dismal nineteenth-century record. While President McCosh did much to rehabilitate Princeton’s academic life, the great leap in public image occurred during the presidency of Francis L. Patton (1888–1902). A conservative theologian and incompetent administrator, Patton is generally considered to have retarded the institution’s development into a university (although it assumed that title in 1896). However, Patton had a knack for public relations. He expanded support for the university by extolling athletics and pandering to wealthy alumni. Princeton’s success on the gridiron was second only to Yale’s, and Patton played up the virtues of sports for instilling “lessons of manliness,” even mental discipline. He was especially adept at charming the New York alumni with such rhetoric, which also brought favorable publicity in the city’s newspapers. Alumni responded with growing donations—and by enrolling their sons. During his presidency, Patton allowed the exclusive eating clubs to proliferate, from two to eleven, and in the process they came to dominate the social life of the university. He consoled the faculty: “Whether we like it or not, we shall have to recognize that Princeton is a rich man’s college and that rich men frequently do not come to college to study.” This complacency toward student learning damaged the university’s academic reputation and ultimately led to Patton’s downfall. However, it did little to harm Princeton’s collegiate image and support from alumni. Princeton assumed its status as one of the Big Three during Patton’s tenure largely through its athletic ties with Harvard and Yale, the national publicity they generated, and the support provided by loyal alumni.27

Athletics made the activities of college students of compelling interest to wider audiences—first alumni, then readers of metropolitan newspapers and, by the turn of the century, of national middle-class magazines. The exploits of teams from Harvard, Yale, and Princeton no doubt ignited this process, not only stimulating school spirit but also advancing an image of the college man widely covered in major newspapers. The heroes of the gridiron or crew projected the manliness admired by that age onto the entire student body. In the last decades of the century, the popular image of a college student as an effete, studious character, memorizing Greek and Latin to prepare for teaching or the ministry, was displaced by one explicitly joining manliness, athletic prowess, Christian character, and worldly success.

After 1900 the collegiate ideal was equated with the qualities required to succeed in business. Nor was this association fanciful, as the number of graduates of the Big Three pursuing business careers surged at the end of the century. At this juncture, historian Daniel Clark has shown, the themes belabored by college presidents at alumni dinners were adopted by middle-class magazines with circulations in the hundreds of thousands. The collegiate ideal, in fact, harmonized with the zeitgeist: Theodore Roosevelt gave his famous speech advocating the strenuous life in 1899, and the Spanish-American War linked manliness with war and imperialism. American colleges, so it seemed, were shaping the type of men needed by the country and the economy. The Saturday Evening Post (circulation reaching 1,000,000 in 1909) published a “College Man’s” issue in 1899, with a cover depicting two collegians, one in football gear and the other in an academic robe carrying books. The picture was emblematic of the interpretation of the collegiate ideal that these mass-circulation publications conveyed to the middle class through articles and fiction: college students could be manly men, and the extracurricular life was at least as important as the classroom and more likely to prepare one for success in the world of business. This message no doubt found a receptive audience, since publishers and advertisers knew their readers. But it also had serious implications: college was now perceived as the appropriate training ground for business, something that had scarcely been true before the end of the nineteenth century.28

After 1900, the collegiate revolution produced paradoxical results, simultaneously reinforcing privilege and offering democratic opportunity. Harvard, Yale, and Princeton became dominated socially by the eastern upper class. Sons of the most privileged families attended these schools to gain further advantages of cultural refinement, elite socialization, potentially valuable connections, and, if they chose to make the effort, superior intellectual training. These schools, of course, were more or less open to students from more humble circumstances: Harvard more, Princeton less, and Yale somewhere in between. But for honoring this contract, their upper-class constituency would reward these schools bounteously for generations to come.29

The collegiate revolution sent a different signal to middle-class America. It depicted an institution that was open to talent, where any hardworking son (for daughters, see below) could achieve success on campus and acquire the skills and judgment to succeed in America. These rewards were sweetened by the possibility of acquiring some degree of culture and socialization as well, the attributes to achieve respectability in the upper middle class. College was open to all who qualified for admission, not simply members of the professional class or students who had studied Latin and Greek. These students too came to college in increasing numbers as the twentieth century progressed. Mass magazines no doubt helped to spark this development through the turn of the century, reinterpreting the cultural meaning of college studenthood. However, this opening of American higher education could not have occurred without a fundamental restructuring of the college’s role.

HIGH SCHOOLS, COLLEGES, AND PROFESSIONAL SCHOOLS

An American college education had been assumed to take 4 years ever since Henry Dunster extended the Harvard course in 1650, though this standard was often clipped. Colleges of the Early Republic sometimes failed to form four classes, and everywhere students entered with advanced standing. However, in the years around 1890, this venerable pattern was challenged more seriously than ever before or since. Johns Hopkins had introduced a 3-year AB, and Eliot advocated the same for Harvard. Many BS courses also took just 3 years, including that of Yale’s Sheffield Scientific School. Critics who emphasized the preparatory function contemplated shortening the course further. The transition from sophomore to junior year was seen as a natural break point between foundational and advanced subjects. Daniel Coit Gilman and Andrew D. White recommended starting collegiate training 2 years earlier, with university work to follow. John Burgess at Columbia appropriated the senior year to begin graduate or legal study, and similar schemes were introduced at Michigan and elsewhere. William Rainey Harper defined the freshman and sophomore years as junior college, and both he and David Starr Jordan contemplated dropping it from their universities. A reconfiguration of the 4-year college course seemed in these years to be an appropriate response to the academic revolution. However, the college course not only survived these challenges intact but emerged stronger after 1900.

This outcome was the result of three separate developments. The first was the collegiate revolution, just described. Given the prominence assumed by collegiate life, institutions could scarcely truncate its duration merely for curricular reasons, and college proponents after 1900 advocated strengthening it instead, as will be seen. Second was the gradual emergence of high schools as the principal form of preparation for college. With the recognition of a threshold between secondary and higher education, their respective roles became more clearly defined. Third, the elevation of professional education, by requiring first some college and, eventually, a college degree, restored the college’s foundational role, which had atrophied in the nineteenth century. What emerged was the modern American pattern of high school–college–graduate/professional school, but these developments evolved slowly after about 1880.

Serious consideration of altering the college course was largely an Eastern phenomenon. In the West, Michigan’s experiment fizzled, the pragmatic Harper abandoned the notion of affiliated junior colleges after a poor response, and Jordan’s disdain for underclassmen helped ease him into a ceremonial chancellorship. Similarly, the problem of preparation for college assumed different forms in the East and the West. The East possessed a rich array of private schools, academies, and boarding schools, all of which provided preparation in Latin, Greek, mathematics, and English—the subjects of college entrance examinations. Eastern high schools before the 1880s were known as “people’s colleges,” teaching modern subjects to students intending commercial careers. In the West, preparation for college was more difficult to obtain. Outside of cities, private schools were few and far between, and nearly every college was compelled to maintain a preparatory department to ensure its own enrollments. As urban high schools were established, those in the smaller cities remained poorly staffed and furnished, but the larger schools gradually assumed a dual role of serving local educational needs and providing a possible path to college. Hence, the first initiatives for linking secondary and higher education occurred in the West.30

COLLEGE ADMISSIONS. In 1871 faculty from the University of Michigan began visiting high schools in the state to determine if their graduates were qualified for admission without examination. The faculty’s interest in assuming this onerous task was not simply student recruitment but rather bolstering the quality of the high schools and the students they graduated. Throughout the country, colleges were saddled with the burden of poorly prepared students who depressed the level of teaching. Would-be universities, in particular, needed more qualified students if they were to raise instruction to a university level. James B. Angell, who became Michigan’s president that year, strongly supported school certification, not only as part of the university strategy but also as a means to build a unified educational system with the university at the pinnacle. In the first year, 6 high schools were approved, and 50 students were admitted on the basis of their diplomas or certificates. By 1890, 82 schools were on the approved list, about half of Michigan’s high schools. That year, 164 students were admitted by certificate, compared to 131 by examination. Faculty inspectors could be severe in their written judgments of high schools, but they were lenient in granting approvals. In fact, they had limited leverage with local school boards. Outside of the 6 principal feeder schools that produced 80 percent of certificate students, the remaining high schools on the approved list were sending few students to any college. But the certificate program had other attractions. Localities valued the seal of approval from the state university, and it also relieved high schools of reviewing old material for college entrance examinations. For universities, it simplified the process of admission, was thought to bring more students (although here opinions differed), and defined a systemic relationship between high schools and colleges that had not hitherto existed.

Michigan’s certificate system spread throughout the Midwest and West and, soon, the entire country. The University of Wisconsin adopted it in 1876, although some thought it incompatible with maintaining a preparatory department. President Bascom was able to close that department in 1880 with no ill effects, a step repeated at other state universities in the ensuing decades. The University of California established its certification system in 1884, the same year Michigan began inspecting and certifying schools outside the state. The certificate system was also adopted in the Northeast, although the principal private universities stood apart, preferring to admit students only through examinations. A watered-down version was adopted in the South as well. By 1890 the practice was so widespread that universities and even private colleges accepted students from schools that were on the approved lists of major universities. Such arrangements were not a system, however, but rather the beginnings of one that required further elaboration.

These developments reflected rapid change in public high schools. In the 1880s, high school enrollments almost doubled, and the number preparing for college grew from a trickle to about one-third of graduates. In 1890, 6,500 high school graduates were prepared to enter college, compared with 5,000 private school graduates and about 8,000 products of college preparatory departments. A decade later, these last two sources of potential students had grown by 20 percent, but the high school number tripled, now supplying the majority of college students.31 High schools did not displace the traditional sources, at least not immediately, but rather grew from an independent dynamic. High school enrollments and graduates roughly doubled in each of the next three decades—and proved to be the growth engine for American higher education. But first, relations between the two sectors had to be regularized.

In 1885, the need for greater cooperation led to the formation of the New England Association of Colleges and Preparatory Schools, and 2 years later to a similar association for the Middle States and Maryland. The schools were frustrated by the different books and authors covered in each college’s examination. Some progress was made in standardizing these matters and bolstering the certificate system, but individual colleges largely clung to their idiosyncratic requirements. It was clear that a more comprehensive approach was needed. Such an effort was launched when the National Education Association created the Committee of Ten in 1892. Led by Charles Eliot, the committee sponsored extensive research on secondary school curricula and proposed model courses of study as a national standard. The committee’s recommendations mirrored the academic offerings of the stronger schools. It was nonetheless heavily criticized for “college domination” by secondary educators, who were under pressure at this time to increase vocational offerings as well. The committee’s chief impact was in forwarding an issue that could not be avoided. A Committee on College Entrance Requirements followed in 1895, and it offered the fruitful suggestion of a unit of measure for high school coursework. By the end of the century, an organizational superstructure existed for at least addressing the admissions problem.

For years, reformers had called for the establishment of separate, independent examinations that any college might use for admissions. In 1899, Nicholas Murray Butler persuaded the Middle States Association to establish such a body to organize examinations on the subjects identified by the Committee of Ten. The resulting College Entrance Examination Board was a private organization that offered its first examinations in 1901. The Board began with few members, but Harvard joined in 1904, and Yale and Princeton followed in 1909–1910. Despite a small membership, the College Board removed a huge obstacle for these institutions, which refused to participate in any admissions scheme that would limit their independence but also sought to widen recruitment. Institutions were free to specify which exams prospective students would take and what scores were acceptable.

Still to be resolved for the majority of institutions were the larger issues of standardizing high school offerings and moving from certificates to regional accreditation. The North Central Association (f. 1895) addressed these goals by defining the “unit course of study” for high school subjects in terms of class hours and subject requirements. Students needed fifteen yearlong units to graduate, and colleges were advised to demand the same for admission. It specified that three units should be in English and two in mathematics, but otherwise high schools were free to offer, and colleges to require, whatever courses best suited them. The Carnegie Foundation for the Advancement of Teaching (CFAT, f. 1905) adopted the unit system and popularized it nationwide; the measures became known as Carnegie units.32

Acceptance of the unit system finally established a definitive threshold between secondary and higher education. It greatly simplified both college admissions and high school accreditation, but its impact extended far beyond administrative convenience. Standardization efforts in the 1900s were intended above all to raise the level of teaching in both high schools and colleges—to raise high schools to the level of fifteen qualified units and to allow colleges to focus exclusively on advanced instruction. Abysmal conditions persisted in many parts of the country, where secondary and higher education remained blurred. In the South, many high schools offered only 2 or 3 years, and colleges competed with secondary schools for students, bringing down standards for both. Conditions were little better elsewhere in rural areas that lacked or had just established high schools. Many of the country’s “colleges” functioned more as secondary schools, as William Rainey Harper had noted in “The Prospects of the Small College” (1900). Of some 400 institutions, he judged that one-quarter were little more than academies and should assume that role.33 The CFAT set stringent criteria for colleges: absence of denominational control, fourteen Carnegie units for admission, and no preparatory department. The College Board too barred colleges with preparatory departments, and the number of Carnegie units required for admission became a benchmark of college quality. By 1910 a clear separation from secondary education had become a hallmark of the American college and university.

This last development was gratifying to Charles Eliot, who had labored for more than 20 years to achieve this result. Eliot helped form the New England Association in the 1880s, led the Committee of Ten, worked for the establishment of the College Board, served as a reforming president of the National Education Association, and chaired the CFAT board of trustees. These activities were intended not simply to resolve the incoherence of college admissions; rather, they stemmed from a comprehensive view of the place of the college in the proper ordering of American education. That order, Eliot felt, had to include high schools capable of preparing students to continue on to college. The fundamental aim of college, then, was mental training acquired through the study of liberal, or academic, subjects. A college degree, Eliot strongly believed, should therefore be a prerequisite for mastering the practical bodies of knowledge taught in professional schools. However, as with secondary education, this required that existing patterns be reshaped. Restructuring professional education was a challenge that Eliot tackled from the first years of his presidency. Harvard consequently played a leading role in the long process of upgrading professional schools.34

“Higher” education at the end of the nineteenth century appears in a different light when professional schools are included. In 1890 roughly 41,000 students were enrolled in schools of theology, law, and medicine, compared with 64,000 college students of all types. Yet, more professional degrees were awarded than bachelor’s degrees and more medical degrees than ABs. Moreover, fewer than 10 percent of medical students and one-quarter of students in theology and law had graduated from college. Professional studies and degrees were, in fact, alternatives to college, not a sequential stage of education.35 This condition was a general liability for professional schools, which with few exceptions lagged well behind the academic revolution, not least because of unprepared students. But the winds of reform were being felt, particularly in medical education.

MEDICAL SCHOOLS. Eliot broke the mold of the autonomous proprietary medical school in 1870–1871 by incorporating Harvard’s school into the university. It subsequently established salaried, professional faculty in place of practitioners and imposed a 3-year graded course with meaningful examinations and laboratory teaching of basic sciences.36 However, progress elsewhere was grudging for the next two decades. Predictably, Harvard suffered lower enrollments for the next decade, as did Yale when it set a 3-year course in 1879. In 1880, the fledgling American Medical College Association attempted to require a 3-year course for its members, but 2 years later it suspended the rule as impracticable and subsequently dissolved. Only the universities of Pennsylvania and Michigan were able to approximate the Harvard reforms by 1880. Yet, the 1880s witnessed accelerating advances in medical science, including the European discovery of the bacterial basis of disease. Reforming medical education on a scientific basis became imperative, but only Michigan and Harvard incorporated the beginnings of research by German-trained professors and, in 1890, expanded to a 4-year course.

After 1890, though, the pace of reform accelerated. The Association of American Medical Colleges was reconstituted in that year and now was able to demand a 3-year graded course of all members. In 1893, the opening of the Johns Hopkins Hospital and Medical School presented a new and higher model. Hopkins offered a 4-year course and required a college degree for admission. In addition to offering laboratory-based scientific instruction for the first 2 years, it elevated clinical teaching to a scientific and observational basis for the final 2 years. Momentum shifted from resistance to reform to proactive efforts by the leading universities to strengthen medical education. University-based medical schools now emerged as a distinct sector that was advancing both the training of doctors and, to a lesser extent, the treatment of disease. Some of the more reputable private medical schools saw the advantage, if not necessity, of affiliating with universities. The College of Physicians and Surgeons united with Columbia in 1891, and similar mergers occurred at Western Reserve, Chicago, Illinois, and Pittsburgh. Cornell opened an endowed medical school in 1898, and Wisconsin founded its school in 1907. The modernizing trend was bolstered in 1904 when the American Medical Association created the Council on Medical Education, consisting of five distinguished university professors. In advancing the university agenda of continual upgrading, the Council conducted an evaluation of the entire sector.

The reform of university medical schools alone could not resolve the failings of American medical education. Some 50 of the country’s 160 medical schools had university affiliations. This group was led by strong schools at Johns Hopkins, Harvard, Columbia, Michigan, and Minnesota, among others, but it also included woeful proprietary schools with only nominal university affiliations. Among the rest, the situation was far worse. The number of medical schools and the number of graduates continued to swell until the mid-1900s, when 162 schools disgorged about 5,500 graduates. Lax licensing laws that allowed these graduates to practice medicine kept proprietary schools operating and profitable up to that point. They also flooded the country with poorly educated physicians, almost four times as many per capita as Germany. When the Council on Medical Education inspected the nation’s medical schools, it found that one-half were “acceptable,” that 30 percent might be improved to an acceptable level, and that 20 percent were “worthless” and should have state recognition withdrawn. Even these ratings would prove generous. The Council did not publish its findings because of scruples about criticizing fellow doctors, but it did ask Henry Pritchett if the CFAT would conduct an independent evaluation of medical schools. The request fit with Pritchett’s goal of raising standards in American higher education, and in 1909 he commissioned Abraham Flexner to examine the state of American medical schools.

Although possessing no medical training, Flexner was a dedicated educator with independent views.37 A classics graduate of the early Hopkins, he went first to the Hopkins Medical School to learn how medical education ought to be organized. Flexner held, above all, that medicine was a scientific undertaking—but one with two dimensions. Medical science rested on the physical laws of nature, which had to be taught in laboratories of the basic sciences; but medical art depended on acquiring an ingrained scientific approach to clinical practice through direct observation and hands-on learning. The ideal medical school required ample laboratories, teaching hospitals, and endowments; its teachers should be full-time professionals engaged in original investigations; and its students must possess at least 2 years of prior college science. Outside of Johns Hopkins, few medical schools could meet Flexner’s exacting standards. His 1910 Report was, consequently, a bombshell. Its two parts consisted of a lengthy overview of all aspects of medical education, followed by descriptions and evaluations of the 155 schools he had visited in the United States and Canada. His acerbic depictions of the worst proprietary schools rivaled the muckraking journalism of the era. Just 31 strong medical schools, he concluded, would suffice for the physician needs of the United States.

The Flexner Report is credited with delivering the coup de grâce to the proprietary schools. The total number of medical schools declined to eighty-five by 1920, but in fact the commercial schools were doomed by the tightening standards of state licensing boards. Twenty had already suspended operations from 1907 to 1910. As Flexner pointed out, it had “become virtually impossible for a medical school to comply even in a perfunctory manner with statutory, not to say scientific, requirements and show a profit.” Medical educators reacted to the negative tone of the report somewhat defensively, noting that medical education was stronger than ever before. However, this criticism worked to their advantage. The better schools thanked Flexner and Pritchett for exposing their shortcomings. Flexner had provided a blueprint for improvements that they promptly used to raise the money for the expanded facilities that his model demanded. Yale and Washington University launched major fund-raising efforts; President James appealed to the Illinois legislature, which for the first time provided support for medical education; similar efforts occurred nationwide. Medical schools, even at leading universities, were heavily, often entirely, dependent on student tuition. The Flexner Report, by exposing glaring deficiencies, helped to usher in a new financial era for medical education characterized by philanthropy and state support.

Tuition dependency was also the reason why entrance requirements were the slowest element to change in medical school reform. Only Harvard in 1901 followed Hopkins in requiring a college degree for admission. Before that date, most schools had no entrance requirements at all, and those that did felt unable to ask for more than a high school diploma. Entrance examinations, where they existed, were notoriously lenient. In the first decade of the century, the main thrust among university medical schools was to require entering students to have studied college chemistry, physics, and biology. Medical educators were uninterested in bachelor’s degrees per se; in fact, they refused to honor them if the course lacked science. Before 1910, these schools were moving to embrace the 2 years of college science advocated by Flexner. In the years following the report, this became the accepted standard.38

Was the Flexner Report the turning point in the reform of American medical education? More accurately, the years from 1905 to 1915 marked a revolutionary transformation in which the best practices of the leading schools spread among university medical schools. The Flexner Report contributed enormously as stimulus and guide for these schools to institute improvements they had long sought in any case but had been unable to afford or accomplish. The assimilation of American medical schools into universities was a triumph of the university movement. The universities essentially imposed the institutionalization of the academic revolution on medical faculties and required medical students to complete at least part of the college course. Medical schools, with an overriding preoccupation with health care, evolved as special enclaves, loosely linked with their universities, but academic science and collegiate preparation remained two firm points of attachment.

LAW SCHOOLS. By the twentieth century, Eliot could bask in the stature of the Harvard Law School. He told alumni, “If there be a more successful school in our country or in the world for any profession, I can only say that I do not know where it is.”39 Such effusive praise was backed by real accomplishment: Harvard Law fulfilled Eliot’s vision for the proper role of professional schools; it stood foremost and unchallenged among the nation’s law schools; and, most extraordinary, it transformed the content and pedagogy of its field.

Harvard founded the first true law school and, with two endowed professors and the redoubtable Justice Joseph Story (1828–1845), it was easily the most prestigious. But otherwise it differed little from the other twenty-one law schools operating in 1860. Practitioner-professors, dependent on student fees, lectured to a floating population of students, who derived most of their training in law office clerkships. Law school was an alternative form of undergraduate education for the minority of aspiring lawyers who cared to attend. This was the situation that Eliot sought to reform in 1870 by appointing Christopher Columbus Langdell as professor and dean. Described by a classmate as “a bookworm if there ever was one,” the impecunious Langdell had worked his way through the Law School, impressing both students and teachers with his learning and diligence. He then practiced for 16 years in Manhattan before Eliot’s call. He believed that students should learn the law much as he had, by studying the decisions of appellate courts in individual cases. This was the genesis of the case method that revolutionized the teaching of university law schools.40

Law school pedagogy had relied on the presentation of fully formed legal principles through professorial lectures and textbooks. Langdell regarded this approach as superficial and unreliable. He wished to derive the essence of legal principles from their original sources—the reasoning and decisions of actual judges. He made students read and study the cases and then subjected them to rigorous questioning in class—the so-called Socratic method. Langdell consistently emphasized “first that law is a science; secondly, that all available materials of that science are contained in printed books.” Hence, “the library is the proper workshop of professors and students alike,” akin to the laboratories of natural scientists. And students could be properly taught only by “teachers who have traveled the same road” in learning the law. Langdell concluded: “then a university, and a university alone, can furnish every possible facility for teaching and learning law.” Langdell’s case method thus taught the law as a theoretical subject with an empirical scientific foundation (books, cases) that belonged in the American university alongside other wissenschaftlich subjects.41

This approach required a different kind of law school. The intellectual demands of interpreting judicial decisions presupposed mature students with some liberal education. Part-time students could scarcely accommodate such a regimen, so the case method also required full-time students. Full-time professional teachers were needed who themselves had analyzed the sources of the law. Langdell immediately lengthened the Harvard course of study to 2 years and imposed examinations for advancement. In 1875 it was decreed that “the course of instruction in the School is designed for persons who have received a college education,” the first such stipulation in American legal education.42 The following year the course was extended to 3 years. As a professor, Langdell himself set an example of undistracted devotion to scholarship. Eliot signaled that this was the new criterion for faculty in 1873 when he hired Langdell’s protégé, the scholarly James Barr Ames, only 1 year after his graduation—a startling departure from the usual hiring of eminent practitioners. Ames later succeeded Langdell as dean (1895–1910) and chief advocate for the case method.

The demands of the Harvard model on students and professors hardly commended it to other law schools. Furthermore, the theoretical approach of the case method failed to train students in the practical aspects of lawyering. Enrollments at Harvard remained fairly steady, but as late as 1882, it had fewer students (138) than had enrolled under Justice Story. But then the superior quality of Harvard law graduates became evident. Proponents of the case method defended it as teaching students to “think like a lawyer.” However, Harvard Law was unique in more fundamental ways. It enrolled well-prepared college graduates; forced them to prepare for class; examined them in class on their understanding of the material; and subjected them to annual examinations.43 In short, it supplied the elements of a rigorous education that would be effective anywhere. Perhaps more important, it created conditions for what historian Bruce Kimball has labeled “academic meritocracy.” In 1885 Eliot reported that the school was “unable to fill all the places in lawyers’ offices which have been offered … for third-year students just graduating.” By the end of the decade, a leading Boston firm wrote Langdell that it preferred “young lawyers from the Harvard Law School” who rank “among the leaders of their respective classes.” Success in Harvard Law School was directly linked to successful careers in the commercial economy—the first solid exemplar of this phenomenon in American higher education.44 As recognition spread, enrollment shot upward, exceeding 450 by the time of Langdell’s retirement in 1895 and, 15 years later, topping 800.

The success of academic meritocracy forced the Law School to confront an issue that had never before arisen in American higher education: selecting candidates for a limited number of places. The difficulty of the entrance examination was ratcheted upward until it was eliminated in 1909, but this affected fewer and fewer applicants. In 1893 the faculty limited admission to graduates of a select list of sixty-five colleges, a not unreasonable approach considering the enormous variation in standards across American colleges. Rather than rating individual colleges, however, the list was based on institutions that had previously sent students to the Law School or Harvard College. No Catholic institutions were included, which enraged Catholic educators. In reaction, Eliot added Georgetown, Boston College, and the College of the Holy Cross; but the last two were subsequently removed. Eliot, who was directly involved with this policy, had a deep aversion to Catholic colleges because, he felt, they did not teach sufficient science and because they presented philosophy as dogmatic truth instead of open inquiry. Langdell apparently concurred. Although deeply offensive to Catholic educators, this tempest raised deeper questions about academic meritocracy. Merit in the Harvard Law School was assumed to result, first, from the high requirements for admission and, second, from relative achievement in the classroom. However, selective admission has been affected from its inception by the assumptions embedded in the process.45

After 1890, the case method began to be adopted by other leading universities, despite legitimate criticism that it was ill suited for average law students.46 The lure of academic meritocracy made these institutions aspire to join the circle of elite law schools favored by corporate firms. Events at Columbia in 1891 signaled the onset of a new era. Theodore W. Dwight was founder (1858) and autocratic dean of the Columbia Law School. He located the school downtown amid the courts and law offices, easily accessible to their employees. Splitting the school’s revenues with Columbia, he maximized enrollments and minimized teachers. During the 1870s he graduated more students than Harvard enrolled. But Dwight also resisted reforms that might threaten this model. He opposed entrance requirements, especially a bachelor’s degree, and held off adding a third year until 1888. When Seth Low became president in 1890, he immediately sought to raise the stature of the law school. He brought in a professor from Harvard and forced Dwight to retire. Dwight, along with his faculty, returned downtown to found the New York Law School on the old model, while the new faculty “Harvardized” the Columbia Law School. By the mid-1890s high admission standards, a 3-year course, and the case method could be found at Cornell, Stanford, and Northwestern as well. When William Rainey Harper launched the Chicago Law School in 1900, he consulted Eliot and Dean Ames before hiring a Harvard professor to be dean. By that date a distinct sector of reformed university law schools had emerged. These schools, and others hoping to move in that direction, organized the Association of American Law Schools (AALS) to raise standards in legal education.47 However, the tide was moving very much in the opposite direction.

Legal education mushroomed around the turn of the century. From 1890 to 1910, the number of operating law schools roughly doubled, graduates tripled, and enrolled students quadrupled. While a handful of university law schools focused on full-time students and strengthened admissions and coursework, the most rapid growth occurred in schools with low or no entrance requirements that catered largely to part-time students. Although these two types stood at opposite poles, multiple intermediate patterns existed. Some universities, like Georgetown, expanded their regular law course into evening classes to tap this burgeoning market. Numerous universities, like Southern California, took over commercial law schools to do the same. Such units attracted large numbers of part-time students and served as cash cows for the institution. New York University, for example, served every market segment with regular, late afternoon, and evening courses. Many of the part-time schools were purely commercial undertakings, like Dwight’s New York Law School and Chicago-Kent. However, the YMCA was also a large participant in this market. Quite separate from the Campus Y, urban YMCAs sought to provide young working men opportunities for self-improvement through education. In 1920 nine operating law schools were outgrowths of these efforts. In 1915, five of the six largest law schools (Harvard being the outlier) served the part-time evening market—Georgetown, New York University, Chicago-Kent, Southern California, and New York Law School. A classification in 1920 found 21 percent of law schools to be “high-entrance, full-time schools”; 29 percent “low-entrance schools” with full-time courses; 39 percent “part-time” schools; and 11 percent “short-course schools.”48

This distribution reveals a curricular stratification of law schools, with the first category representing primarily university law schools employing the case method. However, it also indicated social stratification between college-educated students studying full time and those who could not afford college attending in late afternoons or evenings. After 1900, the elite of the legal profession, represented in the American Bar Association and the AALS, became increasingly alarmed by the influx into the profession of less-well-educated lawyers from urban proprietary schools, especially the large numbers of immigrants and Jews. The remedy they advocated was to raise intellectual standards to those of the AALS schools, particularly the requirement of some college for admission. Such reforms would, of course, address their social concerns as well. However, states would not raise requirements for the bar by mandating study in law schools, and most schools were averse to limiting their clientele by raising admission requirements. The situation in legal education was similar to that in medicine, but without the external pressure exerted by state medical boards. Nevertheless, the Flexner Report inspired a legal counterpart, and in 1913 Alfred Z. Reed of the CFAT embarked on a protracted evaluation of legal education, not published until 1921. By that date, the rise of nativism after World War I emboldened the impatient legal elite to seek to marginalize the part-time schools. A separate report engineered by the AALS claimed that only law schools provided valid legal education, that 2 years of college should be required for attendance, and that part-time programs should be lengthened to 4 years.49

Reed’s report appeared just 1 month later, with a radically different message:

Humanitarian and political considerations unite in leading us to approve of efforts to widen the circle of those who are able to study the law. The organization of educational machinery especially designed to abolish economic handicaps—intended to place the poor boy, so far as possible, on an equal footing with the rich—constitutes one of America’s fundamental ideals…. Inherently … the night school movement in legal education is sound. It provides a necessary corrective to the monopolistic tendencies that are likely to appear in every professional class … [and] constitute a genuine element of danger.50

Educationally, Reed found features to praise in the case method while criticizing its neglect of the practical aspects of a lawyer’s work. He also found the part-time approach to be deficient when it was not complemented by work in a legal office. However, his defense of the social opportunities presented by these wide-open night schools went to the core of the American belief in the democratic potential of higher education. Reed continued to champion the broad social access provided by part-time legal education through the 1920s. Enrollments in these institutions peaked in 1929. Ironically, legal education, which had been only tenuously linked with colleges and universities before 1870, pioneered three of the distinctive features of twentieth-century American higher education—academic meritocracy, selective admissions, and, with open, part-time evening schools, the first vestiges of mass higher education.

HIGHER EDUCATION FOR WOMEN, 1880–1915

What the Commissioner of Education called “the superior instruction of women” marked a milestone around 1890. That year, a slight majority of female enrollments were found in modern women’s colleges (Division A: 2,000) or coeducational colleges (9,000) rather than “premodern” Division B female colleges (10,000). This last category was already in terminal decline, with nearly half of institutions closing from 1880 to 1900 and enrollments stagnant. But enrollments accelerated at Division A and coeducational schools, rising by about 150 percent in the 1890s. However, these patterns were far from uniform. The remaining Division B schools were concentrated in the South, where they were not yet challenged by high schools and normal schools. Division A schools clung to the eastern seaboard, with two exceptions. Coeducation was the dominant pattern for the public universities of the Midwest and West. These last two regions had the highest enrollments of women in modern institutions, especially the West. There are several stories within the expansion of higher education for women.51

The two great innovations after the Civil War were coeducation in universities and the creation of endowed women’s colleges. Women’s educational experiences in these two domains were initially rather different. Antebellum coeducation had been confined to small evangelical colleges and, in Oberlin, one very large one. The decade following the Civil War saw a surge in coeducation in new or formerly male institutions and serious consideration of admitting women in others. Coeducation was largely imposed on the new western land grants by legislators in their original charters. Among private institutions, Methodist colleges were quick to adopt coeducation as well, including Wesleyan and the new Boston University in the East.52 By 1880, roughly 4,000 women attended regular colleges and universities (most of which remained all male), more than 10 percent of the total. These enrollments would grow, but few institutions joined this group after that date. Enthusiasm for coeducation diminished greatly, and a backlash later developed.

In 1865, Matthew Vassar’s eponymous college opened with the intention to “build a college in the proper sense of the word, an institution which should be to women what Yale and Harvard are to young men.” His sentiments were seconded by Sophia Smith and Henry Durant, whose creations, Smith and Wellesley colleges, opened in 1875. These three became the core of what would be known in the twentieth century as the Seven Sisters, the most prestigious women’s colleges, analogous to what would later become the Ivy League. Bryn Mawr joined this endowed elite in 1885, followed by the “coordinate colleges” of Radcliffe (Harvard) and Barnard (Columbia). Mount Holyoke, despite historical pride of place, required prolonged upgrading after it became a college in 1893. The success of these privileged colleges buttressed the residual hostility to coeducation in the East.53

The women who attended these institutions before 1890 are often referred to as the first generation of collegiate women. These pioneers endured psychological pressures from Victorian culture, which upheld mutually exclusive sex roles, with women occupying a separate sphere of domesticity. A stereotypical idealization of womanhood paralleled the coeval fixation with manliness and was conventionally interpreted as precluding the masculine regimen of traditional colleges. Besides this impediment, women faced hostile environments in formerly male colleges and a profound ambiguity surrounding the role of an educated woman in Victorian society.54

Prejudice against coeducation was endemic in the East, where the infamous Dr. Edward Clarke published his best-selling Sex in Education in 1873. Arguing that women were biologically unfit for the “brain work” required by a standard college course, Clarke detailed the baneful effects that such “identical education” could produce. Critics echoed these misconceptions for a decade, and they planted doubts in the minds of prospective students and their parents. But Clarke’s “science” was belied by contemporary experience. At Cornell, Andrew D. White sought to resolve the coeducation issue with a fact-finding tour of Oberlin, Antioch, Michigan, and Northwestern. His report to the trustees answered every objection: most important for his male audience, he found that the presence of ladies did not harm male students and might even have a positive impact. He found no evidence for a deleterious influence on the health of women students, and he defended their capacity to benefit from the identical course of studies as men.55 Despite proponents like White, nearly ubiquitous suspicion or disapproval placed a heavy psychological burden on the first generation of women students. They typically devoted themselves to their studies in part to prove the legitimacy of female higher education to a skeptical public—and to themselves.

Such efforts were probably reinforced by the hostile environment they entered. Presidents and faculty had nearly all been educated in all-male settings, mostly in the East (or in Germany), and were predominantly opposed to coeducation. The decision to admit women at public institutions was usually imposed from outside by legislators or, at Michigan and California, regents, out of vaguely democratic sentiments. At private schools, donors sometimes forced the issue. Henry Sage made Cornell truly coeducational by donating the first women’s dormitory at a coeducational university; Washington Duke made an endowment gift to Trinity College (later Duke University) contingent on admitting women; and Mary Garrett’s conditional donation forced the Hopkins Medical School to accept women. President Eliot, however, refused an offer by women to buy their way into Harvard. Male students were almost always opposed. They feared that coeducation would tarnish a college’s status but were probably most disturbed by the disruption to the social cohesion and camaraderie of the class. Young men were accustomed to dealing with females as sisters, as servants, or through the formal protocols of courtship, and they seemed unable to relate to them as fellow students. Resenting their presence, college men treated coeds condescendingly, when not ignoring them entirely. The first women students were often restricted from participating in college ceremonies and activities and segregated in classrooms, but otherwise they had a remarkable degree of freedom. Public universities were too poor to provide separate facilities or residence for women. Left to fend for themselves, ostracized from much of campus life, the first generation of college women had all the more reason to focus on their studies. But these psychological hardships also produced a high rate of attrition.

A further negative was the absence of careers for women outside of teaching. The universities of Wisconsin and Missouri created normal departments to accommodate their first women, but such units resembled inferior normal schools. Most collegiate women who worked did, in fact, become teachers at all levels of education, including women’s colleges. However, Victorian norms frowned upon paid work for genteel women, and work after marriage was taboo. First-generation women graduates who fashioned careers for themselves were thus exceptional, socially and psychologically. Most telling, they sacrificed marriage and family for careers. Roughly one-half of first-generation women graduates married, compared to a 90 percent national rate, and those that did had fewer children.56

The patrons of the new endowed women’s colleges and their advisors aimed, above all, to provide a liberal education identical to Harvard’s and Yale’s but at the same time to respect and protect the unique qualities of womanhood. They consequently sought to create a carefully controlled environment—quite the opposite of women’s experience at coeducational universities. Both Vassar’s original advisor and Wellesley founder Henry Durant looked to Mount Holyoke for inspiration. They resolved to immure their students in a single building, where communal activities, solitary study, and devotion could be fully programmed. Both patrons commissioned monumental, luxurious structures for this purpose, where students could be closely supervised by “lady principals.” Smith College sought to nurture femininity by housing its students in cottages, each with a “lady-in-charge” and a faculty member, to create a natural, family atmosphere. All three founders, especially Durant, hoped to promote an evangelical spirit, but their charges proved resistant. In the absence of the evangelicalism that animated Mount Holyoke, they developed independent student cultures instead. The cloistered nature of Vassar and Wellesley, in particular, fostered intense personal relationships among students and rebelliousness against rules and their agents. Descriptions of their early years document an overwhelming student preoccupation with extracurricular activities. With the advent of the so-called second generation of students after 1890, the experience of women at both the endowed and coeducational institutions was transformed by the female version of the collegiate revolution.57

Women of the second generation at coeducational universities were more numerous, more confident, and more open to a fuller collegiate experience. Like their male counterparts, they organized a growing number of activities outside the classroom. The University of California, with almost 40 percent women, was in the forefront of these developments. In 1894 Cal coeds organized a governing body to coordinate and articulate women’s interests. At first it focused on mundane issues—bathrooms, lunchrooms—but it soon began to organize women’s clubs for sports, debating, drama, and music. It sponsored a “Woman’s Day” on campus with multiple activities, and it became the female arm of President Wheeler’s promotion of student government. A YWCA chapter also contributed to the organizational mix at Cal and elsewhere. In 1901 Cal women founded their own honor society, since they were not eligible for the men’s society, and in 1912 they staged a pageant like those performed at eastern women’s colleges. The 1890s also saw the establishment of the first four sororities at Berkeley. Like fraternities, they emphasized social distinctions but also channeled their members into campus activities. By offering superior room and board, they resolved that difficult issue for some women students. Women in these decades threw themselves into campus activities as fervently as men. As a precaution against overcommitment, several schools devised a point system to limit the offices one woman could hold.58

Universities had few funds available to invest in the needs of women students.59 The appointment of deans of women by major universities at least showed recognition of such needs. William Rainey Harper characteristically invented this post in organizing the University of Chicago, but these positions became widespread only after 1900. They were often filled by graduates of the Seven Sisters, who apparently brought the requisite femininity to the task. Deans of women were not disciplinarians—although always concerned with upholding standards—but rather served as advocates for women on male-dominated campuses and built consensus for the common weal among the different groups of women. In 1903 the deans of midwestern state universities began meeting as a group to discuss their responsibilities. Of greatest concern was housing. With few dormitories, most women lived in boardinghouses. Deans of women worked vigorously to impose reasonable standards upon the private proprietors. The proper regulation of social activities was another common concern. And, at the first meeting, the deans unanimously opposed intercollegiate athletics for women.60

The second generation of women adapted to the high collegiate era with a flurry of activities and organizations. However, women’s activities were largely segregated and everywhere subordinate to those of the men. Male students were especially covetous of prestigious positions. Women were denied academic honors, excluded from campus government, and denied newspaper posts. Even football games were a “distinctively masculine event,” where women fans were segregated and expected to watch passively. Prevailing cultural preconceptions of femininity imposed confusing expectations. Cal president Wheeler advised women to prepare for a future of marriage and motherhood; male students believed that study was harmful to femininity and produced schoolmarms; and the dean of women sought to identify careers for graduates other than teaching. And women themselves? Second-generation women tended to go to college for their own personal reasons, including intellectual fulfillment. A significantly larger percentage eventually did marry but still had fewer children than average. They also pursued a growing variety of still circumscribed careers. The collegiate revolution made their college experience more natural and fulfilling, despite a hostile environment, but aspirations for greater equality had to wait.61

All three endowed women’s colleges started slowly. In striving to emulate Harvard and Yale, they set entrance requirements in Latin and Greek well beyond the schooling then available to women. Vassar and Wellesley had to establish preparatory departments, and Smith’s first two entering classes had only 14 students. Smith then opened its doors to “specials” (nondegree students), as had the others, and its enrollments soared. Secondary education for women rapidly adapted to the stringent entrance requirements, and by the late 1880s, such growing pains were largely behind them. Wellesley enrolled 660 students in 1890—more than Johns Hopkins or the University of Illinois—and Smith and Vassar were not far behind. With large faculties, they were better equipped than most colleges to offer the elective curriculum demanded by the academic revolution. In 1906, Vassar had a larger income than Princeton, and Wellesley’s was almost equal. Smith’s 1,500 undergraduates ranked fifteenth in the country. These colleges largely achieved the academic stature to which they aspired, but that was not what shaped their reputations.

The endowed colleges fostered an all-encompassing collegiate culture more intense than Princeton’s or Yale’s. They generated more than their share of the usual class and campus organizations and activities, and students engaged in them with unmitigated zeal. Intramural athletics flourished on these campuses, producing fiercely competitive interclass contests. Students delighted in ceremonies, both in planning and execution. Student dramas, all-female dances, and pageants embellished student life. Student energies were further absorbed by complex social relationships, among cliques and social groups as well as in the “crushes” and “smashes” students developed toward one another. Students were completely immersed in “the life” that swirled within these campuses, or at least that was the image. Of course, not every girl had the money, talents, personality, or inclination to thrive in this environment.62

The most obvious fact about student life at these schools was the increasing predominance of wealthy students. Smith president L. Clark Seelye remarked in 1896 that “each year [the wealthy] are more largely represented.” Tuition, room, and board cost $450 in 1907, the ceiling for American higher education, but 80 percent of Wellesley students spent at least as much on personal expenses. Society on these campuses was dominated by the “swells” who came from affluent families and had attended the same schools. The swells dominated campus organizations and also formed exclusive clubs, societies, and “snobby cliques.” One Wellesley student lamented in 1896: “a person counts for absolutely nothing unless she is a Society girl.” Posh societies were allowed to erect their own houses on the Wellesley campus. Poorer students were given the opportunity to save money by living in an annex and joining a club for nonsociety members. Social cleavages could hardly have been starker. High prestige also was accorded to “all-around girls,” who became leaders in campus activities by force of personality. Historian Helen Horowitz reveals “the most closely guarded secret of the women’s colleges, that in a college composed only of women, students did not remain feminine.” All-around girls, in particular, were admired for assuming masculine leadership roles, excelling in athletics, and playing male roles in dramas.63

Much the same could be said of these schools as of the Big Three: dominated socially by the eastern upper class, they specifically conveyed cultural refinement and elite socialization and—for those who chose this path—intellectual enrichment and leadership skills. But for women, too, as social exclusiveness rose, so did disdain for academic achievement and the stigmatization of “grinds.” Unlike at the Big Three, however, success on campus had no analogue in successful business careers. The modal graduate of the endowed women’s colleges was expected to marry according to her social station, raise a family, and possibly assume a prominent role in voluntary organizations. Only Bryn Mawr adopted more ambitious ideals, despite being originally intended as an Orthodox Quaker female college, a counterpart to nearby Haverford College.

In the closed circles of Orthodox Quakers, members of the Haverford and Johns Hopkins boards of trustees overlapped. When these gentlemen met to discuss Quaker education in 1877, Joseph Taylor lamented the absence of a Quaker college for women. He then wrote a bequest that allocated his fortune of $800,000 to founding Bryn Mawr as a Christian college for Quaker women, one that would also admit others “of high moral and religious character” and from “the higher and more refined classes of society.”64 Its all-Quaker board contained mostly pious Haverford trustees. However, the institution was virtually hijacked by the redoubtable M. Carey Thomas, who fashioned it into the most academically ambitious college for women.

The daughter of a Quaker trustee of Hopkins and (later) Bryn Mawr, the precocious Thomas completed a Cornell AB in 2 years. Unable to pursue graduate study at Johns Hopkins, she studied at Leipzig and was awarded a PhD in philology (1883) by the University of Zurich (which, unlike Leipzig, granted degrees to women). She then presented herself as a candidate for the presidency of Bryn Mawr. With educational and family credentials that could scarcely be ignored, she was instead appointed Dean of the College at the age of 26. In this position (the first of its kind) she seized the authority to shape the academic life of the new college. Visiting all the endowed colleges, she was disappointed by their lack of academic rigor. She resolved to model Bryn Mawr after Johns Hopkins, embracing the new academic spirit. Thomas’s academic aspirations were supported by the trustees, who thus conceded initiative to her, despite frequently disagreeing. But in one respect she remained true to the donor’s wishes: she, too, sought to educate the more refined classes of society.65

Thomas was able to set a high standard from the outset. She informed the board that “a college is ranked among other colleges by the difficulty of its entrance examination” and accordingly took Harvard’s as a model. She hired male professors for their scholarly abilities and established a graduate school. She opposed trustee suggestions to establish fellowships for poor students as wasteful, advocating instead graduate fellowships that would bolster prestige. In 1894 she was elevated to the presidency (1894–1922).66 She had made Bryn Mawr a conspicuous success, but it was still a tiny college. In 1898 she began a concerted effort to raise funds for more dormitories, a library, and expansion to a great college. She mobilized the support that she enjoyed among alumnae and obtained vital Rockefeller gifts by charming John D. Rockefeller Jr. Bryn Mawr became recognized as the academic leader of women’s higher education. Socially, however, it mirrored college life at the other three endowed colleges. Thomas fused both academic and social distinction in the ideal of a “Bryn Mawr woman,” who would add “scholarship and character [to] gentle breeding.” Bryn Mawr graduates did emerge as a distinctive type, characterized by genteel culture, achievement in learning, and unapologetic championing of women’s rights. When alumnae were added to the trustees in 1906, M. Carey Thomas’s domination of her college was complete.67

In 1899, a Charles Eliot speech advocating a distinctively female curriculum for women enraged Thomas. In a widely publicized rejoinder, she mocked the president of Harvard for implying the world of knowledge “existed only for men” and saying he might as well have proposed for women “a new Christian religion … new symphonies and operas … in short, a new intellectual heaven and earth.” She emerged as the most prominent and most forceful advocate for offering women the same higher education as men. She made a more comprehensive statement when asked by Nicholas Murray Butler to write a description of women’s higher education.

In Education of Women (1900), she ignored most institutions for women, discussing only those providing modern academic courses and emphasizing how past limitations had been overcome. Coeducation predominated as the most widespread and economical form and was now standard for publicly funded universities. But in the East, where separate women’s colleges could be supported, the four endowed “great colleges” provided a superior residential educational experience. Bryn Mawr was the smallest of them with 269 undergraduates, but it exceeded the others academically with 61 graduate students and 19 PhD graduates. Thomas paid special attention to graduate education, where she reported that only four reputable universities still excluded women (Catholic, Clark, Princeton, and Johns Hopkins universities). She looked forward to a similar acceptance of women into professional schools. This factual account concluded with an affirmation of equality: “an inferior education shall not be offered to them in women’s colleges, or elsewhere, under the name of a modified curriculum.” By the turn of the century, M. Carey Thomas had become the foremost spokesperson for women’s higher education and defender of unqualified equal opportunity.68 However, the trend in American higher education, at least in the short run, was moving in the opposite direction.

Male students by and large had not tempered their hostility to coeducation. On the contrary, the nature of the collegiate revolution tended to magnify their resentment. To disentangle rationalizations of prejudice from deeper motivations may be pointless, but several issues clearly bothered the boys. Fears of diluting athletics and manliness were prominent in complaints against coeducation. These were invariably linked with school prestige and, hence, the reputations not just of students but of younger alumni as well. On campus, the persistent steps to monopolize prominent extracurricular activities and class ceremonies testify to the value men placed on maintaining campus dominance. Psychologically, males appeared to have difficulty reconciling female cultural stereotypes with actual female classmates, who often outperformed them academically. However, this dissonance was rooted in social class. Refined, genteel women—suitable for marriage—prepped at exclusive academies and attended the Seven Sisters; coeds were predominantly graduates of nearby high schools who would probably become teachers. That the most mean-spirited opposition to women was usually found in fraternities underlines the class-based nature of male hostility.

Among coeducational universities, Cornell was notorious for male disdain toward women, even coining the ugly term (and concept), “anticoedism.” Women were only 14 percent of Cornell students in 1900 but one-third of Letters and Sciences. By that date, most male students in professional schools hailed from wealthy urban families and were part of an extensive fraternity system. Cornell coeds were judged by no less than the New York Herald to be “out of place in fashionable college society,” and fraternities forbade their members from having any social contact with them. The forerunner to the dean of women reported in 1912 that the university had refused to commit to coeducation, was fearful that it diminished Cornell’s stature vis-à-vis Harvard and Yale, and was determined to be, “in curriculum and atmosphere, as distinctly a man’s institution as possible.” The plight of women was even worse at Wesleyan, where Methodists apparently no longer championed the education of women. Women had been admitted in 1872, but by the 1890s the chief preoccupation of Wesleyan students and alumni was status anxiety toward Amherst and Williams. The school’s difficulties were facilely blamed on coeducation. The customary activities of women were progressively circumscribed, and after 1900 women were virtually ostracized on campus. Far more than at Cornell, Wesleyan coeds were locals seeking a relatively inexpensive education, and they were accordingly treated contemptuously. The university made a pretense of considering a coordinate women’s college, but, lacking funds, trustees voted to cease admitting women in 1909. Connecticut College was founded for women in nearby New London in 1911, in part to accommodate the Wesleyan coeds.69

The rapid growth of women students in the 1890s caused a different kind of reaction among university leaders. Both William Rainey Harper and Charles Van Hise suggested establishing segregated classes, at least for the first 2 years. Their anxieties, which were widely shared, stemmed from fears that women were driving men away from literary courses, and they imagined a threat to the masculine character of the campus. Harper’s initiatives faltered, and Van Hise provoked a barrage of intemperate criticism by merely raising the subject.70 Both gynephobes and women’s advocates wished for curricula specifically tailored to acceptable feminine roles. Early “domestic courses” teaching household skills, pioneered at Iowa State and Kansas State, had poor enrollments and reputations. Only much later did Ellen Swallow Richards, the first female graduate of and faculty member at MIT, initiate a movement to found a science-based field of home economics. In annual conferences at Lake Placid (1899–1907), she laid the foundation for applying chemistry (her field) to household economy. Home economics was readily incorporated into state universities (Wisconsin, Illinois, California) as an applied science promising careers for women. However, Carey Thomas essentially blackballed home economics as a “women’s subject” in liberal arts colleges.71

The same aversion to coeducation lay behind a number of efforts to establish coordinate colleges. These creations were motivated by contradictory aims: either to establish women’s education where all-male colleges would not tolerate coeducation or to solve the woman problem at coeducational schools through segregation. Carey Thomas identified five coordinate colleges in Education of Women. Four were of the first kind: Radcliffe, Barnard, Sophie Newcomb, and the Women’s College of Brown (Princeton’s feeble entry, Evelyn College, 1887–1897, had closed). Radcliffe had begun as the “Annex” in 1879, where women were taught by Harvard faculty volunteers, and it was incorporated at but not of Harvard University in 1894. Sophie Newcomb Memorial College for Women became a coordinate college of Tulane University when it was endowed in memory of its namesake in 1885. The others were supported by the respective presidents, especially Benjamin Andrews of Brown, and combined separate women’s classes with some upper-level coeducation. Radcliffe and Barnard were quickly recognized as “Sisters.” Sophie Newcomb had no academic links with Tulane but aspired to bring the model of the Seven Sisters to the South.72

The other coordinate college was established by Western Reserve in order to eject its women. President Carroll Cutler (1872–1887) was a vigorous proponent of women’s education and had opened the college to them when he assumed office. But the men were never reconciled. In 1884 the entire faculty petitioned to end coeducation, and male students boycotted classes when Cutler convinced the board otherwise. His successor, however, wished to restore the college’s image as the Yale of the West. He feared that coeducation was discouraging wealthy Clevelanders from supporting the college. In 1888 the girls were exiled to the College for Women, which was launched with almost no resources. But the strategy apparently appealed to Cleveland’s moneyed elite, who began to support both the men’s and the women’s colleges with a stream of gifts. Cooperation developed between the two colleges. Coeducation, it seemed, was tolerable academically when each gender had its own separate social base and extracurricular life. A similar scenario occurred later at Tufts College, which had admitted women in 1892. In 1907, however, the president declared, “the average young man will not go to a coeducational institution if other things are anywhere near equal.” Jackson College was established for women in 1910 to banish the stigma of coeducation. Here, too, after a decade the colleges became increasingly integrated. Coordinate colleges were contemplated at many institutions (as at Wesleyan) as a solution to the supposed liabilities of coeducation but in most cases proved infeasible financially or politically.73

Of course, coeducation was anathema among more conservative constituencies, especially Catholics and Southerners. In both cases, support for women’s higher education came belatedly, for calculated reasons, and was tolerable only in protected all-female institutions. Catholics were motivated by the realization that large numbers of Catholic girls were attending secular colleges. The first Catholic college for women, The College of Notre Dame, was chartered in Maryland in 1896 and graduated its first class in 1899. Eighteen more colleges were established by religious orders by 1915 and another fifty-six, by 1930. These colleges were organized much like cloisters, seeking to shield their charges from the outside world and protect their faith. The destiny of students was marriage and motherhood, with the possibility of teaching. Academically, their offerings were limited, since the religious who taught generally had not themselves attended college.74

In the South, female colleges took three forms. An abundance of weak institutions that the Bureau of Education classified in “Division B” operated in the twilight zone between secondary and higher education. Only the strongest of these schools survived as women’s colleges, but even then they retained legacies of ornamental and vocational courses. In the 1890s, southern states became conscious of their enormous educational deficit and consequent need for teachers. They responded by establishing “normal and industrial colleges for white girls.” Tuition was remitted for students promising to teach for 2 years, and most students took this option. Private women’s colleges comparable to those in the North were few and late to appear. Sophie Newcomb Memorial College was the result of a windfall bequest, and Agnes Scott College in Georgia was raised from a traditional seminary only by the philanthropy of the Scott family, becoming a college in 1906. As late as 1903, Randolph-Macon Woman’s College, an endowed institution opened in 1893, alone offered a 4-year course, although most students stayed for only 2. That year the Southern Association of College Women formed to raise standards in the region’s colleges. However, its first survey more than a decade later identified 7 “standard colleges” out of 140. Most women’s colleges remained largely finishing schools with inadequate resources spread thinly over several levels of programs.75

In 1915, a half-century after the opening of Vassar, advocates of women’s higher education had many reasons to be disappointed. By their record, women had refuted every argument that had been raised against their access to advanced education, only to be accused of fomenting “race suicide” because of the low fecundity of intelligent, Anglo-Saxon graduates.76 Women now possessed richly resourced liberal arts colleges, but their educational efficacy was clearly diminished by the frivolous preoccupations of their students. Women now formed the majority of literary college students, or close to it, at major state universities, but their experience was still circumscribed by male dominance of the campus. Women had proved their intellectual mettle in PhD programs but were given few chances to advance as scientists and scholars in coeducational universities. When normal schools are included, an equal number of men and women were obtaining advanced education in 1910, but 60 percent of the women were in normal schools or Division B schools, institutions inferior to colleges.77 The Association of Collegiate Alumnae recognized only twenty-four institutions as meeting its standards. Perhaps most galling, discussions of higher education still characterized all women by a single, undifferentiated, cultural stereotype.78 Above all, this stereotype impeded women from using higher education for economic or social advancement. Of college-educated women who worked in this era, a large majority became teachers, and the rest were mostly librarians, nurses, or social workers.79

In one sense, women were granted access to the world of academic knowledge through admission to graduate and professional schools. Women gained admission to doctoral programs in the early 1890s, led by Yale, Columbia, and Chicago. In that decade, 204 women earned doctorates. However, women scientists were relegated to marginal positions in the world of science—professorships at women’s colleges at best—or were segregated in women’s fields of home economics and hygiene. These occupational patterns hardened after 1910.80 Law and medicine were more difficult to enter, despite the unsettled state of professional education already described. Flexner declared, “No woman desiring an education in medicine is under any disability in finding a school to which she may gain admittance,” including Johns Hopkins, Cornell, or Michigan. Nevertheless, he found the number of female students and graduates to be declining. In 1920 women received 6 percent of MDs and comprised 6 percent of physicians and surgeons. The rise of law schools opened legal training to women, since preparation in a law office was impossible for them. Women increased from 1 to 6 percent of law graduates from 1900 to 1920, but in the latter year they comprised barely over 1 percent of lawyers, and many apparently did not practice.81 American society did not allow knowledge alone to provide access to careers. Some determined women naturally overcame these odds, but as long as the linkage between schooling and careers was tenuous, their numbers remained small. After World War I, the status of women would change markedly. They obtained the right to vote, greater personal freedom, relaxed codes of behavior, and greater participation in the labor force. And they also attended colleges and universities in ever-rising numbers. However, the cultural barriers that kept women from realizing comparable economic and professional rewards from investments in higher education would persist until well after World War II.

LIBERAL CULTURE

The academic revolution and the collegiate revolution were far-reaching transformations in the history of American higher education. That they occurred simultaneously made the fin de siècle era both pathbreaking and exhilarating. Both revolutions were implacable and irreversible, yet they drew universities and colleges in opposite directions.82 The consequent tensions were felt most acutely in undergraduate education, and by the turn of the century they could no longer be ignored. Critics of Germanic erudition attacked specialization, arguing that increasingly esoteric expertise had little to offer most college students. The proliferation of courses combined with the elective system, they charged, had produced “curricular incoherence.” An emerging body of humanists had a particular grievance, that the narrowly empirical approaches to philology and history had drained aesthetic sensibility from literary studies. Before 1890, a liberal education had meant the classical course; now there was no consensus on what studies should take its place, or why. On the collegiate side, the negative effects were all too apparent. The numerous extracurricular activities, particularly athletics, had become absorbing preoccupations and too often the source of boorish behavior. The supposed social benefits of college were curtailed for many by residential and social stratification. Too many students took advantage of lecture courses and electives to minimize effort. Most discouraging, disdain for study and learning was increasingly evident. But there was no turning back. Disciplinary knowledge had enriched the curriculum, and institutions now hired only teachers with advanced training. Collegiate life had infused enormous vitality into the student experience. Moreover, it had made college attractive to a far larger population by promising social advancement and productive careers. How could institutions reconcile scholarly, liberal, and collegiate values?

One model was at hand, if impossible to replicate: Oxford and Cambridge. Anglophilia was thoroughly embedded in the American upper class in the late nineteenth century, and educators were familiar with Oxbridge from an adulatory literature and occasional visits. There they imagined a harmonious blending of liberal learning, sport, student socialization, and academic community. These effects were produced in the residential colleges. The universities employed learned professors, but students were taught by fellows in their colleges. They could choose to study for a pass or an honors degree, with the latter requiring close work with tutors to prepare for rigorously graded general examinations. The colleges nurtured frequent interactions among students and fellows, fostering both cultural and intellectual socialization. And for honors students, at least, genuine learning was demanded. Much as German PhDs had for the previous generation, Oxbridge provided inspiration for reforming American universities.83

The conflict caused by the two revolutions was felt acutely by the Big Three, accustomed to lead in both academic excellence and collegiate endeavors and committed to nonvocational education in the liberal arts. Each faced somewhat different manifestations of the same root problem. At Harvard, students exploited the elective system to take easy courses and avoid serious study. At Yale, students were casually dismissive toward the retrograde curriculum and boastful about cheating. Princeton’s academic laxity under president Patton was a tacit scandal in academic circles. After 1900, each school felt compelled to address the abject state of student learning. In 1902 Harvard appointed the Committee on Improving Instruction and Yale an unnamed committee for the same purpose.84 Harvard discovered that students spent only half the expected time on their studies; the Yale committee concluded that “hard study has become unfashionable at Yale.”

The guiding hand of the Harvard committee was Abbott Lawrence Lowell, a critic of Eliot’s elective system and soon to become his successor. The committee report was a turning point toward a more structured curriculum and presaged Lowell’s dedication as president to greater student achievement. Yale too instituted piecemeal reforms to stiffen the course of study. Its new president, Arthur Twining Hadley (1899–1920), favored strengthening undergraduate education, but any actions had to come from the conservative Yale faculty, who in this case were motivated to act. As reforms were made, student scholarship appeared to improve from its nadir in the classes of 1904 and 1905. The class of 1904 claimed in its yearbook “more gentlemen and fewer scholars than any other class in the memory of man,” while the next class boasted that never had the Heavens witnessed “a class whose scholarship approached so close to naught.”85 At Princeton, in contrast, the new president not only acted decisively to raise the academic performance of students, he also promoted a new vision of the goals of undergraduate education.

Woodrow Wilson was associated with Princeton for 24 years as a student (1875–1879), a professor (1890–1902), and president (1902–1910). In this last position he sought to implement a vision for college education known as “liberal culture.” Largely formed by his earlier experience, aspects of liberal culture garnered growing sympathy, but Wilson molded them into a compelling set of ideas that he articulated with force and persistence as a new ideal for American colleges.86 Wilson possessed extraordinary gifts of intellect and magnetism, which were evident throughout his career. His student days at Princeton coincided with the blossoming of the high collegiate era, including intellectual pursuits like debating. Wilson indulged in them all, being elected to two of the most prestigious posts, editor of the newspaper and secretary of the football association. Wilson cherished this close-knit community and the formative experiences it offered. After Princeton, he studied law at the University of Virginia and was a student leader there as well. But he found legal practice distasteful, and in 1883 he decided to pursue an intellectual career by enrolling at Johns Hopkins in the department of history, politics, and economics. There he found the “minute examination of particulars” under Herbert Baxter Adams insufferable and worked instead on his own comparison of congressional and parliamentary government. Wilson impressed faculty and fellow students despite his aversion to German-style erudition and later became a regular visiting lecturer on government (1888–1898). In 1885 he was hired as a founding faculty member at Bryn Mawr on Adams’s recommendation. Although this post allowed him to advance his own scholarly work, his aversion to teaching women made this a trying time. An offer from Wesleyan provided a 2-year interlude and a chance to informally coach the football team while waiting for a call to his alma mater.87

Wilson emerged from these years with powerful convictions about American higher education. Remarkably, he opposed the dominant trends of the day—empirical research in the Germanic tradition, all forms of practical college training, coeducation, and the elective system. Instead, he affirmed the goal of liberal culture in various iterations over the remainder of his academic career. Liberal meant that the focus was entirely on the liberal arts, or what Wilson once called pure literature, pure philosophy, pure science, and his own specialty, history and politics. Culture meant that the object was to instill “the intimate and sensitive appreciation of moral, intellectual, and aesthetic values.” But such training was “not for the majority who carry forward the common labor of the world…. It is for the minority who plan, who conceive, who superintend, who mediate between group and group and must see the wide stage as a whole.” Thus, the end purpose was preparation for leadership and national service, and Wilson emphasized these ends as the goal for both Princeton University and its graduates. Finally, these effects could be gained only by a full 4 years in a “compact and homogeneous” residential college—“you cannot go to college on a streetcar and know what college means.”88 Liberal culture might, in theory, be open to all, but practically it was the province of those who could afford to reside for 4 years at a liberal arts college.

By the time Wilson rejoined Princeton, he had perfected the skills that made him one of the most effective orators of the era—not just in style but in substance. He easily became a popular professor at Princeton and lectured constantly to alumni, civic, and educational groups. In these speeches, he played to audience distrust of universities by disparaging pedantry, “narrow particularistic technical training,” the scientific method of investigation, or the “chaos” of unchecked electives. His audiences understood him perfectly when he lauded the kind of elite, liberal training that he envisaged for Princeton. His most spectacular triumph occurred at the gala celebration of Princeton’s sesquicentennial, where he addressed an audience of national and international celebrities on “Princeton in the Nation’s Service” (1896). The speech extolled the crucial role of Princetonians in the Revolution and the Early Republic, but particularly “the generous union then established in the college between the life of philosophy and the life of the state.” To rekindle that spirit, he explicitly rejected science as a method and instead advocated “full, explicit instruction in history and politics, in the experiences of peoples and the fortunes of governments,” so long as it was undertaken “like a man and not like a pedant.” Wilson’s message was greeted with thunderous applause. His combination of iconoclasm and idealism obviously struck a responsive chord, and Wilson emerged as the person who could realize the vision.89

Wilson assumed the presidency of Princeton in 1902 with a mandate to transform rhetoric into reality. His overriding goal was to enhance student learning and mold it to his conception of liberal culture. Given the neglect of the Patton administration, this required reorganizing faculty and courses. Wilson replaced the single Academic Department with eleven disciplinary departments in four divisions. He fired some of the most inept professors, making clear that rigorous teaching was now demanded of all instructors. He then worked with the faculty to require all students to major in an academic department for their junior and senior years. Although scarcely a novel concept, the Princeton curriculum established a balance between breadth and depth—concentration and electives—that served as a model for other colleges. With the academic house now put in order, Wilson sought to bolster this system with his most original innovation. He hired young scholars to work closely with students in small groups to guide and enhance their learning. Wilson called them “preceptors”—an English term, although the obvious inspiration was Oxbridge tutors. Enlisting the support of wealthy trustees and alumni, Wilson hired forty-five preceptors in 1905, augmenting the faculty by more than 40 percent with young, engaged scholars. With this audacious step, Wilson recruited emerging academic talents who served as productive faculty for decades afterward. It also placed the crown of success on Wilson as an academic visionary, locally and nationally.90

Wilson’s objectives then escalated from ambitious to utopian, and his leadership from consensual to dogmatic.91 He felt that his ideal of liberal culture for Princeton was frustrated by disunity in the college, chiefly caused by the domination of social life by the exclusive junior-senior eating clubs. Intense social competition for club membership and the priority accorded to wealth and social status, Wilson now charged, had undermined what he considered the democratic foundations of the college. At the end of 1906, Wilson launched a campaign to achieve the social “coordination of the undergraduate life with the teaching of the university” by proposing residential quadrangles to supersede the clubs. Wilson’s rhetoric initially convinced the trustees to endorse his Quad Plan, but it soon faced mounting opposition from wealthy younger alumni—largely former club members. Although there was never a concrete plan to finance such a scheme, Wilson adamantly fought this opposition, inflating the plan’s significance by invoking a national mission—“because Princeton is the national leader among all the Universities of America.” His intransigence only eroded support until the Quad Plan was decisively rejected by the board.

This struggle complicated a longstanding effort to build a residential graduate college. His antagonist in this dispute was Graduate Dean Andrew F. West, a more extreme Anglophile than Wilson. West had already placed his stamp on a temporary graduate residence, where he sought to cultivate gentlemen scholars with all the trappings of an Oxbridge college. Although Wilson and West held fairly similar views, they fell out over the location of the college but more fundamentally over who would set university policy. In this acrimonious conflict, Wilson further alienated his former backers among trustees and alumni, all to West’s pointed advantage. When he was decisively repudiated on this issue, Wilson opted to accept the Democratic nomination for governor of New Jersey. Although Wilson was humbled at Princeton largely through his own ill-advised campaigns, his academic reputation remained untarnished. In defending his positions, he articulated an interpretation of liberal culture that was more inclusive in stressing campus unity and less elitist in condemning the undue influence of wealth and social status.92 By 1910, his most important academic admirer was president of Harvard.

Forty years after the installation of Charles W. Eliot, Abbott Lawrence Lowell announced a new orientation for Harvard by devoting his entire inaugural address to his vision for the undergraduate college. Lowell tacitly accepted the achievements of Eliot’s reign but asserted “we must go forward and develop the elective system,” which meant structuring student choice and retaining students for 4 years. The contemporary college had lost the solidarity that had formerly been an important part of student learning, and he cited Wilson on the “chasm that has opened between college studies and college life.” Like Wilson, he aimed to reshape and invigorate student scholarship and to restore some measure of social integration. His approach was essentially the same, but his temperament was not. Lowell was a Boston Brahmin and Harvard aristocrat with supreme confidence in his own judgments, but he also understood the virtues of patience and process.93

Lowell immediately established a system of “concentration and distribution,” much like Princeton’s, since “the best type of liberal education in our complex modern world aims at producing men who know a little of everything and something well.” This was followed by provision for divisional examinations, although the faculty took several years to implement them. They were complemented by the addition of tutors to work individually with students and particularly help them prepare for divisional exams. Finally, Harvard established honors degrees, distinguished by an honors thesis. Perhaps learning from Wilson’s missteps, Lowell sought the reintegration of social life incrementally by establishing residences for freshmen very much on the Oxbridge model: “dormitories and dining halls, under the comradeship of older men, who appreciated the possibilities of a college life, and took a keen interest in [students’] work and their pleasures.” While falling short of the “social coordination” envisaged by Wilson, this was still a significant step toward democratization of college life. And it was feasible, being accomplished by 1914. Under Lowell, Harvard realized the essential features of liberal culture or at least realistic approximations: raising the bar for everyone by eliminating easy options and requiring some measure of achievement; providing real incentives to motivate ambitious students to excel; and recognizing collegiate life as an integral feature of student learning. Most emphatically, in practice and in rhetoric, Lowell upheld 4 years of liberal studies as the ideal course for intellectual development and subsequent careers or professional study. Thus, he seconded Wilson in reasserting the value of a liberal arts education.94

★ ★ ★

This last affirmation was badly needed by the country’s hundreds of private colleges. For them, the demise of the classical AB course had created a curricular Purgatory, uncertainty over what direction they should, or could, take. They faced the daunting challenge of adapting to the disciplinary coursework demanded by the academic revolution. This required a different scale of operation. In 1880, when the average college had 88 students, a proper college was deemed to need 10 faculty. In 1900, 25 faculty were recommended to teach the expanded curriculum. By 1915, an “efficient college” called for 400 students and 40 faculty. At the time, one-half of private colleges had fewer than 300 students and nowhere near adequate faculties. William Rainey Harper in 1900 had estimated that only one-quarter of the colleges could remain viable. Most of them doggedly hung on, but their ambiguous status was known as the plight of the colleges. They badly needed more students, more teachers, and more money, but perhaps the greatest need was a refurbished, positive image. It was difficult to claim superiority of teaching given the demonstrable inferiority of faculty qualifications, course offerings, and laboratory facilities. Church sponsorship proved to be a two-edged sword, tapping a dependable (often shrinking) clientele, but alienating those preferring nondenominational settings. During the zenith of popularity for graduate programs, circa 1890, many colleges joined the doctoral lists. Twelve Ohio colleges granted PhDs before 1900, but only the program at Ohio State endured. Dickinson College in Pennsylvania was typical of these efforts, awarding four doctorates in the 1890s before abandoning the program as hopeless. In the tradition of the multipurpose college, many were tempted to increase vocational offerings, even though it became evident that they might bring enrollments but not prestige. Bucknell, for example, sought additional students by turning away from the liberal arts and establishing vocational programs in education, law, premed, engineering, and home economics. After 1900, it became increasingly apparent that colleges could not compete with universities on the latter’s terms but rather had to find their own turf.95

The growing consensus favoring liberal culture was encouraging for the colleges, though they could not yet identify with Yale or Princeton. The need to strengthen undergraduate education was pervasive in the 1900s and had powerful allies. Both the CFAT and the Rockefeller General Education Board placed their influence and their dollars behind this movement. The increasing use of rankings to distinguish standard colleges from their weaker brethren forced institutions to be mindful of measures of quality.96 Abraham Flexner, who would work for both foundations, published a volume in 1908 articulating contemporary criticisms of the colleges, particularly the incoherence of the curriculum. Anglophilia was given a tangible boost in 1902 by the creation of the Rhodes Scholarships for study at Oxford. Intended to strengthen ties between Britain and English-speaking nations, a Rhodes clearly had the desired effect on Frank Aydelotte, whose years at Brasenose College (1905–1907) burnished his admiration for thorough study, residential community, amateur sports, and the cultural value of literary study. He sought to inculcate these values as a teacher at Indiana University and MIT and founded the American Oxonian to popularize them more widely by mobilizing the Rhodes alumni. Given these intellectual currents, possible remedies for the predicament of the college soon appeared.97

The founding of new institutions often reflects or even exaggerates the fresh ideas of the times, as was the case with Cornell, Johns Hopkins, Bryn Mawr, and Clark universities. This was also the case in Portland, Oregon, where Simeon and Amanda Reed dedicated their $3 million estate to the establishment of an institution of higher education. The implementation of their vague wishes was placed in the hands of the local Unitarian minister and leading citizen, Thomas Lamb Eliot, who conscientiously explored all possibilities. The Reeds had specifically mentioned “practical knowledge,” but when he queried educators around the country, they responded that a technical institute was expensive and would probably duplicate offerings of the state universities. Founding a university with professional schools was also deemed too expensive. Eliot was then advised by the secretary of the General Education Board, Wallace Buttrick, that “a strong, high grade college of the arts” would fill a lacuna for the region. Buttrick further recommended for president of such a college a young professor at Bowdoin with bold ideas about college education, William Trufant Foster. He accepted the presidency of Reed College (1910–1919) on the condition that it be dedicated exclusively to the liberal arts and sciences and offer “as high grade of scholarship as any in the country.”

Foster visited about one hundred colleges, discovering widespread concern over students’ lack of interest in their studies, exacerbated by devotion to athletics and social affairs. In response, he formulated a model for what he called the “ideal college.” It had to be private to resist outside influence; it should be small to promote interaction and community; the content of courses was less important than the thoroughness with which they were taught; and it would avoid the distractions of intercollegiate athletics and fraternities. These characteristics were preconditions for its two essential purposes: “to become a Johns Hopkins for undergraduates, the Balliol of America,” in the quality of education and to be “broadly cultural and ultimately practical” by applying knowledge to socially useful purposes. With a concentration on rigor and relevance, Reed set a new model for liberal education. It quickly gained widespread recognition for academic seriousness and became an exemplar of new possibilities.98

The year after Reed opened, Alexander Meiklejohn was named president of Amherst College (1912–1923). Son of a Scottish immigrant, he attended Brown on a scholarship, followed by a PhD in philosophy at Cornell (1897). Returning to Brown to teach philosophy, Meiklejohn became a favorite with students in the classroom and on athletic fields. Named the first dean of the college in 1901, he combined an idealistic commitment to student moral and intellectual development with excellent rapport with students in extracurricular affairs. Elevated to the presidency of Amherst, Meiklejohn inspired his inaugural audience with an idealistic depiction of the “Liberal College.” Its purpose was to lead students “into the life intellectual. The college is not a place of the body, nor of the feelings, nor even of the will; it is first of all, a place of the mind.” He condemned prevailing notions that college should prepare students for practical careers, and he castigated the elective system for producing “intellectual agnosticism, a kind of intellectual bankruptcy, into which, in spite of our wealth of information, the spirit of the time has fallen.” Rather, the liberal college should provide the kind of enduring mental formation needed for long-term leadership and success. More tangibly, he asserted that a liberal education consisted of just five elements: “the contributions of philosophy, of humanistic science, of natural science, of history and of literature.” Meiklejohn thus asserted that the basic liberal arts provided a college education that was superior to the professional and specialized disciplinary offerings of universities. The Liberal College address was widely publicized and struck a responsive chord by offering a seemingly plausible ideal, given the colleges’ limitations.99

Foster and Meiklejohn both saw the enhancement of student learning as the essence of the problem, but they differed on strategies. For Foster, academic rigor was the solution, and he institutionalized it by the selective admission of motivated students, elimination of “distractions” (athletics and fraternities), and thoroughness of instruction, reinforced with senior theses and oral examinations. Meiklejohn focused on the content of the curriculum, believing that challenging material, well taught, could stimulate greater student interest and effort. At Amherst he soon proposed a core curriculum centered on social and economic institutions to provide a unifying focus for the five elements of the liberal arts. Only one such course was created, but Meiklejohn would persist in believing that the essence of liberal education could be captured with the right curricular formulas. Both men were idealists in positing the intellectual autonomy of their colleges, but Foster sought to aggressively address social problems and enlighten the community, while Meiklejohn advocated more aloof cultural criticism. Both men also shared the fate of being more successful in devising ideal colleges than administering them. Foster’s self-righteousness and political radicalism cost the college potential support, and he resigned in 1919 after falling out with the trustees. Meiklejohn injected new life into staid Amherst with young faculty and the new course, but his governance of the college was a disaster. He failed to put the interests of the institution ahead of his own preoccupations, and he was dramatically and inevitably fired in 1923. However, in the prewar years, both men accomplished much of their respective visions and helped to lay a foundation for the rehabilitation of the liberal arts college.100

The organization of the Association of American Colleges (AAC, 1915) was a concrete step in this direction. The last sector to develop a national association, the colleges lagged in defining their role in the emerging system of higher education. As a spokesman at the inaugural conference asserted, “The present-day college must show that it is doing a work that is not done and cannot be done by the university.” They were threatened from below as well by the ongoing expansion of high schools, junior colleges, and normal schools/teachers colleges. Their defensiveness was heightened by the nature of the original membership—predominantly denominational colleges from the Midwest. By bringing its rather isolated members together, the AAC served a number of purposes: it provided quality control by adopting the Carnegie standards and setting requirements for membership; its meetings and publications provided a valuable forum for comparing practices; and a permanent central office focused particularly on aiding members to strengthen administration. For example, an extensive survey was conducted to define “the efficient college,” which produced data that allowed any college to compare student/faculty ratios or expenditures per student with peers.101

Most important, the AAC gradually embraced a liberal arts education as the signature mission for its membership. Not all of its original members were classical liberal arts colleges, by any means. But as the AAC elaborated this ideal, colleges such as Bucknell and Dickinson adopted it as a distinctive and defensible mission. Foster and Meiklejohn participated in AAC meetings and helped to articulate this mission. In 1915 the country’s 400+ private colleges were still arrayed at all stages of development. The commissioner of education, addressing the AAC in 1916, repeated Harper’s observation that half of the nation’s colleges, those with incomes under $40,000, would better serve as junior colleges. The AAC represented the stronger half of colleges, but there were still large gaps among the average college, the efficient college, the liberal college, and the ideal college. The organization of the AAC and its subsequent work sought to define the nature of a liberal arts college and offer practical guidance for attaining it.102

★ ★ ★

Liberal culture in the prewar years was too diffuse to be a movement, yet too widespread to be ignored. On one hand, it was a protest against the dominant quantitative trends of the era: the predominant vocational emphasis of most institutions; the inexorable specialization of disciplinary knowledge in universities; the fragmentation of the curriculum through the elective system; and the disdain for intellect of the high collegiate era. On the other hand, it advocated a menu of educational ideals: an overriding desire to enhance the learning of undergraduates, specifically in the arts and sciences; an admiration for the educational model of Oxbridge colleges; a recognition of the learning benefits of a residential community; a similar appreciation of the developmental contributions of extracurricular activities, including (sometimes) athletics; and, above all, a belief in the value, articulated with various phrases, of a liberal education. The history of virtually every college during these years shows them wrestling with some combination of these issues.103 The negative complaints of liberal culture were principally early twentieth-century reactions to aspects of the academic revolution and the rise of applied fields. The positive aspects would have continued repercussions after World War I. Members of the AAC tended to converge toward the liberal arts model; the residential component of liberal education assumed paramount influence among private universities; Frank Aydelotte implemented a true honors curriculum as president of Swarthmore College; and Meiklejohn and others proposed new curricular formulas for capturing the essence of the liberal college.

Liberal culture was but one of the enduring legacies of the collegiate revolution. Between 1875 and 1915, the American college had been transformed for students, for the American public, and for the educational system. Where students had formerly experienced most of college life within a formal institutional structure, they now were socialized in fraternities or sororities, at football games, with the Campus Y, or in other organized student activities. Moreover, these activities were fundamentally democratic. They may have originated at eastern private schools, but by the twentieth century they were common at colleges everywhere, including normal schools and historically black colleges. They may have been most liberating for women, who created their own collegiate space on male-dominated campuses and an extraordinarily rich slate of activities at their own colleges. However, the collegiate revolution also had a huge impact on the American public. The colleges no longer appeared to be reserved for a narrow population aiming for professional careers but rather appeared open to all qualified aspirants, even if that was still a small slice of an age cohort. Colleges were portrayed as promoting the manliness and savoir faire needed for success in the business world. This image was reinforced by alumni, who, in turn, took pride in their alma maters and made donations for their improvement. In short, colleges had emerged as a clear route to middle-class careers and life styles. Colleges also assumed an unambiguous place in the educational structure. By 1915 a clear separation of secondary and higher education existed in most of the country, and blurring was stigmatized in places where separation was not yet complete. Similarly, college, or at least the first 2 years, had now become the gateway to professional schools. Liberal culture offered a theoretical justification for a changing reality, idealizing an institution whose structural position had now become fixed. It was also, sub rosa, an effort to recapture the elite character that a college education had formerly represented. The elaboration of the virtues of a liberal arts education was thus a premonition of the emergence of greater differentiation in the realm of higher education—the herald of mass higher education.


1 Cornelius Howard Patton and Walter Taylor Field, Eight O’clock Chapel: A Study of New England College Life in the Eighties (Boston: Houghton Mifflin, 1927).

2 Lyman H. Bagg, Four Years at Yale: By a Graduate of ’69 (New Haven: Chatfield, 1871); Henry Seidel Canby, Alma Mater: The Gothic Age of the American College (Murray Hill, NY: Farrar & Rinehart, 1936); George W. Pierson, Yale College: An Educational History, 1871–1921 (New Haven: Yale University Press, 1952), Santayana quote, pp. 6–7.

3 Canby, Alma Mater, 23–80; quotes pp. 28, 37, 62, 78, 68.

4 Jerome Karabel, The Chosen: The Hidden History of Admission and Exclusion at Harvard, Yale, and Princeton (Boston: Houghton Mifflin, 2005), 39–76. Edwin Slosson noted: “There are so many kinds of democracy … every university boasts the purest brand”; regarding Yale democracy amidst Yale affluence, he found it “distinctly encouraging to find that the democratic spirit is still regarded as a desirable thing to have in a university”: Great American Universities, 362, 71.

5 Helen Lefkowitz Horowitz, Campus Life: Undergraduate Cultures from the End of the Eighteenth Century to the Present (New York: Knopf, 1987).

6 Canby, Alma Mater, 48–49, 60. Canby would include himself among the “meager creatures,” having earned a PhD in English in 1905. He taught at Yale until 1916 before pursuing a literary career in publishing.

7 Pierson, Yale College, 36. Like the heroine of The Scarlet Letter, Sheffield graduates had to bear the letter S after their class numerals for the sin of attending the scientific school.

8 Edwin E. Slosson, Great American Universities, 34–74; Karabel, The Chosen, 74–76; Dan A. Oren, Joining the Club: A History of Jews and Yale (New Haven: Yale University Press, 1985), 24–37. Jewish students at Yale College rose from near 2 percent (1900) to near 6 percent (1916), with perhaps one-third coming from wealthy, assimilated homes and two-thirds (ca. fifteen) from the New Haven ghetto.

9 W. Bruce Leslie, Gentlemen and Scholars: Colleges and Community in the ‘Age of the University’ (New Brunswick: Transaction, 2005 [1986]), 95–114; David P. Setran, The College ‘Y’: Student Religion in the Era of Secularization (New York: Palgrave Macmillan, 2007); Robert Schwartz, Deans of Men and the Shaping of Modern College Culture (New York: Palgrave Macmillan, 2011).

10 Singing well represented the collegiate spirit: J. Lloyd Winstead, When Colleges Sang: The Story of Singing in American College Life (Tuscaloosa: University of Alabama Press, 2013).

11 Leslie, Gentlemen and Scholars, 110–11, 195–209; Patton and Field, Eight O’clock Chapel, 236–302. Rushes persisted at many campuses well into the twentieth century.

12 The leadership of the Y was steadfastly committed to gender separation, even though some campus Christian Associations were originally coed. Michigan, for one, was expelled in 1886 for remaining coed after the Young Women’s Christian Association was formed, primarily as a student organization: Setran, College ‘Y,’ 33–35; David P. Setran, “Student Religious Life in the ‘Era of Secularization’: The Intercollegiate YMCA, 1877–1940,” History of Higher Education Annual, 21 (2001): 7–45.

13 Setran, College ‘Y’; Scott J. Peters, The Promise of Association: A History of the Mission and Work of the YMCA at the University of Illinois, 1873–1997 (Champaign, IL: University YMCA, 1997), 18–52; Lewis Sheldon Welch and Walter Camp, Yale: Her Campus, Class Rooms, and Athletics (Boston: Page, 1900), 50–65.

14 Wayne Somers, Encyclopedia of Union College History (Schenectady: Union College Press, 2003), 304–15; Claude M. Fuess, Amherst: The Story of a New England College (Boston: Little, Brown, 1935), 287–90, 346–48.

15 Fraternity members at the University of California grew from 10 percent in 1900 to 37 percent in the 1920s: Laurie A. Wilkie, The Lost Boys of Zeta Psi (Berkeley: University of California Press, 2010), 222; at Michigan in 1924, 32 percent of men and 22 percent of women belonged to more than one hundred fraternities and sororities: Howard H. Peckham, The Making of the University of Michigan, 1817–1992, edited and updated by Margaret L. Steneck and Nicholas H. Steneck (University of Michigan, Bentley Historical Library, 1994), 168; at Wisconsin, more than one-quarter of students in 1930 lived in fraternity or sorority houses: Merle Curti and Vernon Carstensen, The University of Wisconsin: A History, 1848–1925, 2 vol. (Madison: University of Wisconsin Press, 1949), II, 503; at Illinois, which had banned fraternities until 1889, the dean of men held office in ΑΤΩ, and the University had the largest number of Greek-letter organizations in 1930: Schwartz, Deans of Men, 32–36.

16 Philip Alexander Bruce, History of the University of Virginia, 1819–1919, 5 vol. (New York: Macmillan, 1921), IV, 97–101, 335–40; V, 271–79; John K. Bettersworth, People’s College: A History of Mississippi State (n.p.: University of Alabama Press, 1953), 373–78.

17 George B. Manhart, DePauw through the Years, 2 vol. (Greencastle, IN: DePauw University, 1962), I, 133–37; Diana B. Turk, Bound by a Mighty Vow: Sisterhood and Women’s Fraternities, 1870–1920 (New York: New York University Press, 2004). Several colleges claim the first sororities; they were originally formed for mutual support in a hostile environment at recently coeducational institutions, like Asbury, only later becoming the female counterpart of a Greek system.

18 Leal A. Headley and Merrill E. Jarchow, Carleton: The First Century (Northfield, MN: Carleton College, 1966), 372–77.

19 Ronald A. Smith, Sports and Freedom: The Rise of Big-Time College Athletics (New York: Oxford University Press, 1988).

20 Ibid., 67–82. Thanksgiving Day games between traditional rivals in major cities quickly spread across the country, not least for the revenues they raised.

21 Ibid., 147–208; John S. Watterson, College Football: History, Spectacle, Controversy (Baltimore: Johns Hopkins University Press, 2000), 26–98.

22 George E. Peterson, The New England College in the Age of the University (Amherst, MA: Amherst College Press, 1964), 80–112; Marilyn Tobias, Old Dartmouth on Trial: The Transformation of the Academic Community in Nineteenth-Century America (New York: New York University Press, 1982).

23 Leslie, Gentlemen and Scholars, 43–45 et passim.

24 Karabel, The Chosen, 13–38; Kim Townsend, Manhood at Harvard: William James and the Others (Cambridge: Harvard University Press, 1996).

25 The president of Amherst stated in 1905: “The aim of the college is not to make scholars. The aim is to make broad, cultivated men … not athletes simply, not scholars simply, not dilettantes, not society men, not pietists, but all-round men,” Peterson, New England College, 30–43, quotes 31, 39.

26 Karabel, The Chosen, 25.

27 Patton’s failures are depicted by P. C. Kemeny, Princeton in the Nation’s Service: Religious Ideals and Educational Practice, 1868–1928 (New York: Oxford University Press, 1998), 87–126; his success with alumni by Joby Topper, “College Presidents, Public Image, and the Popular Press: A Comparative Study of Francis L. Patton of Princeton and Seth Low of Columbia, 1888–1902,” Perspectives on the History of Higher Education, 28 (2011): 63–114, quote p. 69.

28 Daniel A. Clark, Creating the College Man: American Mass Magazines and Middle-Class Manhood, 1890–1915 (Madison: University of Wisconsin Press, 2010).

29 Karabel, The Chosen; Joseph A. Soares, The Power of Privilege: Yale and America’s Elite Colleges (Stanford: Stanford University Press, 2007).

30 The following draws on Marc A. VanOverbeke, The Standardization of American Schooling: Linking Secondary and Higher Education, 1870–1910 (New York: Macmillan, 2008); and Harold S. Wechsler, The Qualified Student: A History of Selective Admissions in America (New York: Wiley, 1977), 16–130.

31 Roger L. Geiger, “The Crisis of the Old Order: The Colleges in the 1890s,” in Geiger, American College in the Nineteenth Century, 264–76.

32 VanOverbeke, Standardization, 156.

33 William Rainey Harper, The Prospects of the Small College (Chicago: University of Chicago Press, 1900).

34 Hugh Hawkins, Between Harvard and America: The Educational Leadership of Charles W. Eliot (New York: Oxford University Press, 1972), 224–62; VanOverbeke, Standardization of American Schooling, 115–42.

35 Geiger, “Crisis of the Old Order,” 268–75.

36 Chicago Medical College, affiliated with Northwestern (1870), offered a 3-year graded course, and Michigan had employed salaried medical professors since 1851. The following draws on Kenneth M. Ludmerer, Learning to Heal: The Development of American Medical Education (New York: Basic Books, 1985); a comprehensive overview is provided in Abraham Flexner, Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching, Bulletin No. 4 (New York: CFAT, 1910); for a panorama of medical education in Chicago, the city with the most medical schools, Winton U. Solberg, Reforming Medical Education: The University of Illinois College of Medicine, 1880–1920 (Chicago: University of Illinois Press, 2009).

37 Flexner’s emphasis on learning by doing placed him among progressive educators: Ludmerer, Learning to Heal, 166–90; Thomas Neville Bonner, Iconoclast: Abraham Flexner and a Life in Learning (Baltimore: Johns Hopkins University Press, 2002), 69–90. Flexner’s brother Simon was one of the country’s leading medical scientists and educators.

38 Flexner, Medical Education, 11; William G. Rothstein, American Physicians in the Nineteenth Century (Baltimore: Johns Hopkins University Press, 1985), 287–94; Solberg, Reforming Medical Education, 139–74; Ludmerer, Learning to Heal, 139–206.

39 Quoted in Arthur E. Sutherland, The Law at Harvard: A History of Ideas and Men, 1817–1967 (Cambridge: Harvard University Press, 1967), vii.

40 Bruce A. Kimball, The Inception of Modern Professional Education: C. C. Langdell, 1826–1906 (Chapel Hill: University of North Carolina Press, 2009); Robert Stevens, Law School: Legal Education in America from the 1850s to the 1980s (Chapel Hill: University of North Carolina Press, 1983).

41 Quoted in Sutherland, Law at Harvard, 175.

42 Ibid., 168; Alfred Z. Reed, The Training for the Public Profession of the Law (New York: CFAT, 1921). Virtually all requirements of professional schools in this era contained loopholes or exemptions, which tended to be tightened over time. Harvard Law offered nongraduates the option of taking a rather stiff entrance examination from 1875 to 1909 (Sutherland, Law at Harvard, 168–70). The difficulties of achieving these reforms against opposition within and outside the Law School is detailed by Kimball, Inception, 193–232.

43 Kimball, Inception, 210–29, quotes pp. 224, 266; the pass rate on Harvard examinations appeared to be 70 to 80 percent, but a substantial number of students chose not to sit the exams: Reed, Training, 356–68.

44 Kimball, Inception, 264–70.

45 Kathleen A. Mahoney, Catholic Higher Education in Protestant America: The Jesuits and Harvard in the Age of the University (Baltimore: Johns Hopkins University Press, 2003); Kimball, Inception, 295–308, 341–42.

46 Bruce A. Kimball depicts this complex process extending 25 years, to 1915: “The Proliferation of the Case Method in American Law Schools: Mr. Langdell’s Emblematic ‘Abomination,’ 1890–1915,” History of Education Quarterly, 46, 2 (Summer 2006): 192–247.

47 Robert A. McCaughey, Stand Columbia (New York: Columbia University Press, 2003), 138–40, 182–84; Stevens, Law School, 51–72, 96–98.

48 Stevens, Law School, 73–91; Reed, Training, 434–52; Dorothy E. Finnegan, “Raising and Leveling the Bar: Standards, Access, and the YMCA Evening Law Schools, 1890–1940,” Journal of Legal Education, 55, 1–2 (Mar./June 2005): 208–33; Joseph E. Kett, The Pursuit of Knowledge under Difficulties: From Self-Improvement to Adult Education in America, 1750–1990 (Stanford: Stanford University Press, 1994), 261–69.

49 Stevens, Law School, 92–115; Kett, Pursuit of Knowledge, 264–67.

50 Reed, Training, 398–99; Finnegan, “Raising and Leveling the Bar,” 231–33.

51 Roger L. Geiger, “The ‘Superior Instruction of Women,’ ” in Geiger, American College in the Nineteenth Century, 183–95; Nancy E. Durbin and Lori Kent, “Postsecondary Education of White Women in 1900,” Sociology of Education, 62 (Jan. 1989): 1–13.

52 Methodists had always supported women’s education, and Asbury (1867), Northwestern (1869), and Ohio Wesleyan became coeducational, as did new foundings Syracuse and Vanderbilt; Bates (f. 1863) was the first coeducational New England college, followed by Colby and the University of Vermont (1871). Coeducation was specifically rejected by Amherst, Brown, Williams, and Middlebury (until 1883): David B. Potts, Wesleyan University, 1831–1910 (New Haven: Yale University Press, 1992), 98–105.

53 Helen Lefkowitz Horowitz, Alma Mater: Design and Experience in the Women’s Colleges from Their Nineteenth-Century Beginnings to the 1930s (Boston: Beacon Press, 1984).

54 Barbara Miller Solomon, In the Company of Educated Women: A History of Women and Higher Education in America (New Haven: Yale University Press, 1985).

55 Sue Zschoche, “Dr. Clarke Revisited: Science, True Womanhood, and Female Collegiate Education,” History of Education Quarterly, 29, 4 (Winter 1989): 545–69; Charlotte Williams Conable, Women at Cornell: The Myth of Equal Education (Ithaca: Cornell University Press, 1977), 66–74.

56 Solomon, Company of Educated Women, 115–40.

57 Horowitz, Alma Mater, 28–81.

58 Lynn D. Gordon, Gender and Higher Education in the Progressive Era (New Haven: Yale University Press, 1990), 55–70. Point systems for women’s activities were used at Smith, California, and Minnesota and, no doubt, at other schools; Turk, Bound by a Mighty Vow.

59 California was fortunate to receive the benefactions of Phoebe Apperson Hearst, who took a direct interest in Cal women. She donated scholarships and athletic facilities and, for a time, supported a female physician: Gordon, Gender and Higher Education, 56–60.

60 Jana Nidiffer, Pioneering Deans of Women: More than Wise and Pious Matrons (New York: Teachers College Press, 2000).

61 Gordon, Gender and Higher Education, 66–71; Ruth Bordin, Women at Michigan (Ann Arbor: University of Michigan Press, 1999); Claudia Goldin, “The Meaning of College in the Lives of American Women: The Past One-hundred Years,” National Bureau of Economic Research, Working Paper No. 4099 (June 1992).

62 Horowitz, Alma Mater, 147–68.

63 Ibid., quotes pp. 148, 163; Patricia Ann Palmieri, In Adamless Eden: The Community of Women Faculty at Wellesley (New Haven: Yale University Press, 1995), 199–211, quote p. 203.

64 Katherine Sedgwick, “An Ambiguous Purpose: Religion and Academics in the Bryn Mawr College Curriculum, 1885–1915,” Perspectives on the History of Higher Education, 27 (2008): 65–104. The liberal Hicksite Quakers established Swarthmore as a coeducational college in 1864.

65 For the most thorough treatment of Thomas, including her lifelong attachments to women: Helen Lefkowitz Horowitz, The Power and the Passion of M. Carey Thomas (New York: Knopf, 1994).

66 Thomas was, by 1894, acting as president; she had the support of the outgoing president and her trustee father; her companion, Mary Garrett, pledged to give $10,000 (ca. 10 percent of the college budget!) each year that Thomas was president. Yet, trustees were reluctant to name a woman, and many wanted to assert the Quaker character of the college. She was elected president on a 7 to 5 vote: Horowitz, Power and Passion, 257–64.

67 Ibid., p. 318. After 1910, Thomas experienced internal resistance to her autocracy from students and faculty.

68 Ibid., 315–23, quote p. 318; M. Carey Thomas, Education of Women, Monographs on Education in the United States, Nicholas Murray Butler, ed. (Washington, D.C.: U.S. Department of Education, 1900), quote p. 40.

69 Conable, Women at Cornell, quote p. 117; Morris Bishop, A History of Cornell (Ithaca: Cornell University Press, 1962), quote p. 420; Potts, Wesleyan, 212–20.

70 Gordon, Gender and Higher Education, 113–15; Thomas Woody, A History of Women’s Education in the United States, 2 vol. (New York: Octagon, 1966, [1929]), II, 290–94. For Penn’s aversion to women, see Sarah Manekin, “Gender, Markets, and the Expansion of Women’s Education at the University of Pennsylvania, 1913–1940,” History of Education Quarterly, 50 (Aug. 2010): 298–323.

71 R. D. Apple, “Liberal Arts or Vocational Training? Home Economics Education for Girls,” in S. Stage and V. Vincenti, eds., Rethinking Home Economics: Women and the History of a Profession (New York: Cornell University Press, 2004); Sarah Stage, “Richards, Ellen Henrietta Swallow,” American National Biography Online Feb. 2000, http://www.anb.org/articles/13/13–01382.html (Nov. 5, 2013); Maresi Nerad, The Academic Kitchen: A Social History of Gender Stratification at the University of California, Berkeley (Albany, NY: SUNY Press, 1999). Home economics received a huge boost when it was included in the Smith-Lever Act (1914) for cooperative extension and the Smith-Hughes Act (1917) for teachers.

72 Woody, History, II, 304–20.

73 Ibid., 318–20; C. H. Cramer, Case Western Reserve: A History of the University, 1826–1976 (Boston: Little, Brown, 1976), 89–105.

74 Tracy Schier and Cynthia Russett, eds., Catholic Women’s Colleges in America (Baltimore: Johns Hopkins University Press, 2002), 25–60; Edward J. Power, A History of Catholic Higher Education in the United States (Milwaukee: Bruce, 1958), 183–97.

75 Amy Thompson McCandless, The Past in the Present: Women’s Higher Education in the Twentieth-Century South (Tuscaloosa: University of Alabama Press, 1999), 18–38; Gordon, Gender and Higher Education, 165–88; Woody, History, 187.

76 Woody, History, 295–302; Palmieri, Adamless Eden, 217–31.

77 In 1909–1910, 158,620 women were enrolled in colleges and universities (43,441), Division A women’s colleges (8,874), Division B (11,690), and normal schools (94,615); 157,401 men were enrolled in colleges and universities (119,578) and normal schools (37,823): Bureau of Education, Biennial Survey of Education, 1916–1918, 4 vol. (Washington, D.C.: GPO, 1921), III, 686; IV, 10.

78 Woody, History, 189–92, 290–303; Palmieri, Adamless Eden, 211–16.

79 Goldin, “Meaning of College in the Lives of American Women.”

80 Margaret W. Rossiter, Women Scientists in America: Struggles and Strategies to 1940 (Baltimore: Johns Hopkins University Press, 1982). Women earned 10 percent of PhDs in 1901 and 20 percent in 1921.

81 Flexner, Medical Education, 178–79; Woody, History, 333–81. For women in Chicago medical schools: Solberg, Reforming Medical Education, 86–94.

82 These disparities are treated in somewhat different form by Laurence R. Veysey, The Emergence of the American University (Chicago: University of Chicago Press, 1965).

83 Alex Duke, Importing Oxbridge: English Residential Colleges and American Universities (New Haven: Yale University Press, 1996), 39–64; W. Bruce Leslie, “Dreaming Spires in New Jersey: Anglophilia in Wilson’s Princeton,” in James Axtell, ed., The Educational Legacy of Woodrow Wilson: From College to Nation (Charlottesville: University of Virginia Press, 2012), 97–121. Oxbridge had critics in Britain: “Down to 1914, aristocratic idleness and hearty philistinism cohabited at Oxbridge with serious scholarship and academic competition”: Robert D. Anderson, European Universities from the Enlightenment to 1914 (Oxford: Oxford University Press, 2004), 198.

84 The year 1902 marked heightened awareness of the Oxbridge model as inspiration for liberal culture in American higher education with the publication of John Corbin’s An American at Oxford (Boston: Houghton Mifflin, 1902) and the creation of the Rhodes Trust to fund study at Oxford by college graduates of English-speaking nations: Duke, Importing Oxbridge, 54–63.

85 Samuel Eliot Morison, Three Centuries of Harvard (Cambridge: Harvard University Press, 1936), 385–87; Pierson, Yale College, 232–57, quotes pp. 240, 629, 241n.

86 Laurence R. Veysey, “The Academic Mind of Woodrow Wilson,” Mississippi Valley Historical Review, 49, 4 (March 1963): 613–34; James Axtell, “The Educational Vision of Woodrow Wilson,” in Axtell, Educational Legacy, 9–48.

87 On Wilson’s academic life: Henry W. Bragdon, Woodrow Wilson: The Academic Years (Cambridge: Harvard University Press, 1967); John M. Mulder, Woodrow Wilson: The Years of Preparation (Princeton: Princeton University Press, 1978).

88 Veysey, “Academic Mind,” 632, 633; Axtell, “Educational Vision,” 28.

89 Veysey, “Academic Mind,” 618–21; Bragdon, Woodrow Wilson, 284–86; Woodrow Wilson, “Princeton in the Nation’s Service,” in Woodrow Wilson, ed., College and State: Educational, Literary and Political Papers (1875–1913), 2 vol. (New York: Harper & Brothers, 1925), 259–85, quotes pp. 275, 280. Key alumni considered Wilson to be indispensable and in 1898 executed a secret contract promising him $2,500 per year to remain at Princeton for the next 5 years: Bragdon, Woodrow Wilson, 227.

90 Axtell, “Educational Vision”; Mulder, Woodrow Wilson, 157–86; Bragdon, Woodrow Wilson, 287–311.

91 Wilson scholars have speculated that a severe stroke he suffered in 1906 affected his personality, leaving him more single minded and authoritarian: Axtell, “Educational Vision,” 35–36; Mulder, Woodrow Wilson, 225–28.

92 Mulder, Woodrow Wilson, 185–225, quote p. 199. During this last campaign (1909), Wilson made his famous complaint: “the side shows are so numerous, so diverting—so important, if you will—that they have swallowed up the circus”: quoted in Leslie, Gentlemen and Scholars, 189n. Wilson’s departure was a victory for social elitism at Princeton (Bragdon, Woodrow Wilson, 384–409), but not a defeat for his academic reforms: see James Axtell, The Making of Princeton University (Princeton: Princeton University Press, 2006).

93 Morison, Three Centuries of Harvard, 439–49; Abbott Lawrence Lowell, “Inaugural Address (October 6, 1909),” in Samuel Eliot Morison, ed., The Development of Harvard University Since the Inauguration of President Eliot, 1869–1929 (Cambridge: Harvard University Press, 1930), lxxvii–lxxxviii, quote p. lxxxvii.

94 Ibid., lxxxiv, lxxxvii.

95 Geiger, “The Crisis of the Old Order”; Charles Coleman Sellers, Dickinson College: A History (Middletown, CT: Wesleyan University Press, 1973), 290; W. Bruce Leslie, Gentlemen and Scholars, 177–88.

96 After 1900, efforts to classify institutions according to the quality of education proliferated. The CFAT was compelled to make such judgments for eligibility for pensions. In 1908 the AAU began an ongoing effort to classify colleges on the basis of the fitness of their graduates for graduate study. In 1910–1911 the Bureau of Education made a comprehensive classification of colleges on the same basis, which was suppressed as too controversial to publish: David S. Webster, Academic Quality Rankings of American Colleges and Universities (Springfield, IL: Charles C. Thomas, 1986).

97 Abraham Flexner, The American College (1908); Frances Blanchard, Frank Aydelotte of Swarthmore (Middletown, CT: Wesleyan University Press, 1970).

98 Dorothy Johansen, “History of Reed College to 1920,” ms. (1984), Reed Digital Archives, quotes, pp. 17, 52, 64–65; “Comrades of the Quest: The Story of Reed College” (adapted from the New York Times of April 15, 1917), Reed Digital Archives.

99 Alexander Meiklejohn, “What the Liberal College Is,” The Liberal College (Boston: 1920), 21–50; Adam R. Nelson, Education and Democracy: The Meaning of Alexander Meiklejohn, 1872–1964 (Madison: University of Wisconsin Press, 2001), 33–96.

100 Johansen, “Reed College”; Burton R. Clark, The Distinctive College (New Brunswick: Transaction, 1992 [1970]), 91–150; John P. Sheehy, “What’s So Funny ’Bout Communism, Atheism, and Free Love? The Radical Legacy of William Trufant Foster,” Reed Magazine (Summer 2007): 25–32; Nelson, Education and Democracy, 97–129.

101 “The Efficient College: Read before the Association of American Colleges” (Jan. 21, 1916), Association of American Colleges Bulletin, 2, 1 (1916); Hugh Hawkins, Banding Together: The Rise of National Associations in American Higher Education, 1887–1950 (Baltimore: Johns Hopkins University Press, 1992), 16–20, quote p. 41.

102 Philander P. Claxton, “The Junior College,” Association of American Colleges Bulletin, 2, 3 (1916): 104–12; Hawkins, Banding Together, 41–44.

103 For example, Nancy Jane Cable, “The Search for Mission in Ohio Liberal Arts Colleges: Denison, Kenyon, Marietta, Oberlin, 1870–1914,” PhD diss., University of Virginia, 1984, 279–339.