The big change in the social structure of the twentieth-century United States has not been in the organization and control of the productive sector of the economy, but in the superstructure built upon it. We have seen an enormous expansion of what I have called “political labor,” the rise of a huge “sinecure sector” of government employment, massive educational institutions, and the growth of the tertiary sector generally. An underlying factor allowing the growth of this nonproductive sector has been the economic pressure to keep up aggregate demand as productive technology has grown steadily more powerful while proportionately fewer workers are required. But the most direct agent of this expansion has been the growth of the educational credential system. The schools themselves have been a major component of government employment (42% of it in 1975, and 10% of all employment; see Statistical Abstract of the United States, 1976: Tables 452 and 569). The extension of school attendance has kept increasing proportions of the potential labor force occupied elsewhere (often, as in the case of college students supported by financial aid, in a condition of disguised welfare support). Educational credentials have been the means by which much of the sinecure sector has been built up. They have provided the means of building specialized professional and technical enclaves and elaborated bureaucratic staff divisions, and in general have served to monopolize jobs for specialized groups of workers and thus insulate them from pressures for directly productive work.
The United States is the most credentialized society in the world, and its educational system is correspondingly unique. Where most industrial countries have had relatively small secondary school systems, the United States not only long pushed toward universal high school attendance, but in the 1960s also came close to universal completion of high school by the teenage population. College attendance built up to nearly one-half of the young adult population, college graduation to nearly one-fifth, and postgraduate studies have expanded at an unprecedented rate. Moreover, as we have seen (Table 1.1), these trends have been building up steadily since the late nineteenth century.
The U.S. educational system has been different not only quantitatively, but also structurally in comparison with most industrial societies. Most others divide their secondary education among two or three institutions—an elite organization (the English public school, the German gymnasium, the French lycée), teaching traditional high culture and leading to university attendance; a commercial high school preparing students directly for business or clerical work; and sometimes a third technical school for engineers and technicians, or at a lower level, a vocational arts school for the potential working class (when these are not simply released from school if they fail the differentiating examination at the beginning of their teens). These forms have been described as a sponsored mobility system, in that once the branching point has been reached, the rest of the student’s career is set, and the student who passes this point is sponsored through successive levels without much attrition (Turner, 1960).
The U.S. system, by comparison, is described as a contest mobility system. There are no crucial branching points. There are no sharp divisions among different types of secondary schools, or even among different sorts of undergraduate colleges. It is fairly easy to transfer among different sorts of programs or to re-enter the system after dropping out (Eckland, 1964). The main pattern is one of continuous attrition from year to year, from high school on through advanced professional training, as the “contest” continues indefinitely. Moreover, there are no fixed end points. The occupational consequences of education in this system are purely relative. Where the European type of branching into specialized training produces distinctive types of careers at early choice points (e.g., medical or legal training is entered directly at the end of secondary school), the U.S. system continually puts off final professional identification to the very end of the sequence. This end point has changed continuously as once-elite high school degrees have become near-universal, and common undergraduate training has been supplanted by graduate-level education and, for the lucrative specialties, increasingly by postdoctoral specialized training.
In brief, the contrast is between a system in which elite occupational access is marked off early by horizontal branchings and one in which there is a continuous set of vertical ranks, formally accessible to everyone with sufficient perseverance, with elite occupational status attached to sheer longevity within the system.1 There are also differences of quality among different schools at each level of the American system,2 but these form a continuous progression rather than sharp divisions, and individual students can and do transfer horizontally among them as the contest goes on. Moreover, as we shall see, the lower-prestige forms have continuously changed to emulate the high-prestige forms.
The American system exemplifies the workings of a volatile cultural market. For this, it has had the determining conditions in abundance. A cultural market appears when material resources are invested in specialized culture-producing organizations. The more decentralized and competitive these organizations, the more competition there is in the realm of cultural consumption, especially if at the same time culture is attached to stratification, so that economic and political domination are organized on the basis of cultural group membership. In the United States around the turn of the twentieth century, these conditions approached an extreme. America’s geographical and political decentralization not only prevented any particular group from monopolizing control of the educational system (as was very nearly the case in most European societies), but also actively fostered the founding of schools at all levels, by state and local governments, church denominations, and private entrepreneurs. Not only was there no state church, but competing denominations also emulated one another in founding schools. Middle-class organization of new communities behind the receding frontier could also be accomplished by establishing schools on local initiative. And most important of all, the United States underwent a period of the most massive and diverse ethnic immigration in the history of the world. In this situation of political and geographical decentralization, the multiethnic society generated a tremendous struggle for cultural hegemony, which was fought out especially by building a massive school system.
The decentralization of control over this system meant that any specific purpose for which schools might be intended would be swallowed up by the competition among groups and institutions. Business owners might favor schools to socialize the work force (Bowles and Gintis, 1976); the average Anglo-Protestant citizen might favor them to Americanize offensively alien cultures (Katz, 1968). The plutocratic elite might build exclusive private schools to keep their children separate (Baltzell, 1958: 337–348); Catholic communities or other religious and ethnic minorities might build separate schools in the interest of their own cultural pride (Curti, 1935). But the surrounding context would not let any school alone; its students found themselves in a larger world where more and more education was becoming visible in the populace, and higher and higher levels of education were needed for occupational entry. Schools came to imitate one another’s programs: secondary schools in order to enter their graduates into high-prestige colleges, the varieties of colleges and professional schools to make themselves part of the sequence that would allow their students to claim at least a chance of attaining the highest social positions. American schools thus came to form a unified hierarchy, and one that expanded continuously in a vertical direction.
Although the competing cultures of America were most sharply distinguished originally in ethnic and religious terms, the specific content of education has become increasingly less important. As the system elaborated, the value of any particular kind and level of education came to depend less and less on any specific content that might have been learned in it, and more and more upon the sheer fact of having attained a given level and acquired the formal credential that allowed one to enter the next level (or ultimately to pass the requirements for entering a monopolized occupation). Cultures of specific groups were gradually transformed into abstract credentials. In theoretical terms, the particular cultures of everyday social transactions were transmuted into a general currency. This cultural currency became increasingly autonomous, increasingly subject to organizational forces within its own sphere of production. The process of bureaucratization within schools themselves would further shape the nature of the cultural currency into directions far removed from the original ethnic and religious contents, and indeed removed as well from whatever scientific and technical content various reformers attempted to add to it.
The rise of a competitive system for producing an abstract cultural currency in the form of educational credentials has been the major new force shaping stratification in twentieth-century America. Beginning with the internal organization of schools themselves, credentials have permeated the occupational structure, from elite professional monopolies on down, and have been the decisive impetus to the elaboration of super-complex bureaucracies in all spheres. Just as the ethnic struggles shaped the other social struggles of the late nineteenth and early twentieth centuries, the permeation of educational credentials into the occupational world has shaped the major issues of stratification in recent decades.
Does this mean that ethnic and cultural issues have submerged economic class conflict? To put the question in this way would be to misunderstand the nature of social organization. Economic classes do not exist in one compartment, cultural groups in another. Empirically, the world consists of the interaction of individuals maneuvering individually for whatever goods and favorable social positions they can attain. Not only are cultural goods weapons that individuals can use in obtaining and monopolizing economic positions, but cultural goods also make all the difference in determining how individuals will ally with others in order to fight for advantages. To put the matter another way: There is no pre-set number of economic classes struggling as blocs for economic survival or supremacy. The “working class” and “bourgeoisie” are only statistical categories formed in the mind of a theorist, except insofar as real people actually form alliances to act as such groups. The forming of such groups is a cultural matter (as sophisticated Marxist theorists have recognized well enough when they see class formation as a question of mobilizing groups by “raising consciousness”). In fact, economic struggles have almost always been fought out among a much larger number of smaller groups, as cultures that have united them have usually extended only locally, within particular organizations or occupational sectors, or within particular ethnic groups. Ethnic groups, or segments of them, are economic interest groups when they occupy common occupational positions. The “ethnic division of labor” (Hechter, 1974) or “eth-classes” (Gordon, 1964) have been far more common economic actors than Marxian class-wide fronts. And the same holds for the groups that have emerged in the twentieth century based upon the abstract cultural currency of educational credentials. In describing the conflict of cultural groups, then, I will be describing the actual shaping of class struggles in America.
The aim of this chapter, then, is to outline the rise of the educational credential system in the United States and the conditions that have shaped it. This involves looking first at the immigration that made the United States into a multiethnic society and the way it has shaped American political and economic conflicts ever since. Then we examine the growth of educational organizations, from the minimal schooling of monoethnic colonial America, to the growth of elementary and secondary schools as ethnic conflict escalated, and concluding with the development of colleges and universities. The vicissitudes of professional education and its eventual linking to a unified credential system will be left for Chapter 6.
The thin population of the United States, relative to its geographic size and economic resources, created a huge demand for labor in the nineteenth century. Eventually, the cultural conflicts of the immigration, the most diverse in the history of the world, would force a closing of the state barriers around the territory. The open immigration policy of the intervening period was virtually unique. The colonies of France, Spain, and Holland were reserved for their own citizens, and their territories already contained sizable native populations, thus eliminating any substantial demand for external labor and limiting immigration to a small elite. But in North America, in the absence of an indigenous labor force working at a subsistence level, an expensive labor force had to be imported, with resulting impetus for mass consumption industry and rapid economic development.3 Against the background of initial settlement by radical Protestants and the conditions favoring their middle-class cultural style in the colonial period, the alien cultures of the period of mass immigration provoked a crusade for cultural domination. The institutional consequences of that crusade make up a major part of the uniqueness of American social structure.
The American population in 1790, at the beginning of the republic, was about 4 million (Historical Statistics of the United States, Series Z20). About 66% were of English–Scottish background; another 8% were Irish (mostly northern Irish Protestants); another 10% were Dutch or German. In total, about 85% of the country was Protestant and probably English speaking, with the remaining 15% black Africans, most of them slaves.
The United States had one of the most rapidly growing populations of the modern era. The population was under 4 million in 1790; it had risen to 23 million in 1850, 76 million in 1900, 151 million by 1950, and about 205 million by 1970 (Statistical Abstract, 1971: Table 1). Part of the increase was due to internal growth, but most of it was due to immigration. The 4 million people who carried the culture of the early United States, an overwhelmingly white, Protestant, English-speaking group with their black slaves, faced a long series of immigrations from alien cultures that threatened to swamp them.
The cultural threats of immigration built up progressively throughout the nineteenth century. As Table 5.1 shows, by 1850, 10% of the people in the United States were foreign born (extrapolating from the only available figures, those for males). It is not known how many others had foreign-born parents. The black population was about 15%; thus, by the mid-nineteenth century, the old native majority was down to about 75% or possibly below. The great shift, however, occurred after the Civil War. In 1900, about 14% of the American populace was foreign born, about 20% had foreign-born parents, and the black population was down to about 11%. Together this adds up to 45%—close to a majority of the population not of the original stock. Only in the twentieth century was the tide turned back so that by 1950 only about one-third of the population was foreign born, of foreign parentage, or black.
Table 5.1 Percentage of Native, Foreign, and Black Population (Adult Males), 1850–1950

|                                             | 1850 | 1900 | 1950 |
|---------------------------------------------|------|------|------|
| Native-born white males of native parents   | 75   | 55   | 68   |
| Foreign-born white males                    | 10   | 14   | 7    |
| Native-born white males of foreign parents  | —    | 20   | 15   |
| Black males                                 | 15   | 11   | 10   |
| Total adult male population (millions)      | 12   | 38.5 | 74.4 |

Source: Historical Statistics, Series C, 218–283.
This conveys some notion of the pressure that Anglo-Protestant America has felt over the last century and a half, but the following figures make the point even more strongly (Historical Statistics, C 218–283; H 538–543). In 1850, 70% of the foreign born came from northwestern Europe or Canada. Hence they were usually Protestants, although some of them spoke German or Scandinavian languages. By 1900, only half of the foreign born in the United States came from northern Europe. The general trend during this period was for immigration increasingly to come from “alien” cultures. By 1890, 40% of the United States was Catholic and about 5% was Jewish. Since blacks were mostly Protestant, this means that by 1890 the Catholics and Jews probably outnumbered the white Protestants in the United States.
The conflict arising from this situation was especially severe because many of the groups were organized on widely differing cultural styles. Their backgrounds ranged from those of Anglo-Protestant independent commercial farmers, through Catholic peasants from the autocratic agrarian societies of Europe and traditional Jewish merchants and artisans from the cordoned-off shtetls of Poland and Russia, to African slaves from horticultural tribes and indigenous Indian tribes, many living in hunting-and-gathering cultures. In effect, the United States was not only a transplanted polyglot Europe, but also a sampling of cultures from most of the major types of societies that have existed in world history, all pushed together simultaneously.
Such major differences in political, economic, and community organization do not merely produce superficial differences in language, cuisine, and style of dress, but much more profound differences in styles of work, attitudes toward power, religion, and personal interaction. The bulk of the Anglo-Protestant population, for example, held ascetic attitudes toward work, savings, play, and sexual behavior (Main, 1965; Morgan, 1966). These attitudes fitted small independent communities where moderate economic respectability could be attained by most people, and the market enforced continuous discipline. Many of the Catholic immigrants, in contrast, came from the bottom levels of societies dominated by aristocratic landlords; in these circumstances, hard work and frugality were externally enforced rigors, not internalized ideals, and an attitude of fatalism went along with the ideal of periodic celebration and largesse (Thomas and Znaniecki, 1918; Arensberg and Kimball, 1948; Banfield, 1958; Gans, 1962). Similar differences held in political style: Whereas the Anglo-Protestants were used to co-management of community affairs and stressed public enforcement of religious moralities, the Catholic peasants had a heritage of political subordination that gave them mainly familistic and patrimonial standards of political loyalty (Weber, 1968: 458–472, 1006–1067).
From these sorts of differences flowed sharply opposing conceptions of morality: The Anglo-Protestants regarded the immigrants as lazy, dissolute, and politically corrupt; the Catholics’ view of their hosts is not recorded, but it is likely that they viewed them as fanatical, priggish, and disloyal to the claims of family and friends.4 In personal style as well, there was likely a further antagonism: The Protestants no doubt found the patrimonial style of the immigrants too emotional and boisterous, perhaps even sinfully loud; the Catholics, Jews, and blacks must have regarded the market-conscious Protestants as inhumanly cold and self-righteous.
In short, the boundaries among these diverse groups were upheld by deep cultural differences. To be sure, some groups were not so far from each other, and there was comparatively rapid assimilation among some groups with similar cultural styles. German and Scandinavian Protestants, deriving from commercial farming and small-town backgrounds rather like those of most Anglo-Americans, assimilated fairly quickly to them;5 the Catholic groups tended to assimilate to each other (Kennedy, 1952; Greeley, 1970; Laumann, 1969, 1973). But the stronger the cultural differences, the less likelihood of assimilation and, indeed, the greater the overt cultural antagonism. Thus the different emotional, sexual, and sociable styles of Africans,6 and the equally remote cultures of the Indian tribes, were regarded with near total condemnation by all of the European groups and used as justification for treating them as inferior or even subhuman. In general, then, the cultural diversity of late nineteenth-century America produced not merely culture shock, but also profound feelings of emotional antagonism and moral outrage. Communities built upon opposing conceptions of morality now fought over control of a common economy and a democratic polity in which they were all formal participants. It is no surprise that the Protestants took increasingly stronger measures as their domination was threatened.
In the late nineteenth century, the conditions that had made middle-class Protestantism a sealed cocoon of cultural authority were breaking down. The rise of big cities, the growth of the national economy, the big bureaucracies, the mass media, the challenge of the immigrants—these removed the mechanisms whereby the Protestant middle class had established its culture of businesslike respectability as the linchpin of everyday belief. This decline in cultural resources meant a loss of control over labor; economic interests became threatened as the cultural resources by which they had been upheld declined. The close-knit communities of the early colonies had been able to enforce their public standards with an iron discipline. This power was eroded more and more as settlement moved west and the economy developed. By the turn of the twentieth century, Protestant culture had not only softened, but also its adherents had lost the power to enforce what was left of it through traditional means.
Yet the conditions still existed to support some version of Protestant culture and to provide a basis for organizing a counterattack. There were still respectable middle-class communities and opportunities for social mobility through hard work. The restrained tone of Anglo-Protestant culture was still a force to be reckoned with in one’s everyday associations. It was still a mark of respectability that would get one a good job from Anglo employers, a loan from Anglo bankers, and professional repute among Anglo clientele.7 The ongoing struggle to advance one’s career in a world in which Anglo-Protestants controlled the major economic resources kept the culture relevant.
A many-sided struggle went on over just which standards would prevail. The more traditional Anglo-Protestants struggled against the slackening standards of urban Protestants; business interests invoked the old virtues to legitimate their class control; liberals invoked traditional moralism to justify reform. All of these made up a larger bloc struggling to keep its cultural advantages over the immigrants.
The Protestant counterattack took two forms. There were the hostile, conservative efforts to repress or exclude non-Anglo groups entirely, and there were the self-consciously “liberal” efforts to assimilate the immigrants and the Protestant backsliders into conformity to traditional Protestant culture. Conservative reactions pervaded both political and economic spheres. Most extreme were the nativist political movements, often with a violent tinge, extending from the anti-Catholic and anti-Masonic Know-Nothing Movement of the 1840s to the Ku Klux Klan revival of the 1920s and subsequent right-wing ideologies (Billington, 1938; Solomon, 1956; Jones, 1960). None of these exclusively nativist movements was ever very successful politically, but the nativist tone found its way into the programs of more important parties. The nativists’ most important practical proposal, the prohibition of immigration (except from white Protestant homelands), was finally enacted in the 1920s. It was not carried out earlier in large part because of the opposition of American business leaders, who actively fostered immigration by sending agents to arrange passage for workers.
This meant that ethnic divisions shaped the lines of class stratification. Administrative jobs were reserved whenever possible for Anglos. Immigrant ethnic groups were confined to manual labor or, at best, foremen of labor gangs. Within the ranks of labor, various ethnic groups fought for position—generally with the Irish coming to control the skilled positions, Italians and Slavs the unskilled, and blacks confined to service positions. Ethnic solidarity became the basis for workers’ resistance to their employers. Studies as late as the 1940s show the continuing pattern: The groups of workers acting to control work pace usually were made up of ethnic Catholics; rate busters—individuals who ignored group sanctions and produced near their personal maximum—were likely to be Anglo-Protestant (Collins et al., 1946: 1–14; Hughes, 1949: 211–220). Union membership, similarly, was usually drawn especially from the ranks of immigrants, and union organizers and ideologies often came directly or by example from European socialist movements. Anglo businessmen, in turn, used private guards such as the Pinkertons, judicial powers, and federal troops to crush strikes and destroy union organization (Johnson, 1968: 104–143; 1976). The failure of industrial labor organizations before the 1930s, when national government power finally shifted in response to ethnic voters, was due largely to this vigorous suppression.
The other form of response was self-consciously gentler, although its cultural ends were much the same. Its keynote was “reform.” In the political sphere “reform” meant an effort, under the name of “good government,” to destroy the power of Catholic political machines and return urban political power to more genteel Anglo elites (Gusfield, 1958: 521–541; Banfield and Wilson, 1963: 138–167). Since cities in the United States are legal creatures of the state government, Anglo control of state legislatures could be used to take local autonomy away from urban governments and return it to the state legislature. Other reforms, such as the creation of a post for city manager and the replacement of precinct representation on city councils (which represented local interests) by at-large elections, were designed to reassert Anglo control; the introduction of civil service regulations for government employment was similarly designed to break the power of bosses’ patronage. Reform governments in office used police power to attempt cultural control: Raids on burlesque houses and attempts to crack down on prostitution and gambling were carried out under the Protestant ideology of “suppression of vice.”
The more grass-roots effort at the reform of urban culture was organized largely under religious auspices. A wave of Protestant revivals reestablished middle-class respectability among the former frontier areas before the Civil War. After the war, the revival movements turned to the cities. It is unlikely that the unending series of crusades from Billy Sunday to Billy Graham had a notable impact on ethnic cultures, but these crusades may well have served to shore up the declining faith of the Protestant migrants to the cities. The fact that urban crusades were partially financed by urban business elites, including men such as Morgan and Vanderbilt, who were not otherwise noted for their personal piety, suggests the coalition of anti-ethnic and anti-working-class forces in action (Hofstadter, 1963: 110). Closely associated with the urban revivals were the Salvation Army and the YMCA movements. The latter was staffed by upper-middle-class Anglo-Protestants from the religious colleges, as was the growing social-work movement. Settlement houses worked to Americanize the immigrants by teaching English and respectable forms of culture, manners, and morals, and generally to extirpate “vice and delinquency.” The program of teaching English seems to have had an impact, since it provided a skill the immigrants could use to acquire citizenship and political rights. The other aims of the welfare workers were by no means so welcome and had little effect (see Whyte, 1943: 98–104; Platt, 1969).
The more militant wing of Christian reform was expressed in the movement to prohibit alcohol (Gusfield, 1963). Puritan culture had always regarded drunkenness as evil, but its traditional emphasis was on moderation; the Puritan colonists regarded a mild use of alcohol as possibly medicinal and only a minor vice. Alcohol grew into a major issue only in the late nineteenth century, when it came to symbolize the wholesale challenge to traditional Protestant values. The movement to use the force of the state to cut off alcohol by a constitutional amendment gained ground in the years just preceding World War I: Prohibition was finally enacted in 1919. The timing was not accidental. Decades earlier, Protestant culture was still locally dominant and needed no such action to assert itself. A few decades later, the action would have been impossible; rural America was losing its majority, and indeed by the 1930s the urban and ethnic forces were strong enough to repeal Prohibition. The struggle over alcohol symbolizes the crisis period of the ethnic struggle, the decades on either side of World War I. H. L. Mencken, the Scopes trial, and F. Scott Fitzgerald symbolized the cultural conflicts of the era.8
The most significant effort to assert Anglo-Protestant culture was in the realm of education. Public schools, with their compulsory attendance laws, spread precisely in those states facing the greatest immigration influx; the claims of educators to Americanize the immigrants were a major force in getting public support. And in other ways as well, the struggle for social precedence in late nineteenth-century America began to shift toward an increasing reliance upon educational credentials. The results have ramified, not only in the schools, but also in the professions and the entire occupational structure.
Eventually, the ethnic conflicts were to recede, although as of the late 1970s they had not yet disappeared. The high point of multiethnic diversity came just before World War I; just afterwards, most culturally alien immigration was cut off. In succeeding decades, assimilation has diminished some of the cultural differences and broken down some of the group barriers (Lenski, 1963, 1971; Schuman, 1971). Not all cultural differences have disappeared, to be sure; ethnic communities have much more resilience than sociological theory traditionally gave them credit for, and major ethnic differences can still be found in occupational levels and occupational enclaves (Glenn and Hyland, 1967; Wilensky and Ladinsky, 1967; Duncan and Duncan, 1968; Jackson et al., 1970), as well as in marriage and friendship patterns (Greeley, 1970; Laumann, 1973), political participation (Greeley, 1974), social attitudes (Featherman, 1971), and emotional expressiveness (Zola, 1966; Zborowski, 1969). Moreover, cultural diversity has been replenished, although on a smaller scale, by continuous migration from Mexico, Puerto Rico, and Latin America, and by the mobilization of blacks in their movement from the rural South into the large industrial cities. The conflicts to which these movements have given rise in recent decades are well known.
In order to explain the American cultural market, it is not sufficient merely to describe how much cultural diversity still exists, and hence how strong group boundaries are. The most crucial effect of multiethnicity in American history has already happened: the creation of an institutional pattern, a “contest-mobility” credential system, which has gathered the diverse particulars of American cultures into an impersonal cultural market with its own abstract currency. Even if ethnic differences were to disappear completely today, this institutional structure would still be with us and would still occupy a central place in our stratification. The fact that ethnic diversity does still exist and that Chicanos, Puerto Ricans, blacks, Asians, and various white ethnic groups (even self-conscious, backlashing Anglo-Protestants) make demands for precedence within the credential system today merely accelerates the development of that system in channels laid down several generations ago.
The height of multiethnic conflict in the United States was reached in the early twentieth century. But the process of cultural conflict, and hence the building of the cultural market, grew up gradually and over a number of generations. In broad outlines, we can say that ethnic cultural issues came to permeate the stratification struggle not long after the Civil War with the tremendous increase in non-English-speaking immigration at that time. But Irish Catholic immigration, which picked up in the 1830s and became considerable in the 1840s, had already provided opposition, and there were cultural issues involving class relations within the early Anglo-Protestant community even earlier. For a strong relative contrast, it is worthwhile to go back to the educational institutions of America at the turn of the nineteenth century in a situation of near ethnic homogeneity.
In contrast to the lengthy sequences and complex branchings of modern American education, the school sequence at the turn of the nineteenth century was short and simple, and the system was small.9 The basic skills of literacy and arithmetic were taught in several locations: at home (by parents or, in the wealthier families, by tutors), in church, in the apprentice’s work place, or at local public and private schools. More advanced education based on the classical curriculum (the medieval trivium and quadrivium) was offered by colleges. There were also a very small number of academies that offered a classical curriculum beyond the elementary level.
These three types of educational institutions did not form a sequence. Colleges and academies taught substantially the same curriculum, and graduation from an academy was not necessary for college admission; as in medieval Europe, bare literacy was the only requirement at many colleges. Elementary schools did not provide social certification, and attendance at such a school was not a prerequisite to college enrollment. Schools other than colleges carried with their training little or no additional social status; they existed for the limited and functional purpose of inculcating literacy, the possession of which was deemed sufficiently obvious to require no official announcement. The early American colonies were literate societies, but the literacy was accomplished by the largely informal acquisition of obviously useful skills. The main difference among types of schools was their social class composition: Elementary schools and apprenticeship were for the artisan class; the Latin grammar schools and the colleges (as well as home tutoring) for the upper class; and academies emerged in the eighteenth century, with a somewhat less classical curriculum, for the urban commercial middle class.
College education was essentially a secondary education at this time. Unlike elementary education, it served no clear technical purpose, but it did have the power of certification: Colleges were legal entities, chartered by the state and deriving from their charters the right to grant the traditional degree of the European university, the B.A. The earliest colleges of the seventeenth century had the traditional powers to grant higher degrees (Master and Doctorate), including the subjects of medicine and law, but this vocationally important certification power was rarely exercised. In the fluid frontier society with its volatile labor market, the teaching of medicine and law came to be carried out primarily through apprenticeship, and college control over entry into professions disappeared. The earliest colleges had been founded primarily to provide educated men for the ministry; this vocational function was preserved, although seminaries were also founded to give expressly theological training. (In a sense, American colleges had become secondary schools for preparation for the “higher education” of seminaries.)
American formal education in the early nineteenth century was rudimentary. There were growing numbers of locally supported elementary schools, teaching basic literacy but providing no certification; a smaller number of academies, teaching the classical curriculum; an unusually large number of colleges, which taught a curriculum essentially the same as that of the academies and offered a bachelor’s degree; and finally, there were a few seminaries, which recruited from college graduates but also from those without college degrees. Requirements for entry into any of these institutions were minimal and did not depend upon certification by any other educational institutions. The only restrictions placed upon entry to academies, colleges, and seminaries were sex (only men were admitted), literacy (sometimes in Latin and Greek as well as in English), ability to pay the modest fees, and adherence to the orthodoxy of the religious denomination in charge of the school. Age was not a requirement, and hence the age of college students varied from early adolescence to the mid-twenties. The number of years of college study was not rigidly adhered to, although the traditional medieval term of study was 4 years. Only seminaries provided a certification relevant to any occupation, and even they did not monopolize paths into the ministry, as ordination by evangelists was another common route, especially on the frontier. The colleges provided a certification in the form of a degree, but it was acquired by few of those who entered college (most dropped out when they felt they had enough education) and seemed to carry with it little vocational advantage.
Formal education was important for only two occupations: the ministry and school teaching. In the coastal areas, especially in New England, where the status and pay of ministers remained high, the link between the colleges and the clergy was fairly close. The elite Congregational and Anglican clergy were virtually hereditary castes, their sons practically monopolizing the specified cultural training at home and in schools (Main, 1965: 139–141, 275). If ministers in the older areas were well off, however, the teachers were not. School teaching was an ill-paying and hazardously unstable occupation, and teachers barely clung to the bottom rungs of the lower-middle-class life style, often at the cost of foregoing marriage. School teaching was not a very desirable occupation. Although it seems to have attracted many college graduates as a temporary job on the way to the ministry, the educational qualifications of teachers were acceptable at far below the college level (Bailyn, 1960: 96). Thus even for teachers, formal school certification was not highly stressed.
In general, the educational system in the colonies before the American Revolution comes close to an ideal type of an educational system in which organizational restraints and symbolic certification counted for nothing. Upon this base line we may trace the growth of a system in which, as the technical importance of training grew in some areas, the social edifice surrounding it grew even faster.
The first step toward a system of educational certification in the United States was the creation of free public elementary schools. Even though the elementary schools were not themselves certifying agencies, they provided the basis for a sequence of years of public instruction that was eventually to culminate in the high school diploma and to be unified by school attendance laws.
Colonial education at the elementary level took place in many contexts: home, apprenticeship, church, private schools, tutoring, and some public schools. Public schools were a small part of this group. They were supported by local townships, but they often required a small fee from the parents of their students, and truly free education was provided primarily for the children of families willing to declare themselves indigent. As a result, public school attendance was fairly low and confined largely to the most prosperous New England and Middle Atlantic states. Basic education could be acquired outside the schools; the overall efficiency of the system was indicated by the fact that between one-half and two-thirds of the adult male populace was literate.10
Beginning around the turn of the nineteenth century, states began to require local townships to establish free public elementary schools, and somewhat later began to support them with financial aid. By 1860, most New England states had created free elementary school systems; southern states did not begin to do so until the northern pattern was imposed by the Reconstruction after the Civil War. Western states and territories began to do so soon after they were formally organized.
The impetus for the foundation of public elementary schools came primarily from upper-class and upper-middle-class professionals, especially ministers, educators, and lawyers. This was the class of religious leaders of earlier days; in the more secular climate of the nineteenth century, they turned to humanitarian crusades, as if to restore their old cultural dominance through moral reforms. The influx of ethnic aliens, beginning with the Irish Catholics in New England of the 1830s and 1840s, gave a special opportunity for cultural entrepreneurship. Public education was but one cause among many. This same group of professionals was also active in crusades against slavery, war, intemperance, imprisonment for debt, harsh penal conditions, inhumane treatment of the insane, and harmful labor conditions. They agitated for child labor restrictions, public hospitals, and social welfare, as well as for free schools.
The reformers used a wide variety of arguments to gain political support for public education, claiming that education had good effects on labor productivity, political stability, and moral character. Some of these arguments were echoed by a few supporting industrialists, but most manufacturers and merchants were hesitant to support free education, primarily because of the tax expense. The lower classes in general were rather opposed to it. On the frontier, the lower class and the farmers tended to oppose public education on the ground that there was no need for it (Curti, 1935: 66–68, 87–90). Urban workers were also generally apathetic, especially since the public schools were attended most heavily by the children of the upper classes and imposed upper-class notions of deference to authority upon wayward working-class youths. In Beverly, Massachusetts, in 1860, for example, workers mobilized by a shoemakers’ strike voted overwhelmingly to abolish the new public high school, whereas the upper class voted to retain it (Katz, 1968: 19–93).
There were some contrary currents. Labor union programs sometimes called for free unified schools serving all social classes, usually linked with the demand for restrictions on the competition from child labor (Ensign, 1921: 41, 52; Curoe, 1926: 13–15, 29–31). But labor was not the main mover or even a consistent supporter of public schools. Its influence was more likely indirect, through the support of child labor laws. These in turn raised the question of great interest to upper-middle-class moralists: What was to be done with the unoccupied children?
It is apparent that mass elementary education was created not primarily in response to industrial demand nor in response to a publicly felt desire for its practical benefits, but rather in response to the political influence and persistence of the descendants of the colonial clerical elite who made the political alliances and ideological appeals necessary to further their cause. In the New England states where the movement began, its leaders gave the impression of being on the defensive, acting to preserve a traditional moral culture that was being challenged from several directions: by the speculative ethos of commercial and industrial expansion just under way; by the rise of a working-class culture mobilized by urbanization and by trade unions; and by the beginnings of alien immigration, especially of Irish Catholics. This was not simply a matter of fighting against status decline in a purely cultural battle. The clerical and professional elite of New England was an actual subclass, with economic and political interests to defend and with which its cultural prestige was intimately bound up. It found itself in a situation in which other social groups were increasingly mobilized to challenge its social domination in economic, political, and cultural respects alike. But these other groups were by no means united among themselves; and by making appropriate alliances, the old clerical class was able to sell other groups the apparent advantages of a formal and compulsory education in the traditional culture.
Perhaps the most significant of these other groups can be found most clearly not in the old coastal settlements, but inland, where waves of religious revival had been spreading westward intermittently since the late eighteenth century. The early period of the frontiers was dominated by land speculation, violence, and uninhibited masculine carousing. Once the frontier moved farther west and land became settled for routine commercial agriculture, the middle-class ethic of disciplined, ascetic work and familial morality began to be reasserted. Religious revivals were the vehicle for this movement to reorganize once-frontier communities on a respectable petit bourgeois basis. Thus, although the population shift beyond the Appalachians at the end of the eighteenth century had reduced religious adherence to perhaps only 10% of the population, revivalism swept the inland areas, especially from the 1830s onward (Hofstadter, 1963: 55–141). The agents of these revivals were itinerant Methodist and Baptist preachers, who by 1850 had made their denominations the largest Protestant groups. Along with religion, and frequently in the persons of the same entrepreneurs and congregations, came education.
Education, then, was adopted by a newly expanding middle class, which was creating a commercial society upon territories that had recently settled down from the violent and much less tightly organized days of frontier conquest. Traditional Anglo-Protestant culture, in the form of religion and education, was used to create solid middle-class control—indeed, for the new commercial settlers to constitute themselves initially as a locally dominant class. Cultural organizations were their weapon against what they saw as the “barbarism” of their more adventurous and individualistic predecessors. It was this source of demand for traditional culture that gave the clerical class new opportunities, although they would act no longer in the role of undisputed community leaders, but as entrepreneurs selling culture, and thus laying the basis for a public cultural market.
In the first half of the nineteenth century, the basic pattern of elementary education was established. It would be extended without important modification to the South and to the new states of the West. The main purpose of elementary education was to inculcate moral values and the basic social skills of literacy. To that end, its curriculum remained religious and classical, its methods rote learning and discipline. Public feelings about the value and necessity of education continued to be ambivalent. On the one hand, attendance rates rose, even in the absence of compulsory attendance laws, to perhaps 40% of the youthful populace in 1840 and 60% in 1880.11 On the other hand, attendance did not seem to be an urgent matter, as the ages of elementary school students varied from 5 to 18 (Folger and Nam, 1967: 6–8). Since retardation rates were highest in rural areas, where high schools developed latest, it seems likely that before the introduction of free public high schools, parents sent their children to elementary school only sporadically. Rigid age-grading did not become important until elementary school came to be treated as a step toward secondary education.
We may conclude that elementary education by itself met a mild public demand. If the public did not care enough about education to pay tuition for each child’s education, it would take advantage of free elementary schools, if and when the services of the children were not needed on the farm or in the shop. No doubt the enactment of child labor laws in the late nineteenth century made the school attractive as a caretaking institution. The sporadic attendance pattern suggests that children attended until they achieved what seemed a sufficient degree of literacy, then dropped out unless they planned to go on to high school. For the majority of the populace, free public elementary schools operated primarily as a training ground in basic literacy, and particularly in the frontier areas, as a device for establishing a basic level of middle-class respectability. Only with the rise of the public high school did the formal certification become more prominent and the elementary school itself become transformed.
Publicly controlled, tax-supported high schools began to appear in eastern states such as Massachusetts, New York, and Pennsylvania by 1860, and spread throughout the rest of the country during the remainder of the nineteenth century. The political battle for public secondary education was fought by the same group of upper-class and upper-middle-class humanitarian reformers who had won public elementary schooling. As in the earlier instance, all manner of public benefits were alleged in favor of this innovation. In 1877 an educator declared that the schools alone had saved America from the terrors of the French Commune, and another leader in the struggle for public support declared: “The remedy for agricultural depression, bad roads, the discontent and thriftlessness of youth, for many of the ills of which we complain, is a well-sustained school system [Curti, 1935: 274; see also 209–210, 216–218].” Every political crisis and issue was seized upon as indicating the need for more public education.
Diverse coalitions were involved in establishing state-supported secondary schools, with one central element remaining constant: the educators themselves. The public schools created in the first half of the century themselves provided the major element in the growth of free secondary education in the second half of the century: an articulate and highly dedicated group of men who strongly believed in the value of education and were skilled at gaining political support for it, the administrators and state educational officials. Once established, the educational system, like other organizations, struggled to extend its size, resources, and influence. To this end, educators succeeded in having state after state adopt laws not only establishing secondary schools, but also requiring school attendance between certain specified ages. The first such law was passed by Massachusetts in 1852, the second by Vermont in 1867. Such laws were enacted by 25 states prior to 1890; most of the other states (concentrated in the South and West) did so by 1920 (Folger and Nam, 1967: 24–26). At first the laws specified ages 6 to 14, but the upper age was gradually raised to 16 and in places to 18. Efforts to enact and extend child labor laws at about the same time undoubtedly helped bring labor support to the humanitarians and educators in their battles for public support and attendance laws.
The arguments for public high schools and for compulsory attendance were much the same as those advanced for public elementary schools. Industrial benefits were alleged on occasion, but the principal rationalizations referred to good citizenship, political stability, and moral qualities. Generally speaking, compulsory attendance laws must be seen in the context of group conflict. The schools were established by upper-middle-class reformers, and their content and methods were designed to inculcate the virtues of an idealized rural society. Voluntary attendance in high school around 1860 was highest among Anglo-Protestant middle-class rather than working-class children. Greater proportions of children attended in the more traditional (and predominantly Anglo-Protestant) New England small towns (28%) than in medium-sized towns (15%) or cities (8%).12 But although the schools appealed most to carriers of middle-class and rural culture, the effort to establish public high schools was strongest in the cities (Katz, 1968: 48). This was partly a matter of securing a greater tax base for urban schools, but mainly it reflected the challenge of urban immigrant working-class culture, to which the public school was a response.
The pattern by which compulsory attendance laws developed makes this clear. Compulsory attendance laws grew out of two related issues: truancy and control over recalcitrant youths. Elementary school administrators complained constantly of irregular attendance, and legal means for apprehending absent students were urged. The first state law against truancy was passed in Massachusetts in 1850, 2 years before the first compulsory attendance law. But even earlier than this, the Massachusetts State Reform School was established in 1848, the main purpose of which was to provide a place where rebellious students could be committed. That the issue was cultural conflict was indicated by the most prevalent offenses for which boys were committed; in order of frequency, among 440 boys committed, 1850–1851 (Katz, 1968: 175): 394 “had used profane language”; 350 “had been idle previous to admission or had no steady employment”; 303 “had attended the theater and similar places of amusement”; 259 “had used obscene language.” The typical scenario of what was regarded as delinquency, as recounted by Katz (1968), went as follows:
The first crime was often truancy from school. The truant became familiar with “horse racing, the bowling saloon, the theatrical exhibitions and other similar places of amusement, debauchery and crime.” In the bowling alley, “initiated by being employed in setting pins,” he soon acquired “the desire to act the man” and became a “juvenile gambler.” Profanity, drunkenness and licentiousness soon followed, hurrying him forward in the path of crime and ruin [172–173; see also 164–185].
The “criminals” were for the most part boys of the urban working class, many of them Irish Catholics. The solution for the reformers was the reform school, set in the morally pristine countryside and having compulsory religious instruction under the direction of a Protestant evangelist minister. It was in this mode that compulsory school attendance laws developed. Massachusetts seems to have led the way because it had the upper-middle-class leaders, the traditions of community moral control over individuals, and the earliest cultural challenge of alien immigration. The spread of compulsory school attendance laws after the Civil War seems to have followed the lines of immigrant and urban challenge. As the conflicts between native Protestants and immigrant ethnic groups built up, the importance of school as a weapon of control increased. From compulsory elementary schooling in the nineteenth century followed compulsory secondary schooling in the twentieth.
Secondary education was conceived primarily as socialization, and the curriculum as initially established emphasized the classical studies, quite remote from commercial life. But here arose a tension within the plans of the secondary educators. On the one hand, they were committed to providing an education consisting of cultural socialization, but this socialization was to take place at a level beyond that of elementary school and hence must have some distinctive content. This content could be found in the traditional subjects of higher education: classics, languages, literature, history, science. On the other hand, this content was already given by the existing colleges, and as we shall see in the section dealing with colleges and universities in this chapter, the colleges were widely available in America even before the push for high schools. Thus high schools and colleges competed with each other because age-grading had not yet developed and they both drew upon the same age range of students.
Compulsory schooling laws, which began from truancy laws around 1850 as part of the moral socialization crusades of that time, specified ages at which children were to attend school but not the level of study or the type of school (Folger and Nam, 1967: 24–26). Grade levels, the 4-year high school, the 8-year elementary school, and, above all, regular yearly promotion did not develop until the 1870s. There was considerable retardation in grade: As late as 1910, a majority of the students aged 14–17 were still in elementary school; approximately 25% of the elementary school population was over 13 years of age, and 80% of secondary school students were over 18 (high school students might be as old as their twenties) (Folger and Nam, 1967: 6, 8–10). And colleges, following the medieval model that prevailed in America until the late nineteenth century, also had a range of students from their early teens to their mid-twenties. They did not require prior high school or even elementary school attendance as long as students were literate (sometimes in Latin and Greek), and many colleges established their own preparatory departments for students who needed to work up these prerequisites.
Moreover, both types of schools competed with professional training in this same age range. Professional training in medicine, law, divinity, and engineering in the middle of the nineteenth century was largely carried out by apprenticeship, again beginning anywhere from the early teen years to the twenties. Where professional schools did exist, they would admit students (as had the medieval European universities, the British Inns of Court in which lawyers trained, and so on) without prior educational requirements except literacy, and at any age from the early teens on.13
Under these conditions, it is not surprising that all of the traditional contents of education were frequently attacked by the public as useless drills in irrelevant subjects. The alternatives proposed most often were some form of new, vocationally relevant education for the modern age. Some educators opposed this, but others began to organize vocational high schools in the 1880s. The ideal of a utilitarian education had been raised even earlier. Before the Civil War, an idealized view of manual training flourished among upper-middle-class philanthropic reformers as part of the effort to recapture the virtues of rural life. Thus manual labor was first introduced in seminaries training missionaries for their evangelical ventures in the West (Fisher, 1967: 16). Manual labor was the main course of training in reform schools, where it was regarded as the best remedy to crime and delinquency, and the moral value of manual training could still be extolled late in the century by social workers such as Jane Addams, who argued that it would give a sense of meaning to the lives of prospective industrial workers (Fisher, 1967: 47).
In the period of heavy industrialization after the Civil War, however, this sentimentalized philanthropic view receded, and advocates of vocationalism began to argue that vocational training could provide a new elite of engineers. The foremost advocate of this position was Calvin Woodward, a Harvard graduate and Dean of the Polytechnical Institute at Washington University in St. Louis, who set up a Manual Training High School designed to replace the traditional vocationally irrelevant education. Woodward’s school attracted much attention, especially in the atmosphere of Horatio Alger ideology then popular among businessmen.
Nevertheless, engineers themselves were not friendly to the manual training programs. The engineering profession was just separating itself from the apprenticeship-trained occupation of skilled mechanic, and the Morrill Act in 1862 opened up land-grant colleges to them. Accordingly, they were not eager to have engineering training established at the secondary level. “No standard of gentility, no patent of nobility can be too high for a profession which leads the civilization of the world…” declared the president of the American Society of Engineers in 1891, in an address entitled, “The Engineer as a Scholar and a Gentleman” (Fisher, 1967: 66). The American Society of Mechanical Engineers was cool toward manual training schools and in the late 1890s began vigorously to support the manual trade school as an alternative, nonelite conception of vocational training.14
In the early twentieth century, vocational educators swung back to the ideal of education for the lower occupations, advocating separate trade schools or programs for the working class. The program came to be supported after 1900 by the American Society of Mechanical Engineers, the National Association of Manufacturers, and the National Education Association. The manufacturers and engineers regarded the extension of trade schools as a crusade to break the power of foreign-controlled trade unions, which kept American boys from learning a trade. Frederick Taylor, the advocate of scientific management as a solution to labor strife, tended to equate the notions of “education” and “production.” According to Fisher (1967):
This is evident in Taylor’s testimony before a special Congressional committee, where he described how shovelling could be made scientific. When one of the shovellers in a gang was having trouble or producing below par, said Taylor, a “teacher” was sent to show him how to do the job most easily and quickly. “Now gentlemen,” Taylor continued, addressing the committee, “I want you to see clearly that because this is one of the characteristic features of scientific management, this is not nigger driving; this is kindness; this is teaching; this is doing mighty well what I would like to have done to me if I were a boy trying to learn how to do something” [89; see also 115–131].
Given this view of trade education, it is not surprising that Samuel Gompers and the trade unions generally regarded it with suspicion and hostility, even though they had some status interests in dignifying manual labor by giving it a place in the school curriculum. It is apparent from the evidence cited in Chapter 1 that vocational education was never of great practical importance. Its advocates were always most strongly motivated by ideological issues, and programs were actually put into effect in periods of political crisis. Thus the Smith–Hughes Act, giving federal support to vocational education, was passed during the antiforeign fervor of the American entrance to World War I. Further extensions occurred in 1933 and 1934 as part of New Deal efforts to handle the “youth problem” and again in the NDEA Act of 1958 during the nationalist upsurge of the Cold War Sputnik issue and an economic recession.15
The leaders of American secondary education have generally been hostile to the vocational school movement (Curti, 1935: 317, 348; Cremin, 1961: 27ff.). They saw vocational education as undermining the ideals of a culturally socializing education and of education for citizenship, and they doubtless felt the accompanying threat to their status. In principle, vocational high schools or their curricula might have replaced entirely the high schools more oriented toward liberal arts, or the school system might have become bifurcated along class lines. Educators generally opposed the second threat as well as the first. The political difficulties of funding the liberal arts schools in competition with a superficially more appealing technical education must have seemed a threat, and their commitment to the moral and social value of education made them oppose any structural split that would dilute their moral influence and undermine the political inclusiveness of schooling.
As it turned out, vocational education never posed a serious threat since it had major weaknesses of its own. Vocational education might seem attractive in comparison to the rote learning of the classical curriculum, but vocational schools themselves lost out in competition with occupational realities. Training for manual labor could just as well be acquired on the job or through apprenticeship, and training in purely manual skills apparently reduced the value of education to the student below that of liberal arts education. The prestige of education was lost, while little advantage was gained over the manual laborer who acquired his skills on the job. The slogan was put forward that technical skills would dignify labor, and much was made of the distinction between “arts” and “trades.” Nevertheless, such efforts to give status to manual labor did not convince. Students such as those at Booker T. Washington’s Tuskegee Institute objected to manual training in place of learning the middle-class culture that would raise their social status (Curti, 1935: 293). In general, manual training was associated in the public mind with the rehabilitation of criminals (Fisher, 1967: 78–79).
The combination of these two faults—the lack of competitive advantage of manual education over on-the-job or other informal training and the fact that manual education provided no social mobility for students into middle-class culture—meant that manual arts education was bound to lose out to the liberal arts course. Students preferred to take liberal arts or drop out altogether. High school education did take on an increasing vocational value as the number of clerical jobs increased, especially with the introduction of the typewriter and the subsequent hiring of female clerical help around the turn of the century. The commercial arts course was more successful than the manual arts course, although its main appeal was confined to girls. In content and in status, however, it remained close to the liberal arts curriculum and thus prepared students for the middle-class culture as well as for the technical skills required of clerical jobs. Vocational schools themselves did not flourish. There were fewer than 100 in 1950, and only the National Defense Education Act of 1958, operating in a period in which schools had become a major means of keeping unemployed youths off the streets, brought a serious expansion of vocational training (Fisher, 1967: 218).
In retrospect, we may view the vocational education movement as an attempt of some sectors of the educator community to adapt to the influx of lower-class students into the high schools resulting from the introduction of compulsory attendance laws. Having fought for such laws to extend their influence, educators found that their teachings were irrelevant to the lives of many of their students; but instead of relinquishing these students by giving up compulsory attendance, they sought a rationale to justify keeping them in school. The same interpretation may be roughly applied to the Progressive movement in education, which attempted to substitute a rather vague “life adjustment” training for the classical curriculum. Some of its innovations, such as athletics and other extracurricular activities, no doubt served an internal function for the school itself—providing some diversion for their unwilling captives and hence some control over them, especially through school athletics.16
Progressivism’s reforms have had a greater long-term effect than those of vocationalism, partly because they have not threatened the status socialization and mobility functions of schools. While providing some needed outlets for restless children, the shift away from emphasis on academic subjects eased the pressure of teaching a large and hostile body of working-class students. These pressures were especially strong in the largest cities. In 1909, 58% of the pupils in the 37 largest cities had foreign-born parents, and it was in these cities that Progressive education first took hold (Swift, 1971: 44).
Progressivism provided, for a time, a useful ideology to counter criticism of the uselessness of the curriculum, and its emphasis on scientific pedagogy and on scientific tests gave some basis to claims of professional status for teachers. Once college attendance became a widespread goal in the United States, high school education no longer needed to be defended as an end in itself, and Progressivism declined as a popular ideology (Cremin, 1961: 175ff; Trow, 1966). Nevertheless, its practices remained prominent, both in the emphasis on athletics and the extracurriculum and in the values prevailing among students. Thus Coleman’s 1957–1958 study of 10 midwestern high schools found that athletics and leadership in social activities were most looked up to, scholarly competence relatively little, and that these attitudes were strong even among the middle-class students who planned to attend college (Coleman, 1961). Indeed, the antiacademic values were highest in an upper-middle-class suburban high school where going to college was the norm. Studies of high schools in the 1920s found that the Progressive innovations in the extracurriculum had been heartily accepted by students and teachers alike, making the schools into a kind of social club for the middle-class students and their parents (Lynd and Lynd, 1929: 188–205, 211–222; Waller, 1932: 103–131).
The attack on the practical uselessness of secondary education was thus blunted by a few relatively minor reforms. The demands for a vocationally relevant education were only a convenient form of rhetoric for expressing dissatisfaction in a period in which the organizational relevance of any of the competing forms of schooling was dubious. This was especially true of the newest form, the high school. The underlying issue was one of career sequences and the status value of particular kinds of education for moving along such a sequence. But the public (or for that matter, the educators) had no such abstract insight into the nature of the educational market. Their dissatisfaction was real enough, but for solutions they fell back on the most common style of “practical” argument. Their actual response to vocational training, when it was offered, shows the roots of their behavior more truly.
With the growth of high schools in the 1870s, organizational problems became a dominant influence on the nature of education. Following the lead of Superintendent William T. Harris of St. Louis, the high school was organized into a 4-year sequence with promotion by regular examinations. The large numbers of pupils necessitated standardization of instruction sequences and means of evaluation; schools emphasized rote learning, attendance records, and general principles of efficiency in moving students through in large batches (Cremin, 1961: 19–21; Swift, 1971: 67–77).
At the same time, schools became larger administrative units. Unification of school districts began in the 1870s. This meant the decline of the one-teacher school with children of all levels in the same room, still the most typical pattern in 1870. School principals and superintendents began to emerge, and by the early twentieth century, schools had become huge in both the number of students and administrators. Once the bureaucratic organizational form was established, its own logic of development became dominant: Between 1920 and 1950, the number of students in public schools grew 16%, the number of teachers 34%, and the number of administrators 188% (Swift, 1971: 81). As a result of age-based attendance laws and the effects of bureaucratization, schools began to standardize by age, and the age range of students dropped sharply, both in high schools and elementary schools. In 1910, an estimated 25% of the elementary school students were over 13 years, and 8.0% of the secondary school students over 18 years, whereas in 1950, the figures were 2.4% and 4.2%, respectively (Folger and Nam, 1967: 6, 8–10). This has meant that the flexibility of time for school attendance has sharply decreased for the student; it also implies that students move through schools largely by aging rather than by scholastic achievement.17 Interests of administrative efficiency were prominent in this transformation: Retardation in grade was explicitly attacked as an unnecessary expense, and administrators in the Progressive urban schools took the lead in establishing criteria of efficiency based on high promotion rates (Callahan, 1962: 168–169).
The American secondary education system thus took shape: a unitary sequence, open and compulsory for all, preparing primarily for college education, rigidly age-graded, with a heavy emphasis on moving students through in orderly phalanxes. The entire trend of twentieth-century public schooling, though, is not simply a matter of bureaucratization in response to internal problems of control and large numbers. Both primary and secondary education finally found a modest but firmly established place in careers as part of a steadily lengthening channel leading through higher education. Thus it became less and less important whether students learned any particular academic content at the lower levels or even if they were moved through without real examination at all. The occupationally relevant distinctions were pushed further and further ahead to the higher levels—entering college, then completing college, eventually entering graduate or professional school, and so on. Rising school completion rates at the lower levels were simply part of a larger pattern of expansion at the higher levels.
It is to this pattern that we now turn.
The United States has always had a large number of institutions of higher education. At the time of the Revolution, there were 9 colleges in the colonies; in all of Europe, with a population 50 times that of America, there were perhaps only 60 colleges. With the turn of the nineteenth century, the United States experienced a wave of new foundations, bringing the number of colleges in existence in 1860 to approximately 250. By 1880, there were 811 colleges and universities; by 1970, there were 2556 (see Table 5.2). The United States not only began with the highest ratio of institutions of higher education to population in the world, but it also increased this ratio steadily, for the number of European universities was not much greater by the twentieth century than in the eighteenth (see Ben-David and Zloczower, 1962: 44–85).
Table 5.2 Institutions of Higher Learning and Ratio to Population of the United States

| Year  | Colleges and universities | No. per million population |
|-------|---------------------------|----------------------------|
| 1790  | 19                        | 4.9                        |
| 1800a | 25                        | 4.7                        |
| 1810  | 30                        | 4.2                        |
| 1820  | 40                        | 4.2                        |
| 1830  | 50                        | 3.9                        |
| 1840  | 85                        | 5.0                        |
| 1850  | 120                       | 5.2                        |
| 1860  | 250                       | 7.9                        |
| 1870  | 563                       | 14.1                       |
| 1880  | 811                       | 16.1                       |
| 1890  | 998                       | 15.8                       |
| 1900  | 977                       | 12.8                       |
| 1910  | 951                       | 10.3                       |
| 1920  | 1,041                     | 9.8                        |
| 1930  | 1,409                     | 11.4                       |
| 1940  | 1,708                     | 12.9                       |
| 1950  | 1,851                     | 12.2                       |
| 1960  | 2,008                     | 11.1                       |
| 1970  | 2,556                     | 12.5                       |
Sources: Historical Statistics of the United States, Series A-1 and H-316; Statistical Abstract of the United States (1971: Table 198); Tewksbury (1932: 16); Rudolph (1962: 486). There are slight discrepancies among alternative sources.
aNumbers of institutions for the years 1800 through 1860 are estimates rather than exact counts.
The difference in numbers of colleges does not appear to lie in the public demand for higher education, for the supply of colleges in the United States has for most of its history been far ahead of demand, a fact which has had important consequences for the place of higher education in American society. Rather, we must look at the impulses and conditions that fostered and allowed the large number of college foundations in the United States. The conditions allowing such widespread foundations were political decentralization, the disestablishment of state churches after the Revolution, and the emerging legal tradition that liberally granted corporate charters. (The Dartmouth College case, after all, is a landmark in corporate law.) American democracy and local government made it easy to obtain state permission to found a college; in Europe, the centralized and autocratic governments limited the sources of charters, and the connection of European universities to the established church made the founding of new universities a matter of church politics.
The American political situation made it easy to start new colleges; competing political, regional, and religious groups took advantage of the opportunities. The original modest proliferation of colleges in the colonies was due primarily to rivalries between denominations—an orthodox reaction against liberal Harvard led to the founding of Yale, a Presbyterian split led to Princeton, the Baptists founded Brown, the Dutch Reformed founded Rutgers, and so on. To some degree, the rash of college foundings was also due to regional sentiments for a college conveniently located nearby. After the Revolution, and especially during the period of expansion to the West, local and state pride led to a large number of new colleges. Moreover, the frontier produced not only a widening of the franchise, but also a weakening of the influence of the respectable Eastern churches and of the cultural monopoly of the colonial upper class. In this fluid situation, the status-bearing institution of the Puritan aristocracy, the college, became available to the poorer sector of the populace. The evangelical lower-class churches, the Baptists, Methodists, and others established colleges of their own in every locality in which there was a sizable group of church members or a threat from the college of a rival denomination. In this situation, a figure arose who was to dominate higher education in America until well into the twentieth century: the college president, an educational entrepreneur.
The colleges that proliferated across the American landscape in the early nineteenth century were imitations of the original colonial colleges; innovations on the narrow classical curriculum were rare until the second half of the century. This curriculum had originally been taken from the liberal arts section of the European universities and led to the traditional degree of B.A. In the medieval university, the arts instruction was essentially a preparatory school for the true university studies in theology, medicine, and law; the medieval university, as the institution of higher education in a society that otherwise possessed only local church-run elementary schools, combined the functions of a secondary school and a set of professional schools. (The upgrading of the B.A. curriculum to a level above that of secondary schools occurred in Europe only in the early nineteenth century, at a time when true secondary schools began to develop.) The American college, however, had lost the professional training and certification functions at the outset. British legal tradition, on which Americans drew, trained lawyers outside of the universities; Protestant denominational opposition to the established church forms of university theology schools resulted in the elimination of a theology faculty; and the tiny and widely dispersed American medical profession precluded the establishment of university medical faculties. The colonial college was essentially the equivalent of a secondary school, and the nineteenth-century colleges were faithful imitations of the original models.18 In the absence of secondary schools in America, the college acted as a substitute; moreover, the range of ages of the college students—from the early teens to the mid-twenties—was in fact equivalent to that of the few secondary schools that began to emerge in mid-century.
The nineteenth-century American college was primarily an instrument for inculcating religious piety, according to the tenets of particular denominations, and for conferring whatever status still remained in the concept of a college and its degree. Only the status certification function of the college was important at this time, for the training inculcated via the classical curriculum and stern discipline was of little intellectual or practical value. The college was distinguished from the secondary school only by its name and its degree, not by its substance. The origins of the large number of American colleges cannot be attributed to the demands of the economy nor to the demands of the industrial revolution; not only did the colleges precede the economic expansion, but also their training was in no way designed for practical skills.
By the 1850s, the American colleges faced a crisis. The colleges were small and continually in financial difficulties. The failure rate was high: Feverish entrepreneurship in the educational sphere had founded perhaps 1000 colleges before the Civil War, of which over 700 failed (Rudolph, 1962: 47). Clearly, there were far too many colleges for the demand. But in midcentury, the crisis appeared to be worsening: Colleges not only failed to grow, but they also shrank in size; the number of college students in 1870 in New England had been declining for some time, both absolutely and in proportion to the population (Rudolph, 1962: 218). Moreover, attacks began to be heard upon the colleges and the curriculum as irrelevant to the lives and interests of the vast majority of the populace. Yet this was a period of rising literacy in the populace, a time of the consolidation of public elementary schools, and the beginnings of public secondary schools. A larger potential clientele was becoming available, yet the colleges’ situation was becoming worse rather than better.19
The crisis was clearly a result of overextension; the United States, with perhaps five times as many colleges as the rest of the world together, had far too many colleges for the demand. Still, this overextension had surely existed throughout the nineteenth century; the crisis also seems to reflect a change in the nature of the value of the college education and the rise of an external source of competition for the colleges. It appears that the status value of a college education had become diluted by the large numbers of colleges. Offering little besides an imitation of the high-status New England colleges, the newer colleges succeeded only in reducing the value of a college degree by making it more widely available, much as an exclusive club loses its desirability by being thrown open to the public: Its charm existed largely in its closed doors; once inside, the public was likely to find that its armchairs were musty and uncomfortable. The educational entrepreneurs who traveled the country from college to college, often founding and abandoning several, had finally debased the currency they had hoped to build into status fortunes.
The expansion of secondary schools may have contributed to the crisis. To be sure, secondary schools shared the liabilities of the colleges in providing little educational substance for which there was a real demand; but they had the advantages of child labor laws, the elementary school movement and its related compulsory attendance laws, as well as freedom from tuition and location near the homes of their clientele. Moreover, the age range of secondary school and college students was substantially the same; only a small proportion of the population attended either type of institution (see Table 1.1), so they competed for the same limited pool of students.
The colleges’ only weapon was their certification power, but that power was waning because it was based on the status appeal of an exclusive medieval institution that was rapidly losing its exclusiveness. The curriculum of the college met little public demand, and professional training in the ministry, law, and medicine could be obtained outside the colleges. The first answer of the educational entrepreneurs was to initiate a few cautious experiments with a vocational course, but this in itself met little success. The old curriculum was repeatedly attacked as providing no useful training, yet when nonclassical programs were introduced (such as at Miami, Virginia, and New York in the 1820s and 1830s) as alternatives to the classical B.A. course, they failed for lack of students (Rudolph, 1962: 126–129, 238–240). The old method of certification was becoming less respected, but new methods that based their appeal solely on the technical training provided (the programs mentioned did not culminate in the B.A. but only in a certificate of proficiency) evoked even less demand than the classical course. Utilitarianism seemed an easy solution to a few reformers, but the conservatives saw more clearly when they clung to the old forms of status conferral. As one college president put it: “While others are veering to the popular pressure…let it be our aim to make Scholars and not sappers or miners—apothecaries—doctors or farmers.”20 The function of the college was primarily certification and not training; to advertise college education purely on utilitarian grounds was to lose even its certification power.
In the 1870s, the leading entrepreneurs hit on a reform, a shift from the “classical” college to the modern university. In the 1880s, the new form spread rapidly, and the fortunes of American higher education began to improve. The rate of failures fell off sharply, the numbers of institutions increased rapidly into the twentieth century, the size of student bodies and of faculties increased, and the ratio of students rose steadily from 1.7% of the 18- to 21-year-old population in 1876 to 53% in 1970. The innovation was initiated by a few new universities such as Johns Hopkins and Cornell in the 1870s and 1880s and by some older colleges such as Harvard, which reformed themselves into universities. The success of these leaders resulted in rapid emulation. In the background, of course, was the increasing conflict of ethnic-class cultures, which gave the old cultural institutions a new urgency in a period of stepped-up immigration.
Curricula were changed and postgraduate (M.A. and Ph.D.) studies were added to the B.A. program. The curriculum change was brought about primarily by the introduction of the elective system in which the student was given a choice of a variety of course offerings (Rudolph, 1978). This necessitated going beyond the classical curriculum and resulted in a great expansion of courses in the sciences, modern languages and literature, social studies, and eventually in vocational fields. The creation of graduate schools had a similar effect on the curriculum: the superseding of the classical curriculum by a variety of specialized scholarly and scientific fields of study.
As Veysey has shown, these reforms were carried out in the name of a variety of ideals: practical utility, pure science, and scholarship or high culture (Veysey, 1965). Although the proponents of these ideals were often quite antagonistic to each other, none of them was able to prevail over the others, nor indeed to have much effect in shaping the lives of the students. Nevertheless, these ideals appear to have played a crucial part in overcoming the crisis in higher education; they provided the necessary public relations to revive the prestige of the college degree and attract large numbers of students.
The utilitarian ideal had been tried before, but although usefulness may provide a good standard by which to declare the classical curriculum a dismal failure, it does not attract students when schools offer training they can get just as well outside (see Chapter 1). By emphasizing training alone, universities stand in danger of losing their main value, status certification. Although utilitarian rhetoric was used to pass the Morrill Act (setting up land-grant colleges) in Congress in 1862, little attention was paid to organizing practical courses in agriculture, and when such courses were started, they attracted few students (Curti, 1935: 212). The only utilitarian claim for a university education that seemed successful was the claim to educate a political elite in the arts of public statesmanship, an ideal that was successfully used at Cornell and later at Princeton and elsewhere.
Science, scholarship, and high culture appealed primarily to those who wished to perpetuate the old cultural elite. Interestingly enough, these tended to become a specialized occupational group: those who expected to become university teachers. Since this was a period of the expansion of universities, the demand for teachers was great enough to make possible the expansion of graduate schools, which would be justified as training teachers. The undergraduate emphasis on scientific and scholarly subjects probably had little more effect in training the minds of most students than the classical curriculum had, but it at least carried with it the prestige of a small but prominent and growing profession, that of the scientific researcher. Thus, by offering an education described glowingly, if vaguely, in terms of some larger utility, and carrying the prestige of science for students who would not themselves become scientists (nor, generally speaking, study science very hard), the educational entrepreneurs managed to revive the status of the college degree.
The main appeal of the revitalized university for large groups of students was not the training it offered but the social experience of attending it. The older elite was being perpetuated in a new, more easygoing form. Intercollegiate sports grew up at the same time as the university revolution was being carried out, and the same elite institutions led both innovations (Rudolph, 1962: 373–393). Through football games, colleges for the first time became prominent in the public eye, and alumni and state legislators found renewed loyalty to their schools. At the same time, fraternities and sororities became widespread, and with them came college traditions of drinking, parties, parades, dances, and “school spirit.”21 It is little exaggeration to say that the replacement of the pious, unreformed college by the sociable culture of the university was crucial in the growth of enrollments, or that football rather than science was the salvation of American higher education.
The two developments—the rise of the hedonistic and ritualistic undergraduate culture and the transformations of curriculum—need not be opposed in this way, however, because they occurred together as part of the same effort to revive the status of the colleges. The rise of the undergraduate culture indicates first of all that college education had come to be treated as consumption by the new industrial upper classes, although it also attracted growing numbers of the intellectually oriented and those seeking careers in teaching. College attendance had become an interlude of fun in the lives of upper-class and upper-middle-class young Americans, and the rise of enrollments must be partly attributed to the rising standard of living. But the rituals of undergraduate life had another important aspect: They were direct expressions of the informal side of stratification, sociability. Through participation in the parties and pranks of college life, young Americans formed and consolidated friendships. Sororities arose, as Scott has shown, through the efforts of parents to promote class endogamy (1965: 514–527). Fraternities, which had already existed in the 1830s, became all-important. The collegiate culture took the function of bringing together the children of the upper middle class, forming them into groups of friends bound together by sentiments of college activities and eventually intermarrying (see Baltzell, 1958: 327–372). The new style football-and-fraternity college thus became important in forming elite status groups in America, recapturing in secularized form the status of the early colonial colleges.
That the initial success of the new university model was more a result of its status appeal than its efficiency in training is shown by the defeat of the “acceleration” movement in the 1880s. Some utility-minded administrators attacked the 4-year curriculum as a useless tradition inherited from the medieval universities, and they introduced reforms to allow students to move through college at their own pace, acquiring their training on an individual basis (Rudolph, 1962: 446–448). This effort to put training back as the central function of the college was a failure. Students did not want to disturb the rituals of freshman and sophomore class rivalries, junior dances, and senior privileges. The leaders of the acceleration movement, taking too seriously their own rhetoric of reform, misconceived the appeal of college education to many of their clients: Most students found the essence of college education to be the enjoyable and status-conferring rituals and social life of college rather than the content of classroom learning.
The internal form of the university was shaped by its history of seeking students and catering to their desires for enjoyment and certification. The 4-year curriculum remained because of the value placed on traditional forms of certification. As a result of the long period of seeking students, the colleges had become largely secular and nondenominational; they could not afford to exclude students because of their religion. Similarly, until well into the twentieth century, admission standards were low or nonexistent. The need for a clientele also broke down sex requirements. With the period of university reforms, coeducation was introduced and spread widely in most of the larger and more successful schools, a process that fed upon itself, since as the marriageable males concentrated in the colleges, the women had a strong incentive to follow. The expansion of credentialed employment in the public schools at this time also drew many women by offering the prospect of careers in teaching.
A prestige hierarchy began to emerge among American colleges and universities for the first time during the period of reform. The schools that led the reform—the original colonial colleges, the heavily endowed new private universities, and the well-supported midwestern state universities—soon set themselves apart from those who were slow to follow. As the period of expansion set in, the leading schools began to attract thousands of students while those that lagged behind in reforms remained tiny. Financial differences followed suit, as did intellectual differences, due to the quality of the faculties that could be attracted by schools of varying resources. With the rise of football and its accompanying undergraduate culture, Yale, Harvard, and Michigan became household words. Once this prestige hierarchy was established and buttressed by financial success, what Riesman called the “academic procession” was formed, and the conservative colleges at the rear were forced, however unwillingly, to emulate their wealthy and famous rivals (Riesman, 1958: 25–65; Jencks and Riesman, 1968).
The leading universities were now in a position to set standards throughout the educational world and began to rise to a position from which they dominated all other forms of education: secondary, professional, and teacher training. In the 1890s, the leading universities, organized in the American Association of Universities, began to exercise accrediting power over secondary schools by setting up standards of admission. Secondary schools became more clearly separated from elementary schools, and a specifically defined age group for each was enforced by the universities’ newly instituted requirement that 12 years of school precede college admission (Rudolph, 1962: 281–286; Wechsler, 1977). It was Harris’s response to this pressure from the universities (see page 155) that caused him to oppose vocational instruction in the public high schools. Thus college education linked itself specifically to the 12-year secondary school system. Aptitude and achievement tests came later, in the twentieth century. A high school diploma was the first formal requirement for college admission and remains so even today, when standardized tests geared to individual attainment could take its place.
Through the prestige of its social and curricular innovations, the new American university became the major influence in the world of education. This influence was expressed in several ways. Those schools that could do so attempted to become universities themselves. Other schools, such as those for professional and vocational training, attached themselves to universities or were annexed by them. Competing forms of education had to establish niches for themselves that did not draw on the same group of students. This was done by tailoring their curricula as either preceding or following the university’s B.A. program.
As Jencks and Riesman (1968) showed, specialized forms of higher education, from teacher-training schools to fundamentalist colleges, Catholic universities, women’s colleges, and community-oriented junior colleges, all have come to emphasize the same scholar-dominated disciplinary sequences as the leading universities have. Beginning in the 1890s, normal schools (teacher training institutes) developed the standard 4-year B.A. curriculum requiring a high school diploma for admission. In this way, many state normal schools transformed themselves into state colleges, and decades later some of these colleges succeeded in establishing graduate departments and renaming themselves universities. Similarly, the state agricultural and mechanical universities set up under the Morrill Act tended to downgrade their vocational functions and to expand their arts and sciences offerings in imitation of the proliferating specialties of the growing major universities with their large numbers of research-oriented scholars. Even the junior college movement, designed expressly as a community service and vocational venture in direct opposition to the scholarly curricula of the universities, could not uphold its distinct curricular ideals. The irresistible force was the students, who greatly preferred the liberal arts, transfer-preparatory programs to the job-oriented terminal programs. Designed as an alternative to the university, the junior college became a last chance for educational mobility for students with poor secondary school records and an agency for “cooling out” those who would not make it into the university (Clark, 1960: 569–576; Jencks and Riesman, 1968: 480–509).
The most dangerous potential rival to the early American university was the professional school, particularly since it drew from the same pool of students. In the late nineteenth century, a B.A. was nowhere required in order to attend a school of law or medicine, and the criticism of the uselessness of the classical college education might have led to a shift of students away from the colleges and toward immediate training in the professions. But instead of substituting for college education, the professional schools became subordinate to the B.A. program. Universities moved to found their own professional schools and make them into graduate schools, requiring a B.A. for admission. This eliminated the competition for students and increased the attractiveness of the B.A. by giving it a palpable value for a visible, if somewhat artificial, career sequence. Professional schools did not resist, for they had troubles of their own, as we shall see in Chapter 6. Professions, after all, maintain sometimes precarious monopolies over practice, especially by controlling entry into their ranks, and the professions in America were wracked by internal status conflicts in the late nineteenth century. The old traditional Anglo-Protestant status of the universities was a handy weapon for the professional elites, defending their hold against ethnic and lower-class intruders. Thus an alliance was forged and the power of the universities established.
The universities thus consolidated their newly reestablished prestige and firmly established the 4-year B.A. curriculum, or analogues of it in some vocational fields (engineering, business administration, education, nursing), as a universal stage in a sequence toward post-high-school certification. Whatever the amount of training necessary, 4 years of it was to be required; moreover, any field aspiring to high professional status must establish itself in the sequence after the B.A. First with law and medicine (and trickling down within these professions from the most prestigious professional schools to the others by emulation), and later in engineering, business administration, and education, the post-B.A. program became the route to prestige, formalized in employment requirements.
Forms of education that competed with the universities never acquired any status: Correspondence courses and commercial technical schools (such as those offering limited training in engineering specialties or commerce) led a marginal existence, attracting enough students to make a steady profit only in fields of very short labor supply (such as computer programming in the 1960s), or jobs of low status (auto mechanics, secretaries), or by resorting to unscrupulous advertising and financing plans (see Hollingshead, 1949: 380–381; Clark and Sloan, 1966). Such schools emphasized only training; they required no high school diplomas (which, after all, may be quite irrelevant to learning a specific technical skill); they did not prolong instruction for 4 years, nor include extraneous, nontechnical courses, nor take up time with extracurricular activities; they certified only the specific skill trained, not a diffuse social status. Yet it is the name of the college degree, the medieval terminology of the B.A. and its modern elaborations, that carries the prestige and attracts the students. Here, as elsewhere, certification prevails over training.
By the mid-twentieth century, the universities had realized a self-fulfilling prophecy. By continually harping on the unspecified but great usefulness of the college degree for “success,” universities had succeeded in surviving, and growing, until the point at which college education could be seen to have specific payoffs. With the rise of extensive graduate education, both in the professions and for scholarly careers (the latter having become increasingly attractive because of the continual expansion of positions for undergraduate teachers), the B.A. program was transformed into a link in a sequence, to be justified by pointing to the next link. Thus colleges traded on their old status skillfully enough through a period when a college degree had no links to further education until that further education arose to justify the undergraduate course.
History helped the universities to cover their tracks in another respect. By continually telling the public that its education led to elite positions and by offering the opportunity for social mobility, the university attracted most of the members of the populace who had any chance to reach elite positions. The colleges now were able to declare the truth of their prophecy, since having attracted most of the upper-middle and upper classes, and then the middle class and the most ambitious and intelligent members of the lower class, they could point out that the elite positions in American society were increasingly filled by college graduates; they neglected to mention that there were now large numbers of college graduates who did not reach high-status positions. By their very numbers, college graduates could no longer count on elite status, since they came to exceed the number of elite positions available (even including those positions they could create).22 But the colleges had carried out a fait accompli, and there was no turning back. Now college graduation had become the requirement for many positions for which no such education had been required before. College education, once an incidental accompaniment of high status, now became the prerequisite of mere respectability.
Educational credentials, as we have seen, first became established within the school system itself. The crucial point was when the elite universities began to accredit high school programs for college admission, thus resolving part of their struggle for an amorphous common age group of potential students. The other half of this struggle was resolved when the universities’ other rival, the professional schools, began to link themselves to colleges by making a bachelor’s degree a prerequisite to entering their programs. Thus educational credentials first became formal requirements for advancement of students within a sequence of schools.23 At first, then, degrees, like grades and test scores today, had no directly negotiable value in the occupational realm but did determine one’s movement to a higher level of schooling. The advanced school levels then began to take on occupational significance in particular professions as these obtained state-licensed monopolies incorporating credential requirements. The process by which this developed and the further spread of educational credentials outward into the rest of the economy is the subject of the next chapter.
The major impetus for this whole development, I have argued, was the severe multiethnic conflicts of American society after the mid-nineteenth century. Insofar as the school system was created to resolve the strife by reducing cultural diversity, one can say it has met with a degree of success. It did manage to make training in Anglo-Protestant culture and political values compulsory for all children up to a certain age, and it did make it virtually compulsory for a continually increasing period beyond this if the student wanted to be economically successful. Yet one could not call this a total success in its own terms. The creation of a massively inclusive educational system has caused schools to become internally bureaucratic parts of an indefinitely lengthy sequence of obtaining a negotiable cultural currency. Hence the content of education has become increasingly irrelevant, except in the very short run of passing a particular course or even cramming for an exam. The results have been stored almost entirely in the cryptic records of credits, grade-point averages, and degrees. The system that was to preserve Anglo-Protestant culture has thus also tended to destroy it.
At the same time, the diversity of ethnic cultures in the populace has given way only gradually. Ethnic cultures are still at least mildly in evidence today, three full generations after the cutting of mass immigration in 1922. And newly mobilized and rather sharply divergent ethnic cultures—black and Latin—have put renewed pressure upon the legitimacy of the culture imposed by the schools. But for all their vehemence, these conflicts are now safely channeled within an institutionalized system. For all their rhetoric of ethnic cultural preservation and even separatism, the main result of these conflicts has been to make the abstract bureaucratic credentials easier for members of these ethnic minorities to obtain. Even this has had ironic consequences. Easier passage through lower or intermediate levels of the educational system has had the aggregate effect of devaluing those levels of the credential currency and mounting pressures for yet higher levels of attainment. However these ethnic struggles are resolved, the expansion of the credential system is quite possibly nowhere near its outer limits.
Country         Percentage completing    Percentage attending    Percentage graduating
                secondary school^a       university              from university
United States           75                       39                      18
USSR                    47                       19                       9
Japan                   57                       11                      12
France                  30                       11                      —
England                 12                        6                       5
West Germany            11                        6                      —

Sources: Bereday (1969, 80, 281); Havighurst (1968, 55–56); Blewett (1965, 113, 118, 122, 158–159).
^a Percentages refer to relevant age groups.
There were more “socially maladjusted people” in this country than ever before, the Board (quoting liberally from the Reader’s Digest) suggested; “crime and disobedience to the law” were on the upswing; thousands of Americans were killed and maimed every year in automobile accidents. The nation could take a vital step in the direction of solving these problems by providing more vocational education; the crime rate could be lowered and the many Americans handicapped by accidents could be rehabilitated.