The Reconstitution of the Hospital
FEW institutions have undergone as radical a metamorphosis as have hospitals in their modern history. In developing from places of dreaded impurity and exiled human wreckage into awesome citadels of science and bureaucratic order, they acquired a new moral identity, as well as new purposes and patients of higher status. The hospital is perhaps distinctive among social organizations in having first been built primarily for the poor and only later entered in significant numbers and an entirely different state of mind by the more respectable classes. As its functions were transformed, it emerged, in a sense, from the underlife of society to become a regular part of accepted experience, still an occasion for anxiety but not horror.
The moral assimilation of the hospital came at the end of the nineteenth century with its scientific redefinition and incorporation into medicine. We now think of hospitals as the most visible embodiment of medical care in its technically most sophisticated form, but before the last hundred years, hospitals and medical practice had relatively little to do with each other. From their earliest origins in preindustrial societies, hospitals had been primarily religious and charitable institutions for tending the sick, rather than medical institutions for their cure. While in Europe from the eighteenth century they played an important part in medical education and research, systematic clinical instruction and investigation were neglected in America until the founding of Johns Hopkins. Before the Civil War, an American doctor might contentedly spend an entire career in practice without setting foot on a hospital ward. The hospital did not intrude on the worries of the typical practitioner, nor the practitioner on the routine of the hospital.
But in a matter of decades, roughly between 1870 and 1910, hospitals moved from the periphery to the center of medical education and medical practice. From refuges mainly for the homeless poor and insane, they evolved into doctors’ workshops for all types and classes of patients. From charities, dependent on voluntary gifts, they developed into market institutions, financed increasingly out of payments from patients. What drove this transformation was not simply the advance of science, important though that was, but the demands and example of an industrializing capitalist society, which brought larger numbers of people into urban centers, detached them from traditions of self-sufficiency, and projected ideals of specialization and technical competence. The same forces that promoted the rise of hospitals also brought about changes in their internal organization. Authority over the conduct of the institution passed from the trustees to the physicians and administrators. Nursing became a trained profession, and the division of medical labor was refined and intensified, as conceptions of efficient and rational organization prevailing elsewhere in the economy were applied to care of the sick. The sick began to enter hospitals, not for an entire siege of illness, but only during its acute phase to have some work performed upon them. The hospital took on a more activist posture; it was no longer a well of sorrow and charity but a workplace for the production of health.
The effects of this change rippled outward, altering the relationship of doctors to hospitals and to one another and shaping the development of the hospital system as a whole. Once the hospital became an integral and necessary part of medical practice, control over access to its facilities became a strategic basis of power within the medical community. The tight grip that a narrow elite long held over hospitals no longer seemed tolerable to other physicians, who responded by forming their own institutions or pressing for access to established ones. Under financial pressures and the threat of increased competition, the older hospitals gradually opened their doors to larger numbers of practitioners, creating a wider network of associations stratifying and linking together the profession in new and unexpected ways.
The access that private practitioners gained to hospitals, without becoming their employees, became one of the distinctive features of medical care in America, with consequences not fully appreciated even today. In Europe and most other areas of the world, when patients enter a hospital, their doctors typically relinquish responsibility to the hospital staff, who form a separate and distinct group within the profession. But in the United States, private doctors follow their patients into the hospital, where they continue to attend them. This arrangement complicates hospital administration, since many of the people making vital decisions are not the institution’s employees. Yet it also may encourage more private relationships between doctors and patients than exist where patients are attended solely by salaried hospital physicians.1
The terms “public” and “private” refer both to individual experience (its visibility to other people) and to the structure of institutions (their relation to the state). In both of these senses, hospital care in America has generally had a more private character than it has elsewhere. American hospitals not only have private doctors; their architecture creates more private space for the treatment of patients. Hospitals in Europe and elsewhere typically offer more of their care in large open wards, while American hospitals tend to be smaller in size, with more private accommodations. The economic organization of hospitals in the United States also reflects a less public conception of their function. Instead of a centralized system of hospitals under state ownership, America developed a variety of institutional forms—a kind of “mixed economy” in hospitals—with both public and private institutions of several kinds under independent management. The institutional transformation of the late nineteenth century did not lead to any higher-level coordination. Both internally and as a system, American hospitals have had a relatively loose structure because of the autonomy of physicians from hospitals and of most hospitals from the government. While hospitals changed radically, private interests, as well as the interests of privacy, were preserved and even strengthened.2
Hospitals Before and After 1870
The reconstitution of the hospital involved its redefinition as an institution of medical science rather than of social welfare, its reorganization on the lines of a business rather than a charity, and its reorientation to professionals and their patients rather than to patrons and the poor. I state the changes rather sharply for emphasis; they need to be qualified in some particulars. Well before 1870 private voluntary hospitals in America emphasized active medical treatment and received some paying patients; well after 1910 they remained legally under the control of trustees as charities rather than profit-making firms. But as hospital care turned into a sizeable industry at the end of the nineteenth century, medical activism, professional dominance, and an orientation to the market became much more pronounced and widespread, even in voluntary institutions. And large numbers of new hospitals were established, the majority as business enterprises.
The late nineteenth century in America was a period of economic expansion and rapid institutional development that saw not just an increase in the number of organizations of all kinds, but also a renovation in their structure. The growth of business corporations, as Alfred Chandler has pointed out, was accompanied by the emergence of a salaried management and the multiunit firm. The rise of hospitals, as of universities, offers a study in the penetration of the market into the ideology and social relations of a precapitalist institution. As the university became more actively concerned with preparing students for practical careers, it moved from gentlemanly to utilitarian values and accorded more prominence and autonomy to its professors. As the hospital advanced in its functions from caretaking to active treatment, it shifted in its ideals from benevolence to professionalism and accorded its physicians greater power. In orienting their efforts to newly marketable services, both institutions became less concerned with moral supervision and turned more squarely to professionals to carry out their new productive functions.3
Set in a wider historical frame, the reconstitution of the hospital belongs to the general movement in social structure from “communal” to “associative” relations. As Weber made the distinction, communal relations refer to the bonds of families and brotherhoods and other ties of personal loyalty or group solidarity; associative relations involve economic exchanges or associations based on shared interests or ends.4 The shift from the communal to the associative has taken place in two ways. Not only have the household and community given up functions to formal organizations; the organizations themselves have also changed. Institutions that were once primarily communal have become increasingly associative. This has been true historically even of corporations. The concept of the corporation was originally applied to monasteries, towns, and universities, where members were related to each other not by owning things in common but by living and working together. Corporations were communities. Only later did the corporation take on an abstract existence as an entity for doing business.5 This same evolution took place in the development of hospitals, which include some of the oldest corporations in continuous existence. Medieval hospitals were conducted by religious or knightly orders and had a strong communal character; those who worked there were bound together in a common identity and belonged to a common household. “Even when hospitals were taken over from the ecclesiastical authorities by municipalities in the later Middle Ages,” writes George Rosen, “they were not secularized. Essentially, the hospital was a religious house in which the nursing personnel had united as a vocational community under a religious rule.”6 In a different way, the almshouses of colonial America, which were the first institutions here to care for the sick, retained a communal character. The colonial almshouse, David Rothman writes, provided a “substitute household” for people without a home who were poor or sick. “The residents were a family, not inmates.” Even the architecture of the colonial almshouse, which resembled an ordinary residence, reflected its conception as a household. In the language of architectural historians, its social structure, as well as its architectural form, was “derived” rather than “designed.”7
Later almshouses and hospitals, with a distinctly public architecture, became more bureaucratic than familial in their internal organization. Early hospitals had a fundamentally paternalistic social structure; their patients entered at the sufferance of their benefactors and had the moral status of children. The staff, who often resided as well as worked within the hospital, were subject to rules and discipline that extended into their personal lives. A steward and matron, who might be husband and wife, presided over the hospital family. As the hospital has evolved from household to bureaucracy, it has ceased to be a home to its staff, who have come to regard themselves as no different from workers in other institutions. In their relation to patients and the public, hospitals have come to rely less on charity and more on payments for services. The modern history of the hospital has seen a steady stripping away of its communal relations as it has more closely approached the associative structure of business organizations.
The development of American hospitals, Henry Sigerist once suggested, recapitulates in shorter time the historical phases of European hospitals.8 First came the almshouses and similarly unspecialized institutions, serving general welfare functions and only incidentally caring for the sick. Founded as early as the seventeenth century in America, they received dependent persons of all kinds, mixing together promiscuously the aged, the orphaned, the insane, the ill, the debilitated. Next appeared hospitals serving the sick but still limited to the poor; finally, in the nineteenth century, hospitals serving all classes of society emerged. In other words, the almshouse metamorphosed into the modern hospital first by becoming more specialized in its functions and then by becoming more universal in its use. In 1752 the Pennsylvania Hospital in Philadelphia became the first permanent general hospital in America built specifically to care for the sick; it was followed by New York Hospital, chartered in 1771, but not opened until twenty years later, and the Massachusetts General Hospital, opened in Boston in 1821. These were later to be called “voluntary” hospitals—voluntary because they were financed by voluntary donations rather than by taxes.
The establishment of these first hospitals did not signal the decline of the almshouse. On the contrary, almshouses became more important in the nineteenth century than they had been in the eighteenth. In the colonial period, the almshouse was a secondary response to poverty and illness. As I indicated earlier, the colonists preferred to provide relief to the poor in their own homes, or to pay neighbors for taking care of the feeble and the sick. Institutions were a last choice, to be used for strangers or especially onerous cases. But after about 1828, there was a shift in policy as states abolished home relief (generally reinstating it only during periods of economic distress). By making the almshouse the only source of governmental aid to the poor, legislatures hoped to restrict expenditures for public assistance. Often squalid and overcrowded, a place of shame and indignity, the almshouse offered a minimal level of support—its function as a deterrent to poverty and public assistance ruled out any amenities. Deterioration and neglect were common. Reformers, especially after the Civil War, devoted much of their effort to splitting up the undifferentiated almshouse and sending orphaned children, the insane, the blind, and the sick to institutions specifically concerned with their problems. In a number of cities, public hospitals evolved out of almshouse infirmaries. The Philadelphia Almshouse became Philadelphia General Hospital; Manhattan’s Bellevue Hospital grew out of the New York Almshouse; the Baltimore County Almshouse became part of the Baltimore City Hospitals.9
Early American charity hospitals developed in a complementary relation to previously established almshouses and public hospitals. They were an attempt not only to separate out some of the sick from the poor and dependent, but also to provide a somewhat better alternative for the more respectable poor with curable illnesses, as well as a haven for occasional well-to-do people in special circumstances. Voluntary institutions, like the Pennsylvania and Massachusetts General Hospitals, were generally kept cleaner and better maintained and had less of a moral stigma than the almshouse, although they were still not widely used by members of the middle and upper classes.10 Anxious to give these hospitals a more attractive identity and to make them safer and more acceptable, their managers and physicians excluded dangerous or morally reprehensible cases. The contagiously ill they sent to the pesthouse, and the incurable and chronically ill, as well as those whom they thought wicked and undeserving, they sent to the almshouse. Such exclusions enabled the hospitals to restrict the number of patients they admitted and to keep down the reported mortality rates, since the hopelessly ill could be directed or transferred elsewhere before they became a blot on the hospital’s good name. This practice was encouraged by the medical staff, since the hospital would be less useful as a source of instruction to students if it filled up with chronic cases.
But, most of all, the exclusion of undesirable cases served to combat the traditional image of the institution as a house of death. Early hospitals were considered, at best, unhappy necessities. Reflecting on his experience during the Revolutionary War, Benjamin Rush had called them “the sinks of human life in an army” and hoped that the progress of science would go so far “as to produce an abolition of hospitals for acute diseases.” Many early attempts to build hospitals aroused public opposition, especially from those who lived in the vicinity. Skepticism about their value was far from irrational. Mortality after surgery, according to data from English hospitals published around 1870, was not only higher in hospitals than at home, but it rose with the size of the hospital. In an essay awarded a prize by Harvard University in 1876, Dr. W. Gill Wylie could write that civilization had not yet reached “that state of perfection where hospitals can be dispensed with.” Accident casualties, victims of contagious epidemics, soldiers, homeless paupers, and the insane required hospital care. But to extend hospitals any further was to encourage pauperism, idleness, and the breakup of the family. Hospitals, Wylie thought, “tend to weaken the family tie by separating the sick from their homes and their relatives, who are often too ready to relieve themselves of the burden of the sick.”11
Up to the time Wylie wrote, hospitals had been formed mainly to take care of people who did not fit into the system of family care. The earliest hospitals were built chiefly in ports or river towns—Philadelphia, New York, Boston, New Orleans, Louisville—centers of commerce where strangers were likely to be stranded sick or where people were likely to be found working and living alone. Institutional charters and appeals for funds alluded to the needs of such people. In 1810, when Doctors James Jackson and John C. Warren circulated a letter to some of the “wealthiest and most influential citizens” of Boston to interest them in a hospital, they mentioned, as cases in need, journeymen mechanics living in boarding houses, widowed or abandoned women, servants, and others who had no adequate housing or kin to care for them. While only scattered figures are available, isolated individuals seem to have been disproportionately represented among patients in general hospitals.12
The impulse for founding the early hospitals typically came from physicians who struck up alliances with wealthy and powerful sponsors. Doctors had an interest in creating hospitals as a means of developing medical education and as a source of prestige. The status and influence they derived from hospital positions were of such value to them that they gave their services to the hospitals without pay. In fact, at the founding of the Pennsylvania Hospital in 1751, three doctors were so eager to serve as its staff that they volunteered to provide all medicines for three years at their own expense, as well as free services.13 But in spite of the advantages doctors derived from hospitals, they could not establish them independently under their own control, for lack of funds and because of distrust of their motives. Particularly distrustful were the sick poor, who feared they might be used for surgical experiments or, in the event of death, turned over to medical students for dissection. Needing capital and legitimacy, the doctors were obliged to seek out the sponsorship of merchants, bankers, lawyers, and political leaders, who could contribute money and lead subscription campaigns. As a result, there developed an organizational structure in which boards of managers, trustees, governors, or commissioners, rather than physicians, retained the final decision-making power in private as well as public hospitals. This arrangement had its direct antecedents in England, but it would not have been reestablished in American communities unless strong forces continued in its favor. So long as doctors could not get hospitals to yield a return on the needed investment, their dependence on sponsors was unavoidable. In Reading, Pennsylvania, in the late 1860s, local physicians interested in founding a dispensary and hospital quickly realized, according to a history of the Reading Hospital, “the importance and necessity of obtaining the cooperation of certain citizens representative of the professional and business interests” of the city. Exercising great care, they chose representatives of “bench and bar, banking, the iron, lumber, publishing and brewing industries, as well as railroad and navigation, and of course, the political representatives of city, state and federal government.” The local historian who describes these choices then perspicaciously remarks, “The men engaged in these pursuits—because of their wealth and professional standing—were bound up with the interests of the community in innumerable other ways. Churches, schools, charitable organizations, and all the intricate network of communal intercourse found expression through these leading and responsible citizens.”14 Such are the advantages of having a ruling class.
For the sponsors of hospitals, the benefits were various. As so often happens to the rich and successful, by serving a social interest, they could advance their own. No doubt, hospitals helped to satisfy a genuine sense of religious obligation to the helpless; the institutions might also bring about an improved standard of medical practice by giving young physicians experience working under supervision with the poor; and they might even prove a sound investment for the community by restoring to productive labor people who might otherwise become public charges. These were the kinds of considerations—the manifest functions, as sociologists say—that dominated the rhetoric of motivation. At another level, not to be overlooked but not to be exaggerated either, hospitals also conferred a certain amount of power on their trustees through management of the endowment, the letting of contracts, patronage in appointments, and even the admission of patients. In the nineteenth century, the trustees or managers entered directly into the detailed operation of hospitals, including decisions that now would be seen as strictly medical. To gain entry to a “free bed”—one that was privately endowed and required no payment—a patient generally needed a letter from a trustee or subscriber who previously had contributed to the hospital. Thus the links between the donor and recipient of charity were sometimes quite explicit and personal. The sponsorship of hospitals gave legitimacy to the wealth and position of the donors, just as the association with prominent citizens gave legitimacy to the hospital and its physicians. Hospital philanthropy, like other kinds of charity, was a way to convert wealth into status and influence. George Templeton Strong, a Wall Street lawyer active in founding St. Luke’s Hospital, noted in his diary in May 1852, after John Jacob Astor had decided to donate $13,000, “If he and Whitney and the other twenty or thirty millionaires of the city would do such things oftener, they would never feel the difference, and in ten years would control the course of things in New York by the public confidence and gratitude they would gain.”15 An exaggeration without question, but not without some truth to it: witness the later philanthropy of the Rockefellers. Charity, too, pays dividends. Besides softening public hostility toward accumulated wealth, it also helps secure status within an upper class, which is likely to be the chief reference-group of the donor. Membership on the boards of hospitals and other private institutions became an important index of social position. In New York City, according to a historian of its Jewish community, Jews’ Hospital (later Mt. Sinai) developed within a few years after its founding in 1852 into “the most important Jewish organization in the city.” The hospital’s annual public dinners were the most lavish ever held among New York Jews, and the success of the city’s rising German Jews in securing seats on the hospital’s board soon after it was created signaled the end of their subordination to the more established English and Portuguese Jewish elite.16
Despite the various indirect incentives to contribute, donations and bequests generally did not cover the costs of voluntary hospitals. The institutions turned instead to their patients for funds, requiring them to pay at least part of the cost of their treatment. At the Pennsylvania Hospital between 1751 and 1850, according to one study, 70 percent of the mental patients had their treatment paid for, compared with 39 percent of the medical patients and none of the maternity cases.17 These figures may not have been typical, but the pattern probably reflects the diminishing proportion of persons in each category from middle-class, or at least self-supporting, families. Perhaps the presence of paying patients took away some of the traditional odium that had hung about the hospital. In America, the identification of hospitals with the pauper class was never as absolute as in Europe. On hospital wards, paying and free patients were treated together, while some wealthier individuals paid for private rooms apart from the rest. However, even these few private patients paid no fees to physicians. A tradition had been established in both public and voluntary hospitals that physicians were not supposed to take money for work there. As a charity, the hospital lay outside the theater of production and exchange.
The Making of the Modern Hospital
Primarily because of increased concern for cleanliness and ventilation, hospitals began to emerge from obloquy and disrepute even before any major technological advances had been made. During the Civil War, hospitals were no longer the sinks of human life that Benjamin Rush had mourned during the Revolution. The Union built a vast system of over 130,000 beds by the last year of the war and treated more than a million soldiers with a mortality of only eight percent. While the germ theory of disease was yet to be fully formulated, hospital authorities had heeded some of the lessons of Florence Nightingale, who through improved hygiene had reduced the death rate from forty to two percent at the British military hospitals in Scutari during the Crimean War.18
Two developments after the Civil War—one in organization, the other in medical knowledge—furthered the tendencies toward order and cleanliness already at work. The first was the professionalization of nursing, beginning with the establishment in 1873 of three training schools in New York, New Haven, and Boston. The second was the advent of antiseptic surgery, first announced by Joseph Lister in 1867 but not generally adopted for another ten or fifteen years. Together with the growth of demand from the middle and upper classes because of urbanization and changes in family structure, these developments helped to produce a deep change in the character of hospitals as well as an increase in their number.*
Before the 1870s, trained nurses were virtually unknown in America. Hospital nursing was a menial occupation, taken up by women of the lower classes, some of whom were conscripted from the penitentiary or the almshouse. The movement for reform originated, not with doctors, but among upper-class women, who had taken on the role of guardians of a new hygienic order. In New York, the impetus came from women in the State Charities Aid Association, who in 1872 formed a committee to monitor the conduct of public hospitals and almshouses. They represented, in the association’s own humble words, “the best class of our citizens as regards enlightened views, wise benevolence, experience, wealth, influence, and social position.” At Bellevue, the women found patients and beds in “unspeakable” condition; the one nurse for a surgical ward slept in the bathroom, the hospital laundry had not had any soap for weeks, and at night no one attended the patients except the rats that roamed the floors. Though some doctors approved of the ladies’ desire to establish a nurses’ training school, which would attract the wholesome daughters of the middle class, other medical men were opposed. Plainly threatened by the prospect, they objected that educated nurses would not do as they were told—a remarkable comment on the status anxieties of nineteenth-century physicians. But the women reformers did not depend on the physicians’ approval. When resisted, as they were at Bellevue in efforts to install nurses on the maternity wards, they went over the heads of the doctors to men of their own class of greater power and authority.19 (Florence Nightingale, who had friends high in the English government, had followed exactly the same course in reforming her country’s military hospitals.) Professional nursing, in short, emerged neither from medical discoveries nor from a program of hospital reform initiated by physicians; outsiders saw the need first. Eventually, of course, physicians came not only to accept but to rely on trained nurses, who proved essential in carrying out the more complex work that hospitals were taking on. The new nurses’ training schools also provided a source of cheap labor in the form of unpaid student nurses, who became the mainstays of the hospital’s labor force. (Graduates went into private nursing, if they found work.) The three training schools of 1873 became 432 by 1900, and 1,129 by 1910.20
Like nursing, but even more so, surgery enjoyed a spectacular rise in prestige and accomplishment in the late 1800s. Before anesthesia, surgery was brutal work; physical strength and speed were at a premium, so important was it to get in and out of the body as fast as possible. After Morton’s demonstration of ether at the Massachusetts General Hospital in 1846, anesthesia came quickly into use, and slower and more careful operations became possible. But the range and volume of surgery remained extremely limited. Infections took a heavy toll in all “capital operations,” as major surgery was so justly called: The mortality rate for amputations was about 40 percent. Very rarely did the surgeon penetrate the major bodily cavities, and then only in desperation, when every other hope had been exhausted. Operations were so infrequent that a surgeon’s colleagues considered it a privilege to be brought along to help out even in the minor chores. Surgery had a small repertoire and it stood far behind medicine in the therapeutic arsenal.21
Change came slowly after Lister’s work on antisepsis was published in 1867 because his results were inherently difficult to reproduce. Many surgeons tested out his carbolic acid spray but found they were still plagued by fatal infections; carrying out antiseptic procedures demanded a strictness—an “antiseptic conscience” it would later be called—they could not at first appreciate. Lister’s method was not generally adopted until around 1880, soon after which it was superseded by aseptic techniques. (While antisepsis called for use of disinfectants during surgery to kill microorganisms, asepsis relied on sterile procedure to exclude them from the field of operation.) With control over infection, surgeons could begin to explore the abdomen, chest, and skull, but before they could do much good, a variety of new techniques had to be developed and mastered by the profession. It was not until the 1890s and early 1900s that surgery began to take off. Then, in a burst of creative excitement, the amount, scope, and daring of surgery enormously increased. Improvements in diagnostic tools, particularly the discovery of X-rays in 1895, spurred the advance. Surgeons began to operate earlier and more often for a variety of ills, many of them, like appendicitis, gallbladder disease, and stomach ulcers, previously considered medical rather than surgical cases. At the turn of the century, the main field of surgical invention was the abdomen. The Midwestern virtuosos William and Charles Mayo, who had done only 54 abdominal operations between 1889 and 1892, recorded 612 in 1900 and 2,157 five years later. A report by William Mayo on 105 gallbladder operations was rejected by a prominent medical journal in 1899 because the total was thought implausible; five years later the same journal reprinted an article by Mayo describing the results of a thousand such operations.22 In the early 1900s, surgery continued to expand, as thoracic surgery and surgery of the nervous and cardiovascular systems developed.
Growth in the volume of surgical work provided the basis for expansion and profit in hospital care. But first certain impediments to the use of hospitals had to be removed. Before 1900 the hospital had no special advantages over the home, and the infections that periodically swept through hospital wards made physicians cautious about sending patients there. Even after the danger of cross-infection had been reduced, the lingering image of the hospital as a house of death and its status as a charity interfered with its growth. Both patients and physicians had grounds to be wary of hospitals. Many people objected to losing the privacy and control that they might have had at home; as ward patients, the poor had no say in choosing their physicians. And though practitioners might have liked to refer more patients to hospitals, they were often afraid that doing so would mean losing the fee, and perhaps the case, because the staff might offer treatment without charge on the hospital ward. It took time to establish new understandings about professional fees and control over patients. So at first, ether and antisepsis were adapted to the home and “kitchen surgery” continued. But performing surgery in the home became steadily more inconvenient for both the surgeon and the family, as the procedures became more demanding and more people moved into apartments. And the busier surgeons became, the more costly was the time lost in traveling to the patient’s home. To accommodate desires for privacy and fears of the hospital, many surgeons first moved their operations to private “medical boarding houses,” which provided hotel services and nursing. In the suburbs and small towns, doctors built small hospitals of their own; surgery had now made hospital care profitable and permitted them to open institutions without upper-class sponsorship and legitimation. After 1900, as the old prejudice died out, most surgery moved inside hospitals.23
With greater pressure for admission, hospitals began to limit care to the more acute periods of illness, rather than the full course. Although from their beginnings American hospitals had concentrated their efforts on curable patients rather than chronic invalids, average stays had typically been long, as much as a month or more. At the Massachusetts General Hospital, the average stay for free patients dropped below four weeks for the first time in 1886; ten years later, the hospital began reporting the average length of hospitalization in days rather than weeks. At Boston City Hospital, the average stay fell from 27 days in 1870–71 to 17.8 days thirty years later. At the Bridgeport Hospital in Connecticut, it dropped from 32 to 13 days between 1900 and 1920. By 1923 general hospitals in America had an average length of stay of 12.5 days; a half century later they would average about seven.24
The growing emphasis on surgery and the relief of acute illness brought about a redefinition of purpose in some of the older charity hospitals. Active medical and surgical treatment supplanted religious and moralistic objectives and became the overriding mission. In New York City, a charitable society of wealthy ladies concerned about the “unobtrusive sufferings” of former slaves opened a Home for Worthy, Aged, Indigent Colored People in 1842. In 1882, “in view of the thoroughly organized medical department,” the home changed its name to the Colored Home and Hospital. It became the Lincoln Hospital and Home in 1902 as it opened its doors to white patients and local physicians, and simply Lincoln Hospital in 1925 when it was turned over to the city. From a paternalistic charity providing custodial care to poor and deserving blacks, it had turned into a general hospital providing acute care to the poor of all descriptions.25
A similar shift from moralistic to medical objectives took place at Children’s Hospital in Boston, whose evolution Morris Vogel has described. When the hospital was founded in 1869, its managers announced that “while endeavoring to cure, or, at least to alleviate” the diseases of poor children, they also desired “to bring them under the influence of order, purity and kindness.” The hospital was initially concerned not so much with medical intervention as with providing an alternative home for children who were neglected, a salubrious haven where they would be nursed, fed, kept clean and safe, and receive what its managers at one point referred to as a “positively Christian nurture.” So anxious were they to isolate the children from outside influences that they restricted visiting hours to one relative at a time, between eleven and twelve o’clock, weekdays only, thereby judiciously barring working parents from frequent contact with their children. Medical concerns became steadily more important during the 1870s, when an outpatient department was opened, but the real turn toward medical activism came the following decade, as orthopedic surgery advanced. In 1883 the number of surgical patients exceeded medical patients for the first time. Moral uplift disappeared from official statements of the hospital’s objectives as the treatment of disease and injury became the chief concern. And instead of treating the poor only, the institution began to admit children from all classes.26
As hospitals became more generally accepted, the social origins of their patients changed. We have no systematic socioeconomic data, although scattered statistics for particular hospitals suggest that by the early twentieth century the occupational distribution of their adult patients became more nearly like that of the population. Perhaps the clearest evidence of the shift came in the architecture of hospitals. The changing ratio between wards and private rooms reflected the changing social balance. Few class distinctions could be more sharply delineated. While ward patients were attended by the hospital staff, private patients were attended by doctors of their choice. Ward and private patients usually received two different kinds of food, and ward patients were often not permitted to see friends and relatives as frequently as were private patients. General hospitals built before 1880 consisted almost entirely of wards, with only a few private rooms. Large wards, as Florence Nightingale pointed out in her influential book Notes on Hospitals, permitted more efficient nursing: A single night nurse could attend forty patients in one ward but not four wards of ten patients each. Large wards, Nightingale also argued, improved ventilation, simplified discipline and reduced construction costs.27 But despite these advantages, by 1908 large wards had declined to only 28 percent of the beds in hospitals designed that year, while single rooms now accounted for nearly 40 percent. These trends continued over the next two decades.28 New intermediate accommodations, semi-private rooms, were built for the middle classes, who were widely believed to have been neglected by hospitals. Hospitals had gone from treating the poor for the sake of charity to treating the rich for the sake of revenue and only belatedly gave thought to the people in between.
As hospitals came to use more of their beds for surgery and the treatment of acute illness, they had less room for recuperating patients, who were discharged earlier, sometimes to newly built convalescent homes. As a result, the boundary between staff and patients in hospitals, once crossed by convalescents and the less seriously ill, now became more fixed. In the almshouse, the inmates had taken care of each other; the original rules of the Pennsylvania Hospital, as of many others, required patients to help in nursing, washing and ironing, and cleaning the rooms. But as general hospitals became more strictly devoted to acute illness, such functions were taken over completely by employees of the institutions. By 1907, in an essay on “The Social Function of the Hospital,” a writer could complain, “At no point in his hospital career is [a patient] looked upon as anything but a medical subject. He enters the hospital because he is sick, he is treated as a phenomenon of medicine and surgery, and is discharged ‘cured,’ ‘improved,’ or ‘no hospital case.’ His social status, one might say, is studiously ignored.” The patient’s role was being reduced to what Talcott Parsons would later call the “sick role”: not considered responsible for their infirmities, the sick are released from daily obligations; in exchange, they are obliged to submit to treatment and try to get well.29 These presumptions did not obtain in almshouses, where the sick were often seen as responsible for their infirmities, not released from all obligations, and not expected to get well. A complete dispensation from all duties came only in the fully bureaucratized hospital. This meant higher costs, since attendants had to be employed to do the work previously done by patients.
As the functions and standards of hospitals changed, construction and operating costs both increased. The typical hospital of 1870, S. S. Goldwater wrote in 1905, had cost about 15 cents per cubic foot and allowed, if liberal in its proportions, about 6,000 cubic feet per patient. It had only rudimentary heating and plumbing and usually was not fireproof. In 1905, such a hospital, Goldwater estimated, could be built at 20 cents per cubic foot, or $1,200 per bed. But because of new technological and legal requirements for hospitals, the prevailing costs per cubic foot actually ran about 40 cents and the number of cubic feet per patient had risen to 11,000. As a result, the cost per bed was now $4,400, not the $1,200 of the older standard.30 In addition, the greater emphasis on acute care intensified hospital work, requiring more employees and higher operating costs per patient. Hospital budgets soared beyond the capacity of charity to meet them.
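The arithmetic behind Goldwater’s estimates is simply the cost per cubic foot multiplied by the cubic footage allotted per patient; as a check on the figures in the passage:

\[
\$0.20 \times 6{,}000 = \$1{,}200 \text{ per bed,} \qquad \$0.40 \times 11{,}000 = \$4{,}400 \text{ per bed.}
\]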
Because of the higher costs it brought, the intensification of hospital care required charitable institutions to put their finances on a new foundation. A crisis in hospital finance in New York City in 1904 brought the problem to public notice, forcing the press, the makers of policy, and the institutions themselves to explore the available alternatives.31 The private hospitals could turn to the government for more aid, but the city was already facing increased costs for its own hospitals, and no one, in those days, proposed going hat in hand to Albany or Washington. They could also turn to the public for more voluntary contributions and organize a concerted fund drive, but this source, too, proved insufficient. A third response was to call for greater efficiency and stricter business methods in hospital management. The old charity hospitals had been managed on an almost informal basis. Now they had become large organizations and there was a demand for more careful accounting, more specialized labor, and better coordination of the various auxiliary hotel, restaurant, and laboratory services that a hospital maintained. The old rhetoric of charitable paternalism was superseded by a new vocabulary of scientific management and efficiency. While much of this may have been more talked about than acted upon, the ideological change was one further signal of the hospital’s transition from household to bureaucracy.32
The principal answer to the hospitals’ financial difficulties proved to be greater payments by patients. New conditions brought on the increase in costs, but they also enlarged the potential for income. Many people were now coming to hospitals who could afford to pay, and since the real value of hospital care had increased, charges would not drive them away. The hospitals were also encouraged to impose fees by physicians who objected to the free services being given patients who could afford to pay a doctor at home, but avoided all charges by going to a hospital. Between 1911 and 1921 in New York, ward paying patients increased from 18 to 45 percent of the total number and private patients rose from 20 to 24 percent, while charity cases declined. By the twenties, according to a survey sponsored by the New York Academy of Medicine, hospital finances in the city had become secure; two fifths of the hospitals were even reporting budget surpluses.33 For the United States as a whole in 1922, receipts from patients amounted to 65.2 percent of the income of general hospitals. Public appropriations accounted for 17.7 percent; endowment income, 3.6 percent; donations, 5.7 percent; and all other sources, 7.8 percent.34
Changes in organization and financing gradually altered the distribution of power and authority in hospitals. The trustees’ sphere of control diminished, while the physicians’ sphere expanded. The shift was apparent in control over admissions. Originally, at voluntary hospitals the trustees as well as the doctors took part in deciding which of the deserving poor to accept, but as hospitals became more strictly medical institutions, the trustees’ role in admitting declined. In 1875, as part of a continuing conflict, five members of the medical staff at Presbyterian Hospital in New York resigned in protest against the trustees’ power of approval in admissions. The Boston City Hospital in 1897 dropped the provision that trustees might admit patients. Elsewhere the power of trustees and donors to nominate patients to free beds was quietly forgotten.35
The devolution of decision-making power to physicians reflected the more general change (to which I have already alluded) in the structure of organizations in the late nineteenth century—the growing importance of a salaried management in corporations, of administrators and professors in universities, of salaried editors and professional reporters on newspapers, of civil servants in government. In hospitals, the trustees could no longer enter into the details of management; the more common pattern was for the executive of the hospital to resolve all ordinary questions and to turn to the board only at intervals on major matters of policy.36 Unlike corporations, however, hospitals saw authority devolve more upon outside professionals, the medical staff, than upon their own salaried management. This peculiarity of organization arose because of the special role that outside doctors came to play in the prosperity of the institution: They had replaced the trustees as the chief source of income. When hospitals relied on donations, the trustees were vital. But as hospitals came to rely on receipts from patients, the doctors who brought in the patients inevitably became more important to the organization’s success.
The Triumph of the Professional Community
The growing importance of hospitals to medical practice posed a severe problem for most members of the profession. While the few physicians who held hospital appointments were gaining a more decisive role, most practitioners were cut off from access to hospitals. In 1873, when the first national survey of hospitals was undertaken, the total number of visiting physicians was estimated at 580; the data were doubtless incomplete.37 If, however, there were twice as many, the proportion of American physicians with hospital privileges would still have come to only about 2 percent. In the 1870s, this narrow monopoly was of relatively small consequence to most doctors since the few hospitals then in existence were used almost entirely by the poor. But as late as 1907, after hospitals had grown enormously in number and importance, a physician surveying his colleagues in the Bronx and Manhattan found that only about 10 percent held hospital positions. “The rest,” he wrote, “are entirely excluded without rhyme or reason from hospital practice, and cannot enjoy even a share of the benefits derived from such a connection.” Exclusion now “seriously handicapped” a physician. Moreover, it was unfair to patients to deny them the choice of their own family doctor when they entered a hospital. “On the one hand we have a public educated to avail itself of the facilities of a hospital in severe illness, and on the other hand a cast-iron regulation which closes the doors of the hospital to the majority of practitioners. This ‘system’ has made such striking inroads on the earning capacity of physicians in cities where it flourishes as to entail enormous pecuniary losses.”38
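As a rough check on the 2 percent figure (assuming a national total on the order of 60,000 physicians, roughly the count reported in the 1870 census; the passage itself does not state the denominator):

\[
\frac{2 \times 580}{60{,}000} \approx 0.019 \approx 2\ \text{percent.}
\]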
While patterns of organization varied, the medical staff of late nineteenth-century hospitals was typically arranged in four distinct groups: a consulting staff, composed of older and distinguished physicians, who had no regular duties; a visiting or attending staff, made up of the active physicians who supervised treatment; a resident or house staff of young doctors in training who carried out the details of treatment; and a dispensary staff that saw outpatients. Of these groups, the visiting physicians were the most important. They generally served for rotating periods of three or four months a year, a system that reduced the burden on each physician, while allowing, as one surgeon pointed out in 1885, “a much larger number of medical men to derive whatever advantage there may be from the name of being connected with the institution.”39
Hospitals paid none of these doctors for their work. The house physicians gave their services for a year or eighteen months in exchange for room, board, and experience; the dispensary staff gave theirs in the hope of obtaining appointments as visiting physicians and to make themselves known to patients, who might then come to their private offices. The visiting staff provided its labor in return for access to surgical facilities, opportunities to specialize, prestige, the use of capital that the community invested in hospitals, and regular contacts with colleagues, which might open the way to referrals, consultations, and professional recognition.40 During the period between 1870 and 1910, hospital appointments became more valuable as hospitals became indispensable for surgical practice and specialization advanced.
But while their economic value increased, hospital appointments remained concentrated among a small professional elite. Among general practitioners resentment of hospitals was widespread. In an editorial in 1894, the Medical Record noted that most doctors looked upon the growth of hospitals “critically, not to say coldly.” They resented the “arbitrary” treatment they received at the hands of hospital managers who took advantage of their desire for hospital affiliation “to get as much out of them with as little return as possible.” The hospitals were killing private surgical work. Even well-to-do patients might enter hospitals, “paying perhaps nothing,” because the hospital rules permitted no private fees to be taken. “The spread of the hospital is thus tending to throw a larger amount of medical work every year into the hands of corporations . . . [making] the few skilful, the many unskilful and dependent.”41
The widely resented rule forbidding physicians to take fees from private patients, which had been established at the older voluntary hospitals, began to die out at the turn of the century. In 1880, according to Henry Burdett, no American hospitals permitted fees. But by 1905 a writer in the Boston Medical and Surgical Journal could report that of 52 hospitals surveyed in New England, only five, among them the Massachusetts General Hospital, continued to deny physicians the right to charge for services to private patients. In general, hospitals were now permitting physicians to charge patients in private rooms, but still barred fees for ward patients. Increasingly, they were also allowing physicians not on their staff to treat paying patients in unused private rooms. But ambiguities and difficulties persisted; in 1904 a hospital journal reported that whether patients in private rooms had to compensate physicians for their services was still “not clearly defined.” In a typical situation, “a doctor who is not on the staff sends a patient to the hospital, and, perhaps before he has made his first visit the hospital authorities have intimated to the patient that if she will accept the services of a member of the staff she can have such without charge. It makes no difference how well able the patient may be to pay, the staff physician makes no charge, and the physician who had previously been in charge loses the patient and the fee which would have been his had the patient been treated at home.”42
Private practitioners protested vehemently against this sort of “patient-stealing” by the hospital staff, insisting that hospital authorities abide by the profession’s code of ethics and guard their proprietary interests. They also wanted adherence to the professional vow of silence and noninterference. Without such cooperation, patients might hear disparaging remarks about their doctor’s ability, or members of the staff might revise the diagnosis and plan of treatment. In sending patients to hospitals, private doctors risked not only losing the fee, but possibly discrediting the image of competence they were trying to maintain. From their viewpoint, unless the staff cooperated, the management of impressions was much more vulnerable in the hospital than in the home.
Physicians started asking why they did not completely control hospitals. “Is it not about time the professional mind began to dominate in the control of these institutions?” asked a physician in the Journal of the American Medical Association in 1902. “Fairly estimated, do not our services justly entitle us to a voice in all professional questions in and out of the hospital, second to none, even to that of those benevolent individuals, charitable organizations or religious societies that founded these institutions?” Bayard Holmes, a prominent Chicago physician and socialist, also writing in the AMA’s Journal, formulated “the hospital problem” for his colleagues in the following terms:
When the industrial revolution of the seventeenth century began it found Europe peopled with independent tradesmen. . . . Now we find the homeless, tool-less dependent machine operators far removed from the people who furnish a market for the standardized product of their toil. The hospital is essentially part of the armamentarium of medicine. . . . If we wish to escape the thralldom of commercialism, if we wish to avoid the fate of the tool-less wage worker, we must control the hospital.43
Oddly enough, proprietary hospitals were one of the main ways of resisting corporate domination and establishing professional control. Some small private hospitals were built by individual surgeons for their own cases; others were joint ventures. To supply enough patients to make the hospitals profitable, competing doctors often had to combine their efforts. “No other profession,” wrote the leader of a group of eight physicians who incorporated a hospital in a town in upstate New York, “has had such cruel jealousies and such costly strifes. These differences are being abandoned and must be to make the . . . hospital . . . a success.” The creation of doctor-controlled hospitals was easiest in small towns throughout the country and in the cities of the West, where trustee-dominated institutions had never been founded. In the early years of the century, more proprietary than charitable hospitals were being built. They were opened mainly by physicians who had no hospital privileges elsewhere, or who had positions but felt the hospitals were not providing adequate accommodations for their private patients. The increased competition from these new enterprises catering to the middle and upper classes forced the older voluntary hospitals to make adjustments because of the threatened loss of clients and revenue. The private hospital, a writer noted in 1903, had “taught the larger hospital that it must open its doors to all reputable physicians.”44
By 1907 there was a movement—“none too strong, perhaps,” commented the editor of the National Hospital Record, “but enough to show in which direction the current is moving”—to open up hospitals to doctors not formerly on their staffs. “Experience has proved conclusively that ‘the open door’ to the hospital is a benefit, not only to the rank and file of doctors, but to the hospital. It pays in dollars and cents.” Not everyone was convinced; a few voices even urged movement in the opposite direction. A number of critics had long maintained that American hospitals were, if anything, too loose for the good of their patients and their own budgets. European hospitals, conducted by a small, permanent medical staff, stood as an example of more disciplined and economical organization. In American hospitals, observed Arpad Gerster, with a rotating staff of visiting physicians and changes every year in the house staff and student nurses, “you can only wonder that chaos and waste are no greater than they actually are.” Since the services of the visiting staff were gratuitous, there was no way to regulate their hours or make sure they gave each patient adequate attention. “Medical men who complain that the hospitals are not converted into a free and general stamping ground for every one having a doctor’s diploma,” Gerster declared, “will naturally be disgusted by further restricting the number of those who will have charge of hospital facilities. They must be shown, however, that the hospitals do not exist mainly for the indiscriminate benefit of the medical profession, but are here, first, for the benefit of the patients, and secondly for that of the community. Restriction of the number of those who attend at our hospitals is the condition sine qua non of economic reform.”45
General practitioners naturally saw closed staffing as a way to maintain privilege rather than quality. Physicians excluded from city hospitals in Louisville and Cincinnati petitioned against the “unjust and undemocratic” control of the institutions by a “ring” of monopolists; in New York, a number of them organized “physicians’ economic leagues” to fight on their behalf. “We all know, only too well, the great scramble for hospital association,” a doctor told a meeting of one such league in 1915. “A physician not in the coterie of a hospital staff pulls every wire to get one and not succeeding, starts another coterie to establish another hospital. A crooked politician would blush with shame to be seen in the company of some of our physicians did he know to what extent of knavery they have gone to get on a hospital staff.” Partisans of the excluded noted that hospitals served to educate doctors and advocated extending privileges to all members of the profession on the grounds that those isolated from hospitals could not keep up with new advances.46
The decisive consideration proved to be financial. Voluntary hospitals had multiplied in great numbers, and many had fallen seriously into debt. As an industry trade journal explained, hospitals would fail without support from local physicians. “If favorably disposed toward the hospital, the physician can very frequently recommend that a patient be transferred to the hospital even where the distinct need of this transfer does not exist.” A 1909 guide to hospital administration noted, “The income from private patients depends largely upon the medical staff.” If the staff had “large and profitable” practices, “then a sufficient amount of money can easily be realized to defray the entire running expenses of the institution, supplying the care not only for the private patients, but also for the charity inmates.”47
With that hope in mind, hospital boards expanded the number of positions for doctors who could serve as “feeders” to fill their beds. In Brooklyn, New York, according to a study of physician directories by David Rosner, the big change came between 1900 and 1910, when the proportion of hospital-affiliated practitioners rose from 15.6 to 42.3 percent. Many of Brooklyn’s hospitals were in financial trouble because of rising costs and opened their doors to new physicians to increase their revenues. In New York City, other studies indicate, the proportion of hospital-affiliated physicians climbed from 36.8 to 52 percent between 1921 and 1927. Moreover, no hospitals, except for research institutions, were totally “closed,” since they all generally had a courtesy staff with the privilege of attending patients in private rooms. On the other hand, no hospitals were totally “open” either, since even hospitals with large courtesy staffs limited their access to the charity wards. Nationally, almost two thirds of physicians in 1928 held staff appointments—90,903 out of about 150,000 doctors. By 1933 the number of affiliated physicians climbed to 126,261, leaving one doctor in six without any privileges.48
While doctors’ access to hospitals expanded, professional associations sought ways to tighten the medical organization of hospitals. In 1919, as part of a campaign to assure minimum standards for hospital care, the recently established American College of Surgeons adopted a requirement that hospitals wishing to receive its approval organize their affiliated physicians into a “definite medical staff.” The staff could be “open” or “closed,” with as many “active,” “associate,” and “courtesy” members as desired, so long as they were restricted to competent and reputable physicians, engaged in no fee splitting, abided by formal bylaws, and held monthly meetings and reviews of clinical experiences. Also in 1919, the AMA’s Council on Medical Education set minimum standards for hospital internships, the next year changing its name to the Council on Medical Education and Hospitals. Though compliance with these bodies’ standards was voluntary, they pushed hospitals toward a more formally structured, hierarchical organization.49
Even if more doctors gained entry to a hospital in their community, they did not necessarily gain access on the same footing as other physicians or to hospitals of equivalent status and quality. In Cleveland, according to a study published in 1920, 25 percent of the medical profession held control of 80 percent of the hospital beds. Blacks and foreign-born doctors, particularly Italians and Slavs, were almost completely unrepresented on hospital staffs. These kinds of inequalities persisted. When doctors from lower-status ethnic backgrounds obtained positions, they did so at the lower levels of the system. Studying the informal organization of medical practice in Providence, Rhode Island, about 1940, Oswald Hall found that appointment decisions depended largely on nontechnical considerations, such as personality and social background. “In the earlier days,” a hospital administrator told Hall in regard to the selection of interns, “we had competitive examinations, but we had to discontinue those. The person who did best on an examination might not show up well in the intern situation. He might lack tact; he might not show presence of mind in crises; or he might not be able to take orders. And more than likely the persons who did best on the written examinations would be Jewish.”50
The continued dependence of practitioners on hospitals throughout their careers made them dependent on what Hall identified as the “inner fraternity” of the profession. “The freelance practitioner,” he wrote, “has gradually been supplanted by one whose career depends on his relationship with a network of institutions.” Access to favored positions in that network came through “sponsorship” by a community’s established physicians, who could advance or exclude aspirants at various stages of their careers by influencing professional school admissions, dispensing hospital appointments, referring patients, and designating protégés and successors. Because the hospital was essential to successful practice, its various grades could be used as delicately calibrated rewards to signal the progress of a career.51 Although opening up hospitals to more doctors weakened the elite’s traditional monopoly over hospitals, it brought the profession as a whole under greater control.
“Paradoxically,” writes William Glaser, “the integration of private and hospital practices in America produces a more diffuse staff structure inside the hospital and a more orderly structure in the community of private practitioners. Since the majority of doctors in most countries practice outside the public and voluntary hospitals, rank in these institutions cannot be used to arrange a hierarchy in the medical profession generally. Granting or withdrawing hospitalization privileges cannot be used to regulate professional and personal behavior; in fact, this use of hospitalization privileges makes America one of the few countries with any controls over the quality of private practice.”52
It is unclear whether the use of this power in the early twentieth century did raise the quality of private practice in America. But there can be no doubt it was used to exclude doctors unacceptable to the organized profession. By the twenties, membership in the local medical society had become an informal prerequisite for membership on the staff of most local hospitals. In 1934 the AMA tried to institutionalize its control over hospital appointments by requiring all hospitals accredited for internship training to appoint no one to their staff except members of the local medical society. Black doctors, who were excluded from the local societies, could be kept out of hospital positions on those grounds.53 So could anyone else who threatened to rock the boat. The private practitioners, who had first seen hospitals as a threat to their position, had succeeded in converting them into an instrument of professional power.
THE PATTERN OF THE HOSPITAL SYSTEM
Class, Politics, and Ethnicity
The rapid rise in the reported number of hospitals from 178 in 1872 to more than 4,000 in 1910 stemmed only in part from the growth of hospitalization. After all, more hospital beds might have been accommodated in fewer institutions by increasing their average size. Mental hospitals in America developed in this way, enlarging their capacity rather than feverishly multiplying in number. By 1920, when there were some 4,013 general hospitals with an average size, in beds, of 78, there were 521 mental hospitals with an average size of 567.54 The contrast between the two kinds of hospitals developed because they had quite different functions. General hospitals became a necessary local adjunct of medical practice, while mental institutions did not. Physicians who were excluded from the staff of existing general hospitals formed new ones; doctors in small towns opened hospitals to prevent their big-city colleagues from drawing away their patients. No similar incentives promoted the establishment of mental institutions. While communities wanted to have general hospitals readily accessible, they were quite prepared to have the mentally ill removed to a distance. Small general hospitals also multiplied because many of them were sponsored by competing religious groups, while the more burdensome and unremunerative long-term institutional care of the mentally ill fell almost entirely to the states, which centralized facilities to save money.
The hospital system in America—leaving mental institutions aside—emerged in a series of three more or less coherent phases. The first of these, running roughly for a century after 1751, saw the formation of two kinds of institutions: voluntary hospitals, operated by charitable lay boards, ostensibly nondenominational but in fact Protestant; and public hospitals, descended from almshouses and operated by municipalities, by counties, and, in the case of merchant marine hospitals, by the federal government.
In the second phase, beginning about 1850, a variety of more “particularistic” hospitals were also formed. These were primarily religious or ethnic institutions and specialized hospitals for certain diseases or categories of patients, such as children and women. Hospitals were also opened by medical sects, mainly homeopaths.
The third phase of development, running from about 1890 to 1920, saw the advent and spread of profit-making hospitals, which were operated by physicians, singly or in partnership, and by corporations.
This pattern of development was not accidental. The formation of denominational hospitals after 1850 reflected the arrival of large numbers of Catholic immigrants; the growth of proprietary hospitals after 1890 reflected the new potential for profit due to the progress of surgery. An internal dialectic was also at work. Once general hospitals had been established, physicians interested in creating institutions appealed for funds and patients on more particularistic grounds—ethnic affiliations, special categories of disease, sectarian medical ideas. Like the proprietary hospitals, these institutions were established in response to the changing structure of opportunities.
This sequence of development unfolded in major cities with variations and exceptions, depending on the time a community was formed, its size, the ethnic makeup of its inhabitants, and its economic development. In cities emerging after 1850, the first and second phases were superimposed. While municipal and nonsectarian voluntary hospitals generally preceded denominational institutions in the older cities of the East, they emerged simultaneously in the Midwest; in some Midwestern cities, Catholic hospitals were actually built first. In those areas of the country that built their institutions last—the Far West and the South, where the growth of hospitals had been stunted by the economic aftereffects of the Civil War—the profit-making sector took on more importance than elsewhere. By the early 1900s, in comparison with national averages, the Eastern states showed more nondenominational voluntary hospitals, the Middle West a disproportionate number of church hospitals, and the South and West an excess of proprietary hospitals.* These regional variations reflected the successive development of the regions and their associated economic differences. Because the Eastern cities grew up first, they had an edge as centers of banking and commerce. The greater accumulation of capital there aided the creation of the early voluntary hospitals, as well as private colleges, museums, and other nonprofit institutions. Because the South and the West had less private capital available for philanthropy, they relied more on the profit-making sector in hospital care and on the state in higher education.
Despite these regional variations, metropolitan hospital systems across the country fell into a fairly standard pattern. At their core were the largest institutions, the elite voluntary and the municipal hospitals. The ethnic, religious, and special hospitals were somewhat smaller and less central (both functionally and geographically), while the proprietary and medical sectarian institutions were typically the smallest and furthest on the fringe of the system. Each group of hospitals had its characteristic functions, organizational structures, patients, and methods of finance.
The elite voluntary hospitals concentrated on acute care; they had relatively closed medical staffs and the closest ties to university medical schools. Their patients were the very poor (for teaching purposes) and the very rich (for revenue and, one hoped, bequests). They had the largest endowments, enjoyed the most prestige as centers for medical training and treatment, and were generally old and stable.
The municipal and county hospitals, usually the largest local institutions in number of beds, cared for the full range of acute and chronic illness. The organization of their medical staffs varied by region—the further west the city, the more likely its hospitals were to be open. Public hospitals generally treated the poor, relied on government appropriations rather than fees, and were plagued periodically by scandals over graft and neglect. Some were important teaching institutions.
The religious and ethnic hospitals were a mixed and intermediate group. In size, they were on the average smaller than the elite voluntary or municipal hospitals, but larger than the profit-making establishments. They rarely had large endowments and consequently relied on fees from patients, who were predominantly from the working and middle classes. Most treated short-term illness. Compared with elite voluntary hospitals, their medical staffs were more open and they had less frequent and less close ties to medical schools.
The profit-making hospitals were mainly surgical centers; they were usually small and had no ties to medical schools. They relied on fees exclusively, and their patients were from the middle and upper classes. Their rate of institutional survival was the lowest. In this regard they were typical of small businesses; they opened and closed with the vicissitudes of personal fortune.
The hospital system had no design since it was never planned, but it had a pattern because it reflected a definite system of class relations. The elite voluntary hospitals brought together the top and bottom of the society under one roof because their physicians simultaneously wanted to have poor patients for teaching and to save time by treating their wealthy patients in the same location. The mix of social classes was also thought to have some educational value. One hospital director candidly explained that in their training doctors and nurses tended to deal with ward patients “so much as cases, and not as persons,” while “the personalities of the patients and the friends come in very largely in the care of patients in private rooms.”56 When the superintendents of several major hospitals in New York were asked in a 1904 survey whether hospitals ought to be divided into two classes—private hospitals for people who could pay and public hospitals for people who could not—they unanimously rejected the idea. If all poor patients were cared for by municipal hospitals, charitable donations would dry up, requiring higher rates in private rooms.57
Thus the split between public and private hospitals did not become a straightforward class boundary. Both kinds of hospitals treated poor patients, but they treated them in different ways. “[P]ublic hospitals,” wrote S. S. Goldwater in 1906, “are conducted at a low rate of expenditure, which implies a low grade of efficiency; hospitals supported by voluntary contributions, on the other hand, aim at a higher grade of service and are unashamed of expense accounts relatively vast.” New York City was a prime example. “Here, on the one hand, are the public hospitals, Bellevue, City, Metropolitan and Kings County, conducted at an average expense of $1.00 per capita per day or less; and on the other hand a large number of private institutions of the highest grade, supported mainly by the gifts of the benevolent and conducted at a daily per capita cost which approximates $2. Throughout the country, in Philadelphia, Cincinnati, St. Paul, Milwaukee, Chicago, St. Louis, San Francisco, New Orleans, etc., contrasts of this sort are found. . . .”58
The relation between public and private hospitals had been foreshadowed by the complementary roles of the almshouses and early voluntary hospitals. While voluntary hospitals admitted poor patients, the public institutions received the less desirable poor, the overflow of mostly chronic cases. Other state welfare institutions, such as mental hospitals and homes for the deaf, the blind, and the retarded, likewise provided long-term care of the poor at low average daily expenditures per person. The government accepted responsibility for the residual problem cases other institutions would not take.
In addition to operating their own hospitals, most state and local governments gave subsidies to private hospitals for their charitable services. In 1904 one quarter of all public funds spent for hospital care supported private institutions.59 Such assistance, however, promoted a pattern of uneven development in medical services. In the District of Columbia, the secretary of the Board of Charities noted in 1906 that government subsidies had created “too many comparatively small hospitals for acute medical and surgical services, and . . . utterly failed, thus far, to provide the necessary accommodations for chronic, convalescent, tubercular, inebriate, and generally undesirable cases.” Only one hospital was under the city’s direct control. “The result is that this hospital is constantly overcrowded with general chronic cases, which are not desired and which will not be received by institutions not under the immediate control of the city.”60 This pattern became a standard feature of American medicine—a highly developed private sector for acute treatment and an underdeveloped public sector for chronic care. Private hospitals for acute illness would be running well below capacity, while overcrowded public institutions were teeming with the victims of tuberculosis, alcoholism, mental disorder, and other diseases of social disorganization.
The public and private hospitals also functioned as alternative systems of patronage and sponsorship. At elite private hospitals, as we have already seen, wealthy patrons sponsored the admission of patients to free beds, and staff appointments went to physicians from established families, while Catholics and Jews were passed over. Correspondingly, public officials used municipal hospitals to dispense jobs and contracts and secure the timely admission of their friends and constituents as patients. Such intervention was roundly criticized by physicians and upper-class reformers, who demanded that these and other municipal institutions be run on a strictly disinterested basis. But as many people have argued, the urban political machines, while frequently corrupt, were also more responsive to pressures from lower-status groups. In Boston, Brahmin families dominated the medical staffs of the private hospitals, but after Boston City Hospital was opened in 1864, Catholic and Jewish doctors were able to get staff appointments there through the intervention of their representatives.61
Discrimination was a principal reason for the formation of separate religious and ethnic hospitals. Except against blacks, outright prejudice was rare, though the Massachusetts General Hospital initially refused to admit Irish patients on the grounds that their presence would deter other people from entering the hospital. The early moralistic aims of hospitals gave religious minorities reasons for anxiety. Catholics were afraid they might not be given last rites, and Jews feared they would have to eat nonkosher food and face ridicule for their appearance and rituals. Both religious communities worried that efforts might be made to convert some of their members in moments of personal crisis. Entering a hospital necessarily involved encounters with strangers at times of weakness and vulnerability, but the encounters might be less threatening if the hospital authorities and staff were of the same faith or, even better, of the same ethnic background. For even within religious groups, there were sharp differences, as a Russian Jew in New York in 1894 found out when visiting Mt. Sinai Hospital and other “uptown” institutions controlled by the then dominant German Jews:
In the philanthropic institutions of our aristocratic German Jews you see beautiful offices, desks, all decorated, but strict and angry faces. Every poor man is questioned like a criminal, is looked down upon; every unfortunate suffers self-degradation and shivers like a leaf, just as if he were standing before a Russian official. When the same Russian Jew is in an institution of Russian Jews, no matter how poor and small the building, it will seem to him big and comfortable. He feels at home among his own brethren who speak his tongue, understand his thoughts and feel his heart.
From the other side of the encounter with immigrant Jewish patients comes a confession of the gentile doctor’s prejudice by Richard Cabot, a medical professor at Harvard and physician at the Massachusetts General Hospital:
[T]he chances are ten to one that I shall look out of my eyes and see, not Abraham Cohen, but a Jew . . . I do not see this man at all. I merge him in the hazy background of the average Jew. But if I am a little less blind than usual today . . . I may notice something in the way his hand lies on his knee, something that is queer, unexpected. That hand . . . it’s a muscular hand, it’s a prehensile hand; and whoever saw a Salem Street Jew with a muscular hand before . . . I saw him. Yet he was no more real than the thousands of others whom I had seen and forgotten, forgotten—because I never saw them, but only their ghostly outline, their generic type, the racial background out of which they emerged.62 (emphases in original)
Besides providing a haven from prejudice for the sick, the ethnic and religious hospitals also offered material advantages to the sponsoring communities and their physicians. They furnished opportunities for internships and residencies that Jewish, Catholic, and black doctors were denied elsewhere, as well as staff appointments that allowed these physicians to attend their own patients needing hospitalization. As Oswald Hall discovered, the most important dividing lines among hospitals were ethnic and religious, not technical. The ethnic and religious hospitals were part of a chain of institutions that served doctors in each group at successive stages of their careers. While the upper-class Yankee would go to an expensive undergraduate college, elite medical school, and prestigious hospital for his internship, the young Italian doctor would almost certainly find those gateways blocked. “However,” Hall noted, “there are other chains of institutions (in this case Catholic) which provide an alternative route, and not only open a road to a medical career for him, but also shelter him in some degree from the competition of those [with more] advantages . . . .”63
It might seem, from the role they played in relieving discrimination, that the denominational hospitals would have attracted discrete groups of patients. But this was not so. The hospitals illustrate the tendency in America first to assert and then to submerge religious differences. While specific groups sponsored hospitals, they took pride in serving patients of all faiths—though not all races—without prejudice. The clientele of a Protestant hospital might well include more Catholics than any other group. Jews’ Hospital in New York originally accepted gentiles only in cases of accident or emergency, but soon changed its name to Mt. Sinai to signify that it served the community at large. Catholic hospitals were not only open to the general community, but in some places took responsibility for public hospital service. In Rochester, Minnesota, the Mayo brothers came to rely exclusively on a Catholic hospital, St. Mary’s, even though neither they nor the majority of their patients were Catholic.
Denominational hospitals exemplified a broader pattern in American society. In some countries, where cultural divisions run much deeper than in the United States, the various groups create separate institutions to meet a broad range of social needs. The Dutch call this phenomenon verzuiling, or “pillarization,” evoking the image of independent pillars supporting a common roof. “Each denominational bloc,” writes Johan Goudsblom about the Netherlands, “has set up a whole array of organizations encompassing practically every sphere of social life. Schools and universities, radio and television corporations, trade unions, health and welfare agencies, sports associations, and so on, all fit into the zuilen system.”64 This pattern of “segmented integration” has developed only partially in America. Protestants, as by far the largest group, have generally felt little need to define their institutions on religious lines; the denominations that do build their own schools and hospitals tend to be those that see themselves as deeply at odds with the dominant culture. Among the major religious groups, only Catholics have organized an elaborate network of separate institutions—schools, colleges, hospitals, community associations. Blacks, too, have created separate institutions, at least in the South, but perhaps more out of necessity than desire. Jews have been more eager to join the common institutions of the society than to build their own. In education, for example, Jews have generally preferred to remain within the established system at all levels (the first Jewish university, Brandeis, was formed only after the Second World War).65 Jews made an exception of hospitals—every Jewish community of any size built its own hospital, often much larger than the community required—possibly because of the special place that medicine occupied in Jewish aspirations. Medicine was thought the ideal career for Jews because of the professional autonomy of private practice, which made it possible to escape most institutional antisemitism. But because of the discrimination in hospitals, special Jewish institutions had to be established to supply positions as house, attending, and consulting physicians. Nevertheless, in the long run, the assimilationist pattern prevailed. Many of the Jewish hospitals later became major teaching and research institutions and fell into the orbit of medical schools. In a sense, the assimilation and upward mobility of Jewish hospitals paralleled the larger experience of the Jewish community in America.
Cultural heterogeneity has been one of the chief factors inhibiting consolidation of hospitals in a state-run system. Ethnic and religious groups have wanted to protect their own separate interests. For the upper-class Protestants, voluntarism offered a way to exercise direct control without the mediation of state and local governments, which immigrant groups began to influence in the later nineteenth century. For the ethnic communities of lower status, private sponsorship offered a defense against discrimination. In culturally homogeneous societies, the administration of hospitals seems to gravitate sooner or later to the state. In a cross-national study of hospitals, William Glaser found that in all countries with one prevalent religion, hospitals were run by the government. Even where hospitals originated as religious organizations, the church had found the expense of running them too irksome and had chosen to use its resources for activities that more directly affected religious observance and belief. But where competition existed among religious groups, they retained control of hospitals to protect and extend their sphere of influence. As a general proposition, Glaser suggested that the greater the number of religions in a society, the more diffused the ownership and management of hospitals and the smaller their average size.66
That there were too many small hospitals in America was a complaint already being heard soon after 1900, and it became a steady part of criticism of the hospital system. “If many hospitals in each city could pool their interests,” wrote a hospital superintendent in 1911, “the result would be greater efficiency and greater economy—and yet nothing is more unlikely than that independent, privately controlled hospitals will pool interests.” Especially after the Depression began in 1929, private hospitals faced serious underutilization. A medical school professor in 1937, noting the large number of hospitals in debt running at 50 percent capacity, suggested that their financial troubles could be alleviated if some hospitals closed, raising the occupancy rates of the rest to 75 or 80 percent. “The trouble, of course, is that the hospitals are sectarian, or partially endowed, or are run for the individual benefit of some surgeon or staff.”67
While corporations at the end of the nineteenth century became multi-unit operations, hospitals remained at an earlier stage of industrial development because of the parochial interests that sustained them. Despite the possible advantages of integrated organization, no such integration was achieved. The early efforts to reform hospitals mounted by the American College of Surgeons had the goal of “standardization”: the imposition of minimum requirements for medical record keeping, the performance of autopsies, and various other aspects of hospital organization. Hospitals participated in such voluntary efforts partly to preempt demands for more thorough government regulation. Emulating one another, hospitals became more standardized than might have been desirable, offering the same services regardless of the overall needs of their communities. They came to present the familiar American paradox of a system of very great uniformity and very little coordination. The absence of integrated management made it incumbent upon individual hospitals to develop a more elaborate administration than hospitals in other countries where administrative functions are more centralized. In America each voluntary hospital had to raise its own funds for capital expenditures, set its own fees, do its own purchasing, recruit staff, determine patients’ ability to pay, collect bills, and conduct public relations efforts. All these activities required staff, money, and space. At the same time, the American system of attending physicians also created demands for more administration. The stable medical staff typical of foreign hospitals can resolve many problems through face-to-face discussions. But in the United States, large numbers of practitioners circulate through the hospital at different times, delegating tasks to its employees and requiring more coordination to make things run smoothly. Various internal responsibilities that in foreign hospitals are controlled by powerful chiefs of service fall to administrators in American hospitals. Abroad, because of greater centralization of functions in the society and greater decentralization within the hospital, administrators have been weak in authority and low in status. In America, however, hospital administration became more important and prestigious because there was little centralization of functions in the society and much within the hospital.68
So, paradoxically, as a result of the independence of both hospitals and doctors from higher bureaucratic authority, hospital administration became professionalized more rapidly in America than it did elsewhere. In Europe hospital administrators generally had no professional degrees and were clearly subordinate in status and authority to the leading clinicians. But in America physicians themselves were attracted to hospital administration, and university degree programs in hospital administration began in the 1920s. In 1899 the administrators had founded an Association of Hospital Superintendents, which in 1908 changed its name to the American Hospital Association; in 1933 the American College of Hospital Administrators was formed.
Medical domination of hospitals began to weaken in the thirties and forties, as challenges from administrators to the authority of physicians became more common. Much of the mid-twentieth century American sociological literature on hospitals reflects this development, emphasizing the split between “two lines of authority,” the clinical and administrative, a much more salient issue in American hospitals because of the somewhat stronger position of the administration. The two groups held two different conceptions of the hospital. The private physicians continued to regard hospitals as “doctors’ workshops,” that is, as auxiliaries to their office practices, while the administrators tended to see them as “medical centers,” serving the community as the main coordinators of health services. They frequently divided over administrators’ efforts to expand outpatient care, increase medical research and education, hire full-time physicians in specialized services, and add administrative personnel to run those various activities.69
Authority in American hospitals, Charles Perrow argues, has passed successively from the trustees to the physicians and finally to the administrators, a development he explains as resulting from the changing technology and needs of hospitals. The domination of the trustees was rooted in the need for capital investment and community acceptance. Doctors then assumed control because of the increasing complexity and importance of their skills. Finally, there has been a trend toward administrative domination because of the complexity of internal organization and relations with outside agencies.70 This argument suggests virtually an immanent process of change in organizations, depending entirely on their functional needs. Yet as we have seen, the changing structure of authority was related to specific historical conditions. The growing power of physicians at the turn of the century rested in large part on their new ability to bring in revenue because of the increasing use of hospitals by paying patients; the rising influence of hospital administrators depended in part on the resistance to both centralized coordination of the hospital system and full-time responsibilities for physicians practicing in hospitals. These were the results, not of functional necessities, but of a particular configuration of interests.
While the general trend in the twentieth century has been toward more administrative control and more structure in the organization of hospitals, they remain loosely coordinated, as does the system as a whole. Within the hospital, there continue to be three separate centers of authority—trustees, physicians, and administrators—posing a great puzzle to students of formal organizations. Sociologists have wanted to know why the hospital departs from the standard model of a bureaucracy in lacking a single, clear line of hierarchical authority. Economists have wanted to know what the hospital maximizes if it does not maximize profit. From the viewpoint of each discipline’s paradigm, the hospital has been an anomaly. It seems much less so historically. Hospitals began as caretaking charities under the sponsorship of wealthy patrons. Their reconstitution as centers of active medical treatment made private practitioners anxious to gain access to their precincts. The practitioners were able to gain access in America because of the financial needs of voluntary hospitals that could not adequately draw on taxes as a source of revenue. The interests of private practitioners, together with those of different ethnic and religious groups, led to the multiplication of relatively small hospitals and blocked their integration under the state. In turn, the absence of integrated management led to more competition among hospitals, more emphasis on business functions, and more administration. All of which left, instead of a single governing power, three centers of authority held together in loose alliance. Hospitals remained incompletely integrated, both as organizations and as a system of organizations—a case of blocked institutional development, a precapitalist institution radically changed in its functions and moral identity but only partially transformed in its organizational structure.
This same pattern of blocked development was evident throughout the medical system. Integrated organization was limited in public health and almost entirely absent from what we now call “ambulatory” care. The rise of bureaucracy has been taken as an inexorable necessity in modern life, but in America the medical profession escaped, or at least postponed its capitulation.
*The differences are striking. In 1923, according to a federal census, nonsectarian voluntary institutions represented 49 percent of hospitals in the Mid-Atlantic states, compared with 25 percent in the East North Central states, and only 12 percent on the Pacific Coast. Hospitals with religious sponsorship rose from a low of 8 percent in New England to 23 percent in the Midwest, but fell to just 13 percent in the Pacific states. More than half of the Pacific states’ hospitals were proprietary (52 percent), compared with 17 percent in the Mid-Atlantic and 30 percent in the East North Central states. The pattern in the South resembled the West.55