Introduction

The U.S. healthcare system is a complex delivery system that has undergone significant change in recent years. Throughout its history, there have been many attempts at reform addressing how patients access care, the quality of that care, insurance coverage, and the government’s role. While many of these attempts at reform failed, several significant events have redefined the American healthcare system over the past two centuries. The history of the U.S. healthcare system can be divided into three time periods: the Preindustrial Era, the Industrial Era, and the Corporate Era.1

The Preindustrial Era (Mid-18th to Late 19th Century)

Although the first American hospital opened in 1751, widespread development of formalized hospital systems did not occur until the 1880s.2 Government engagement with the healthcare system has a long history. The first attempt at federal government involvement occurred following the Civil War, when the first system of national medical care was created in the South with the construction of 40 hospitals staffed by 120 physicians.3 The poor and severely ill received care from government-run almshouses, or poorhouses, which were public welfare agencies that functioned primarily as shelters with infirmaries for the sick.4 In the 1800s, medicine was practiced much like other trades of the time, with little in the way of overarching professional standards.5 Training typically took place through apprenticeships rather than formal university education. The shift toward a standardized system of clinical training occurred over the next century, as more medical schools opened and the number of formally educated physicians grew to outnumber those trained in the apprenticeship system.6

As a result of this disorganization and the lack of institutional centers, most Americans in the 1800s relied on professionals other than physicians to provide care.7 Health care was an unregulated free-market economy with a fee-for-service payment system, so consumer sovereignty dictated demand. Communities frequently turned to traditional home remedies and so-called healers, rather than medical doctors, largely because of limited access and high costs.1 Demand for healthcare services remained low until the formalization of education increased the number of trained physicians and, in turn, legitimized the medical field; cost structures then began to shift from a fee-for-service model to insurance-based payments. The quality of training began to improve with the formation of the American Medical Association (AMA) in 1847 and the implementation of licensing laws and education reforms in the 1870s. The education system went through additional changes in 1876 with the formation of the Association of American Medical Colleges, which helped monitor medical education programs. These collective efforts to enhance education and require licensing further legitimized the field of medicine.8

One of the earliest healthcare reforms of this period was the 1854 Bill for the Benefit of the Indigent Insane. It was vetoed by the 14th president, Franklin Pierce, who argued that the government should not be involved in or commit itself to social welfare.3 As the nation approached the 1900s, healthcare access and affordability would change greatly in America with advances in the medical field. Starting in the 1890s, this change would largely be orchestrated by hospitals.1 Although there were only a few dozen hospitals in 1875, the number would grow to over 4,000 by the early 1900s.9

The Industrial Era (Late 19th to Late 20th Century)

From 1880 to 1930, the medical profession in the United States experienced cultural and economic growth that dramatically changed the way physicians practiced. Earlier, physicians had competed not only against each other but also against alternative healers in an unregulated medical marketplace. As the field progressed in the late 1800s and into the 1900s, however, the reorganization of commercial medical schools began to produce numerous graduates with greater knowledge and skills than in previous decades. By the end of the 1800s, hospitals were shifting from care centers for the poor to centers of scientific inquiry. As the culture of medicine came to focus more on laboratory science and medical discovery, there were calls for stricter state regulation of physicians and hospitals. The healthcare economy grew with the construction of more hospitals and an increasing number of physicians, gradually changing the way people accessed and thought about health care. By the 1930s, health care was one of the largest industries in the United States, with Americans spending nearly $3.5 billion on medical services despite the Great Depression.10

The first attempt to create a national health insurance model similar to those available in many European countries began in the early 1900s. Theodore Roosevelt introduced the first draft of a plan for universal health care during his presidency (1901 to 1909); however, attempts to pass a bill were unsuccessful because of fears that so much government control would lead to socialism. From 1916 to 1918, 16 state legislatures attempted to create legislation that would require employers to provide health insurance; these attempts also failed.11 Funding for healthcare programs was authorized by Congress in 1935 as part of President Franklin D. Roosevelt’s landmark Social Security legislation, but it stopped short of universal health care.3

Changes to Social Security and public funding for health care led to the emergence of private insurance companies. In 1929, Justin F. Kimball, a vice president at Baylor University, launched one of the first hospital insurance plans, covering teachers in Texas. The model of allowing a patient to pay a small monthly fee to cover the cost of a 21-day hospital stay created the blueprint for companies like Blue Cross.12 Over the next several years, this model expanded from single-hospital insurance plans to larger group-sponsored hospital plans. This development created the first opportunity for consumer choice in health care and was widely supported by the American Hospital Association. By 1946, Blue Cross had been extended to 43 states, serving nearly 20 million people.11 As the model grew in popularity, commercial (for-profit) insurance also grew, and by the 1950s more than 700 different companies were selling health insurance.

Employer-sponsored health insurance became more popular around the time of World War II, largely in response to wartime wage freezes: many employers offered health insurance to compensate their employees for lost wages. Moving into the 1950s, the federal government and the U.S. Supreme Court ruled that employee benefits were a legitimate part of union-management negotiation, and health insurance therefore became part of collective bargaining agreements between unions and employers. In 1954, employer-paid health insurance became non-taxable under the new Internal Revenue Code. This had significant economic value because it was equivalent to receiving a salary increase without having to pay taxes on it.11

As more Americans gained health insurance coverage through their employers, the debate over a national health insurance model resurfaced, especially as a means to provide access to care for the elderly and the poor. The American Medical Association lobbied against national health insurance out of concern for its impact on private practice physicians, private hospitals, and healthcare delivery. As a result of this resistance, in 1965 the federal government amended the Social Security Act to create Medicare and Medicaid. Through intense debate and grassroots organizing, a three-part program emerged that included hospital and nursing home coverage for the elderly and insurance for the impoverished.9 Medicare, funded by a Social Security tax, was widely accepted by the public; Medicaid, on the other hand, was stigmatized as public welfare and faced some opposition from providers. Medicare had strict universal standards for care and payment, whereas Medicaid was controlled by the states and therefore varied widely in cost, access, and coverage.11 In 1970, Richard Nixon attempted to introduce a plan to create a national health insurance program, but the effort failed. Health insurance would continue to be debated over the next several decades, resulting in many significant changes in access to care, quality, costs and payment for care, and innovation.

The Corporate Era (Late 20th Century to the 21st Century)

The advances of the 20th century added a number of new participants to the U.S. healthcare system, and these new players required novel approaches to organization and management to help monitor the quality and cost of care. The first of these new strategies was proposed by the Nixon Administration through the Health Maintenance Organization Act of 1973.13 The objective of this initiative was to encourage employers to offer prepaid medical plans that would be less expensive than traditional fee-for-service practices.1 Soon, other care network management approaches were developed, such as Preferred Provider Organizations (PPOs) and Exclusive Provider Organizations (EPOs). In 1973, Medicare was expanded to cover nonelderly disabled people who had been receiving Social Security for the previous 24 months and individuals with end-stage renal disease who needed dialysis or a kidney transplant.11

The Advent of Managed Care

By the late 1980s, the United States had nearly 44 million uninsured Americans and was experiencing an epidemic of HIV/AIDS. The cost of health care rose significantly above the rate of inflation, which led to the first national rise in health insurance premiums among employer-sponsored health plans. In 1983, legislation was passed to control Medicaid spending as the cost of services rose; the federal government would follow guidelines similar to those used under Medicare by paying a set fee based on 467 diagnosis-related groups (DRGs).14 Another attempt to control healthcare costs during this time was the growth of the Health Maintenance Organization (HMO), an organized network of provider groups from which patients would receive their care. This model of managed care led many health systems to become more integrated in order to offer a variety of health services from different providers within one geographic location. Integrated healthcare systems allowed patients to use any physician within the hospital system’s network without having to worry about whether their care would be covered by their insurance. This model gave rise to a variety of integrated health networks, including Kaiser Permanente and Geisinger Health System.
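To make the set-fee mechanism behind DRG-based payment concrete, the sketch below shows one simplified way such a prospective payment can be computed: a fixed hospital base rate is scaled by a relative weight attached to each DRG, so every stay assigned to a given group is paid the same amount regardless of actual charges. The DRG codes, weights, base rate, and function names here are hypothetical illustrations, not actual Medicare values.

```python
# A minimal, hypothetical sketch of prospective payment under diagnosis-related
# groups (DRGs): each stay is paid a set fee equal to the hospital's base rate
# scaled by the relative weight attached to the assigned DRG. The codes, weights,
# and base rate below are illustrative only, not actual Medicare values.

HYPOTHETICAL_DRG_WEIGHTS = {
    "470": 2.0,  # illustrative weight for a resource-intensive group
    "871": 1.8,
    "392": 0.7,  # illustrative weight for a less resource-intensive group
}


def drg_payment(drg_code: str, hospital_base_rate: float) -> float:
    """Return the fixed payment for a stay assigned to the given DRG."""
    return round(hospital_base_rate * HYPOTHETICAL_DRG_WEIGHTS[drg_code], 2)


if __name__ == "__main__":
    base_rate = 6000.0  # hypothetical base rate for a single hospital
    # The payer's cost is the same for every stay in a group, regardless of charges.
    print(drg_payment("470", base_rate))  # 12000.0
    print(drg_payment("392", base_rate))  # 4200.0
```

The key point of the set-fee approach is that the payer’s cost per admission becomes predictable, which is how DRG-based payment was intended to restrain spending.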

Despite the growing number of integrated health systems and health alliances, healthcare spending was still rising rapidly. President Clinton developed legislation in 1993 focused on major health reform, lowering costs, and providing access to care for all Americans. The Clinton administration’s health plan had a “bedrock assumption that all Americans must be guaranteed health coverage that, in President Clinton’s words, can never be taken away.”15 President Clinton believed that coverage for all would produce better health and well-being for all of society and could do so more efficiently than any past or present system.

President Clinton’s reform plan was focused on five principles: savings, choice, quality, simplicity, and responsibility.15 The Health Security Act maintained the principle that Americans, not the government or their employer, should have the right to choose where they receive their health care. Individuals would also be able to choose their “price point” for coverage, their own providers, and their own insurance package. Because state and federal health programs already existed, the reform plan focused on developing national rules that would create consistency across state lines with respect to financing, comprehensive benefits packages, insurance reform, guidelines for which employers could operate their own systems, and controls on healthcare costs.15

Additionally, states were given leeway to choose alternative delivery models, define alliances and structures for health plans, and create long-term care benefit arrangements. The Health Security Act placed a strong emphasis on preventative care in its comprehensive package of benefits, and it would have created two new benefits designed to fill gaps in Medicare prescription drug coverage and in long-term care, providing for home and community-based care instead of nursing home care for those with severe disabilities. No American could be denied coverage for preexisting conditions, and there would be no waiting periods to get insurance. Americans working for larger companies would receive their health insurance through health alliances that would require employers to pay 80% of the average premium costs; for smaller companies and for public employees and their dependents, employers would contribute 80% of the weighted average premium in a regional alliance for each employee. Individuals and families would pay the difference between that 80% employer contribution and the cost of the plan they chose.15

The health alliances and reform would replace competition between health plans based on risk with competition based on quality. They would equitably spread risk by moving to a community rating system, maximize consumer choice, consolidate purchasing power, simplify choices for consumers, increase consumer cost-consciousness, reduce administrative costs, enhance insurance portability for consumers, and eliminate coverage restrictions.15 Funding for the program would have come from employers and families as well as from government sources: savings from Medicaid and Medicare, new government revenues, and an increased tax on tobacco were to provide the necessary funding for the reform plan.

The plan did not pass, however, because many felt there was too much government involvement and that it would eventually lead to a single-payer system. Others felt that their health would not improve under the Clinton plan and that the projected cost savings would be neither attainable nor sustainable. Nearly 63% of Americans disapproved of managed care plans or HMOs, and many felt that the corporate health alliances to be created would resemble HMOs. By the end of 1994, President Clinton had lost nearly all support for the health reform plan.
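For illustration only, the short sketch below works through the cost split just described, under the assumption (consistent with the text) that the employer contributes 80% of the alliance’s weighted-average premium and the individual or family pays the remainder of whatever plan they choose. The function name and dollar amounts are hypothetical.

```python
# A minimal sketch of the premium split described above for the Health Security Act:
# the employer contributes 80% of the alliance's weighted-average premium, and the
# individual or family pays the difference between that contribution and the cost of
# the plan they chose. Function names and dollar amounts are hypothetical.

def member_share(chosen_plan_premium: float, weighted_average_premium: float) -> float:
    """Amount owed by the individual or family after the 80% employer contribution."""
    employer_contribution = 0.80 * weighted_average_premium
    # In this simplified sketch, a plan priced below the contribution costs the member nothing.
    return max(chosen_plan_premium - employer_contribution, 0.0)


if __name__ == "__main__":
    average = 4000.0  # hypothetical weighted-average premium in a regional alliance
    print(member_share(4500.0, average))  # pricier plan: member pays 1300.0
    print(member_share(3400.0, average))  # cheaper plan: member pays 200.0
```

Under this arrangement, choosing a plan priced below the alliance average lowers the member’s own share, which reflects the plan’s stated aim of increasing consumer cost-consciousness.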

By the year 2000, healthcare spending was rising rapidly. The Centers for Medicare and Medicaid Services (CMS) was formed, and Medicare implemented DRG or “bundled” payments for services to control costs. By the mid-2000s, CMS was beginning to restructure the payment model to focus on value-based care. Value-based programs reward healthcare providers with incentive payments for the quality of care they give to Medicare beneficiaries. These programs are part of a larger quality strategy to reform how health care is delivered and paid for. Value-based programs also support the Triple Aim: better care for individuals, better health for populations, and lower costs.16

Recognizing the need to insure more Americans and to curb excessive and rising medical spending, the federal government began drafting legislation for a more comprehensive healthcare reform bill. Under President Obama, the Patient Protection and Affordable Care Act of 2010 (ACA) was passed, resulting in 21 million Americans receiving health insurance for the first time.

The ACA was the first piece of comprehensive healthcare reform legislation since the implementation of Medicare and Medicaid, passed with the goal of ensuring healthcare access to all Americans. It expanded on the model of bundled payments, provided funding for electronic medical records, and sought to create greater efficiency, quality, and transparency in the healthcare system. The ACA focused on changing eight major aspects of healthcare delivery17:

  1. Access, addressed by expanding Medicaid and creating a national health exchange

  2. Cost control, by creating accountable care organizations (ACOs), bundled payments, and a “Cadillac Tax” on employer-sponsored health benefits whose monetary value exceeds legally specified thresholds

  3. Quality improvement, focused on reducing in-hospital infections and readmissions

  4. Prevention, which included coverage for preventative services (essential health benefits) without copayments

  5. Workforce development to train new physicians and health professionals

  6. Revenue generation, through taxes on medical devices, cosmetic surgery, and tanning salons

  7. Administrative simplification, that is, reducing overhead costs by streamlining billing and other administrative tasks

  8. Creation of the Center for Medicare and Medicaid Innovation (CMMI) to design, implement, and test new payment models.17

The 21st century has seen significant developments in how and where people receive their care. Healthcare systems are becoming larger and more integrated, offering a wider range of services at lower cost. Allowing patients to stay within one network for care has improved access, coordination, and quality by standardizing delivery requirements within the network. Technology has also greatly changed how Americans access care: many healthcare networks use telemedicine to extend access to specialty care in remote (rural) communities and to provide 24-hour on-demand health care for patients via a smartphone or PC.