TWELVE

Oversight and Entitlement

“I think we just delivered the South to the Republican Party for a long time to come.”

—President Lyndon B. Johnson upon
signing the Civil Rights Act of 1964

One of the great daily challenges every medical student faces is hunting for a quiet corner of the world to hunker down and study like a madman. My solution as a first-year student had been to bypass the library and go questing for an abandoned hallway in one of the older buildings on the historic University of Kansas medical campus in Kansas City. The oldest medical school west of the Mississippi River, KU has perched for over a century atop a hill overlooking the railyards and Kansas River below, nestled up against State Line Road. The redbrick buildings are a mishmash of clinical, research, administrative, and inpatient wards bearing the names of the pioneering East Coast physicians who forsook prestige in Boston, New York, and Philadelphia and made their way to the cattle town at the confluence of the Missouri and Kansas Rivers.

My favorite haunt has become the Eaton Building, whose upper stories lie dormant and whose speckled, worn marble floors hint at a past life as a clinical ward or hospital wing. Although there is no sign that says, “Do Not Enter,” I’m not entirely sure I’m welcome in this abandoned building, but it’s quiet, I have cobbled together a decent desk and chair, and after a few weeks it’s starting to feel like my space.

Every night I come to Eaton, happy that my spot remains my little sacred study hollow. I like the smell of this oversized room, a faint essence of iodine (which always reminds me of my veterinarian father) mixed with bygone cleaning supplies and old-fashioned floor wax. Not that anyone has cleaned this room, or the entire unit, for years, but the isolation is exactly what I want. There are over fifty buildings on this campus, and I can’t believe my good fortune in finding a no-fuss, no-drama study bastion.

Looking out the window of this scholarly domicile, through crooked blinds with broken strings, I see the Bell Memorial Hospital of the University of Kansas. It’s actually the fourth hospital to bear the name, but this 1979 edifice is massive and modern, constructed of white concrete slabs and large window panes, with exposed vent tubes and interior stairwells that recall the Lloyd’s of London building. Its exterior is lit by flood lamps, and the contrast between its modernity and my little brick building, shadowy and quiet, stirs something monkish within me, invigorating contemplations of the inner workings of the human body.

Lost in thought, memorizing the origins and insertions of the muscles around the shoulder joint, I faintly hear a scratching and clanging sound emanating from the hallway. As happens when you’re isolated and alone, an odd sound supercharges my senses, and I feel like a sonar technician on a submarine. The reverberations of a sandy scraping and a metallic knell are unfamiliar and disquieting; now I’m fully attuned to the shuffle-shuttle-clang coming my way. The hallway outside my room is dark, and as I turn in my chair to face the doorway, an inky silhouette resolves into view, accompanied by the syncopated motif.

Adjusting my eyes, I am gazing at an aged, friendly African American woman, bent over and crooked, bearing the scars of years of labor and arthritis. Her black leather “old lady” shoes are worn and unevenly eroded owing to her angled and warped knees and ankles. Her brown dress is a little ragged, and her black overcoat is draped over her sagging shoulders, an odd clothing choice for August, but typical of an urban woman dressed for an important meeting or church. In her leathered hand is a rusty length of rebar, the ridged metal rod used on construction sites. This explains the clanging I heard from the hallway, but the heavy bar seems an odd choice for a cane, with no handle and considerable weight.

Out of breath and disoriented, the woman seems as relieved as I am, hers from having found a living soul in this vacated building, and mine from her not being a phantom. She finally asks me, “Do you know where Room 312 is?”

Standing up, I make my way to my vagabond friend and discover that her name is Mrs. Robinson. Her frilled black hat rests askew on her head, with gray hair springing out from underneath its brim. Her dark eyes, yellowed sclera, and fatigued visage cannot conceal her sociable character, and catching her breath, Mrs. Robinson tells me her grandson was in a car accident. Her family had told her that Vernon was being housed in Room 312, and with that mention, Mrs. Robinson glances up at the number over my doorway. Realizing that Mrs. Robinson is lost and has somehow found herself in this unoccupied structure, I point to the illuminated modern hospital visible from the windows of our room. “That’s the University Hospital, and that’s where your grandson’s room is.”

“But Vernon is colored, so I knew he would be here.”

Now I’m curious, wondering if there is actually some reason Mrs. Robinson is here. “But why would he be here?”

“Because this is the Negro Ward, where I used to work as a nurse, and this is where all the colored doctors and nurses cared for the Negro patients.”

I am speechless. Kansas entered the Union as a free state on the eve of the Civil War, just across the border from its contentious neighbor, the slave state of Missouri. To the surprise of many, Civil War battles were fought within miles of the KU medical campus, but Kansans prevailed, never allowing slavery. Sadly, segregation thrived, even in Kansas (the landmark Supreme Court case Brown v. Board of Education originated in Topeka, Kansas), but I had never considered that the University of Kansas had a separate hospital for some of its citizens.

I take Mrs. Robinson over to the main hospital, hearing stories along the way about life before and after segregation and the Civil Rights movement. I am pleased to be the one to inform her that all patients, without respect for race or religion, are housed together in the main hospital. After dropping her off at the elevator bank, Mrs. Robinson still clinging to her clanking cane, I head to the cafeteria for more coffee. The hospital cafeteria is built on the western side of the modern hospital, with a large wall of windows jutting out from the main structure. Looking outside, across a small grassy garden, I am staring at my study haven, the Eaton Building. I never noticed before, but there is a walled-off former main entrance on the east side of the old brick building. I look closer, and chiseled above the entryway (which leads nowhere) is the inscription EATON BUILDING. This building used to be on the backside of the medical campus, and that set of doors used to be the main entrance. Peering through the dark summer evening, Midwestern fireflies providing traces of light, I squint a little more at the Eaton Building sign and realize that, not long ago, the words carved into that stone read “Negro Ward.”

Following the standoff at the University of Alabama between Governor George Wallace and the federal government regarding the admission of two African American students, President John F. Kennedy on June 11, 1963, gave a heartfelt plea that would become known as the “Civil Rights Address.” Initially standing “in the schoolhouse door,” Governor Wallace relented, but that night, President Kennedy delivered a thirteen-minute speech about equal access to public institutions for black Americans, saying, “I am, therefore, asking the Congress to enact legislation giving all Americans the right to be served in facilities which are open to the public—hotels, restaurants, theaters, retail stores, and similar establishments. This seems to me to be an elementary right. Its denial is an arbitrary indignity that no American in 1963 should have to endure, but many do.”

Five months later, President Kennedy was assassinated, and while there are boundless disastrous consequences associated with his slaying, there are at least two surprising accomplishments that likely would not have occurred without his murder. Before the term Kennedy had been elected to would have ended, and within eight months of his slaying, the Civil Rights Act of 1964 was signed into law by President Lyndon B. Johnson. One year later, Medicare and Medicaid were simultaneously established by the signing of the Social Security Amendments of 1965. These two landmark acts, part of what Johnson called the “Great Society,” share an unanticipated linkage, a symbiotic relationship not fully appreciated even today. In President Kennedy’s Civil Rights Address, when cataloguing the various institutions (hotels, restaurants, theaters, and retail stores) that barred equal access to Americans of African descent, he never mentioned hospitals. In 1963, that was too much to ask for, and not realistic, but within a few short years, most of the country’s more than five hundred “Negro Wards” would be shuttered, rendered impractical following the formation of Medicare.

Medicare is the single largest payer for health care in the United States, and it just celebrated its 50th birthday. If Medicare were human, it would just have reached eligibility for AARP. Once a senior citizen begins receiving Social Security benefits, she is automatically enrolled in Medicare Part A. One is “entitled” to receive Medicare benefits as a United States citizen; one must opt to forgo all of one’s Social Security benefits to avoid enrollment in Medicare. Hence the term “entitlement” spending when considering Medicare benefits. This type of “compulsory” health insurance began in late-19th-century Germany, where the philosophy of “social insurance” was introduced by the leadership team of Chancellor Otto von Bismarck.1 The German Sickness Insurance Act was passed into law in 1883 (the year after Robert Koch discovered the bacterium that causes tuberculosis), establishing a compulsory health care system financed by employer and employee contributions, in which the wealthy contributed more than the poor.2

Many European countries adopted similar forms of compulsory health coverage, including Great Britain in 1911. The British National Health Service was later formed in 1948, ensuring health care coverage for all Brits (emulated in Canada in 1968). The social insurance program initially constructed in Britain in 1911 was no doubt inspirational to American intellectuals, including the leader of the Progressive (Bull Moose) Party, Theodore Roosevelt.

Roosevelt’s politics had morphed from a strict, conservative “blue blood” Republican agenda to a progressive orientation centered on social equality, and national health insurance was a plank of the Progressive Party platform in 1912. Both Roosevelt (Progressive) and Taft (Republican) lost the 1912 election, with Democrat Woodrow Wilson securing the presidency from 1913 to 1921. While national health insurance had been an issue of some interest in 1912, the momentum for its passage was stalled by World War I.3 Decades would pass until a form of national health insurance would become a political possibility.

Justin Ford Kimball, a native Texan, had graduated from Baylor University and law school at the University of Michigan, and started his career as a high school teacher and principal in small Texas and Louisiana towns. Successful at every turn, he became a school superintendent before returning to Waco to practice law. Critically, Kimball then worked as counsel for the receiver in the case of a bankrupt chain of insurance companies, which exposed him to actuarial sciences. In a field where analytical skills are paramount, combining mathematical and statistical methods to analyze and manage risk, Kimball was a natural.

A gifted administrator, “crusty and colorful … a worldly erudite man who claimed kinship to half the population of the state and was well connected to the Dallas upper crust,”4 Kimball became the superintendent of schools in Dallas from 1914 to 1924. Halfway through his tenure, a worldwide plague of biblical proportions rattled an already fatigued nation.

The horrific influenza pandemic of 1918, one of history’s greatest natural disasters, spread around the globe in the final year of World War I, resulting in the deaths of more than fifty million people. More Americans died from the flu (675,000) than perished overseas fighting in the Great War. With no flu vaccine and no medicines to treat the lethal pneumonia that accompanied the disease, Americans felt particularly vulnerable to the contagion. While fewer than one thousand people died in the Dallas area, sickness and lost workdays were critical issues that confronted Superintendent Kimball.

Kimball created a “sick benefit fund” for the city’s teachers to protect their livelihoods during the influenza pandemic, “where a membership contribution of $1 a month entitled those who fell sick to compensation of $5 a day, which offset lost earnings after the first week of illness.”5 His experience in evaluating and mitigating risk using statistics had prepared Kimball to formulate a program for his teachers. It also forced him to carefully tabulate numbers of subscribers and their health-related statistics.

In 1929, Kimball became the vice president of Baylor University’s Dallas medical campus, overseeing education programs at the colleges of medicine, dentistry, and nursing, with an additional responsibility of shoring up the shaky finances of the university hospital. Hospitals were just transforming from almshouses and places of death and dying to sanctuaries of recovery and wellness, but at a price. Baylor University Medical Center was in deep financial trouble—even before the collapse of Wall Street on October 29, 1929—and Kimball was the kind of pragmatic, multitalented Texan for the job.

As school superintendent, Kimball had full knowledge of the numbers of teachers “insured” through their sick benefit fund, and now as administrator of the Baylor Hospital, he gained access to the hospital’s finances, its costs and revenues, and, in particular, how much money Baylor was bleeding when caring for Dallas teachers. One of Kimball’s young protégés, Bryce Twitty, conjectured, “Why we couldn’t do for sick people [teachers] what lumber camps and railroads had done for their employees … [referring to] company doctors”6 who tended to local workers, benefitting both the companies and the well-being of the labor force.

In the early fall of 1929, Kimball approached his old friends at the Dallas schools administration, proposing a type of hospital prepayment program, where teachers could make a similar monthly payment to budget against future hospital bills. There were no national actuarial data to guide Kimball; life insurance companies had always shied away from health care, and no one had worked out the statistical relationship between healthcare demand and costs. How much to charge teachers? No one knew, but Kimball did have the thorough records from his days as superintendent. “Those records … were the only actuarial material I could find anywhere in the U.S. I had designed the forms myself to extract this information, having been an insurance lawyer. [During the fall, advertisements were circulated among teachers] that if 75 percent of the teaching group would sign up and send in 50 cents each month beginning with their November sick-benefit dues, Baylor Hospital would accept the amount as prepayment for hospital care when needed.”7

Serendipity for both the teachers and the hospital dictated that the stock market crash and the launch of the program happened within weeks of each other. Not surprisingly, teachers signed up in droves; by December, more than 75 percent of all teachers in Dallas had enrolled, and on December 20, 1929, the plan went live, coinciding with Christmas vacation. The success of “The Plan” was immediate, and employees of Dallas’s Republic National Bank and the Times Herald soon joined; in fact, 408 employee groups with 23,000 members subscribed over the next five years. Kimball had saved Baylor University’s hospital, rescuing it from insolvency and, like a pied piper, ushering a steady stream of patients to its doors. As medical costs were beginning to explode, patients were happy as well: an accident or serious illness no longer meant financial ruination.

With America teetering on the brink of the Great Depression, plans began to spring up around the country. Initially centered on single hospitals, the programs grew into joint hospital plans in larger cities, and small “prepayment” premiums were well received by Americans beleaguered by war, pandemic, and financial collapse. More sophisticated insurance products would be developed over the next several decades, like indemnity (specific cash benefit, or “cap”), service benefit (care for a number of days for specific conditions), and major medical insurance (supplemental, “catastrophic”), but in the 1930s, the plans served as the only health insurance that most Americans had ever heard of.

In Minnesota, the local organization was called the Blue Plan, and its administrator, E. A. van Steenwyk, sought a new symbol for his company. After some deliberation, a blue cross was chosen, and in time the cross came to symbolize, nationwide, the hospital prepayment programs that would eventually become a centralized national organization. Justin Ford Kimball, who was neither an actuarial scientist nor a trained hospital administrator, took the reins of a hospital as a fifty-seven-year-old neophyte and started a revolution in health insurance. Within a few years his idea became Blue Cross, now a multibillion-dollar industry that dominates the federal and commercial healthcare landscape, and it paved the way for Americans to afford major elective (implant-based) operations that would have been unthinkable in the Roaring Twenties, when Kimball was just a school superintendent.

Within a decade of the founding of Blue Cross, the American Medical Association (AMA) and its physicians decided that a similar prepayment program should be considered for physician office visits. There had been longstanding antagonism between the very powerful American Hospital Association and the AMA, particularly on the issue of health insurance, where hospitals were almost immediately receptive to the idea of national health insurance (even government-controlled single-payer insurance), while the AMA fought tooth and nail against government-administered health insurance.

Advances in medical and surgical sciences were “shifting the locus of care from the home or doctor’s office to the hospital,”8 and as “house calls” were vanishing, there was a growing public demand for medical service plans for physician services. In time, physicians warmed to the (initially) not-for-profit Blue Cross idea of healthcare prepayment, and Blue Shield was born. Years later, Blue Cross and Blue Shield nationalized and conglomerated into one large corporation.

The foundation of Blue Cross and Blue Shield had been built upon relationships that hospitals and physicians had with employee groups, like teachers’ unions, factory and steel mill workers, and police associations. Separately, a surprising collaboration between employers and their workers during the administration of President Theodore Roosevelt had resulted in sweeping reform of workers’ compensation insurance in almost every state. Concessions were made by both sides; employers were motivated to establish a system of workplace injury insurance to minimize their legal exposure to employee lawsuits, and employees finally enjoyed protection from punishing work hours, unsafe working conditions, and lack of healthcare for injuries sustained on the job. The combined effect of work-derived health insurance and workers’ compensation insurance meant that many workers enjoyed health care services that had been absent just one generation before. In the early 1940s, wartime wage and benefit regulations exempted “fringe” benefits like healthcare from wage caps, encouraging employers to offer coverage and furthering the link between a job and health insurance.

Unfortunately, because the reforms had almost exclusively centered on workers, Americans who were unemployed or elderly were still out in the cold in the late 1950s, and as medical care got more expensive, hospitalization was becoming ever more threatening to one’s financial health. Listerism, the technique of cleansing the skin and surgeon’s hands with carbolic acid, forever changed the notion and reality of which conditions could be surgically broached, and the half-century from the 1880s to the 1930s witnessed a titanic shift in the vulnerability of mankind. However, serious infections usually hastened death, and it was only the practical introduction of sulfa drugs and penicillin in the 1940s that broadened the safety zone for surgical interventions that could be hazarded in hospitals. The combination of aseptic surgery and antibiotic treatment proved simultaneously heroic and expensive. It’s quite cheap to let someone die—it’s very costly to save someone’s life.

Wartime American employment, production, and innovation continued at a dizzying pace. “The stunning growth of the nonprofit Blue Plans was not lost on the commercial insurance industry, especially those companies that were already selling life and casualty coverage to employee groups…. Between 1940 and 1946, the number of group and individual hospitalization policies held by commercial companies rose from 3.7 to 14.3 million.”9 Similarly, group surgical indemnity coverage grew from 2.3 to 10.6 million policies. There had been very little American hospital construction prior to the end of the war, but the 1946 Hill-Burton Hospital Survey and Construction Act launched a nationwide building boom. “Between 1946 and 1960, the number of voluntary and state and local government hospitals had increased by 1,182 … federal spending under the program began at $75 million a year in 1948 and rose to $186 million by 1961.”10 One critical stipulation of the Hill-Burton Act was that hospitals that had received grants were required to provide free care for twenty years to persons unable to pay for medical services. For the initial hospitals funded in 1946, that twenty-year obligation expired just as Medicare activated in 1966; the timing could hardly have been better.

The explosion and expansion of gleaming hospitals and the swelling numbers of insured lives couldn’t conceal the plight of those “left out by reason of age or economic status … the indigent and the unemployed.”11 The first group to be seriously addressed would be the elderly. The Great Depression had ravaged the financial stability of millions of Americans, and the monumental Social Security Act of 1935 established a permanent national old-age pension system through employer and employee contributions (still reported in box 4 of your W-2 form). It was natural that President Harry Truman’s head of the Federal Security Agency (later renamed the Department of Health, Education, and Welfare, and still later split up into multiple federal departments) would scheme with his colleagues at the Social Security Administration that the same program that provided old-age benefits could be configured to provide healthcare coverage benefits. FSA administrator Oscar Ewing stated, “The proposed benefits would give [the aged], through their own contributory insurance system, badly needed and valuable hospitalization insurance … that would reduce federal, state and local expenditures … and reduce deficits of hospitals.”12 Just like Justin Ford Kimball’s Dallas teachers setting aside a little money with each paycheck as a prepayment of hospital expenses, Ewing was proposing a system in which Americans would prepay money to be used in the event of sickness as an elder. Uttered in 1951, Ewing’s proposal would take a decade and a half to codify into law.

The most strident opposition to nationalized healthcare for the elderly had arisen from the AMA, which called Truman’s initial proposals “un-American” and “socialized medicine,” fearing that congressional oversight of medicine would lead to poor doctor pay. During the forties and fifties, “government solidified the private health-care system through corporate tax breaks that subsidized companies offering insurance to their workers. More workers were brought into the private system through this indirect and hidden form of government assistance, creating even greater resistance to the idea of the federal government directly providing insurance.”13 While the AMA and its physicians had little power in the 1800s, the modernization of medical education and the purging of pretender medical schools in the aftermath of the (Carnegie Foundation–funded) Flexner Report of 1910 resulted in the monopolization of power by physicians, as Pulitzer Prize–winning medical historian Paul Starr has argued.14 Flushing snake oil down the drain after the Pure Food and Drug Act of 1906 and shuttering counterfeit medical schools after the Flexner Report endowed doctors with ever-increasing prestige and negotiating power.

The first congressional legislation to address healthcare for the elderly was the Kerr-Mills Act (the Social Security Amendments of 1960). Although the dream of compulsory national health coverage for all Americans had died decades earlier, even limited coverage for the elderly had been a slog. Oscar Ewing’s chief advisers, Wilbur Cohen and Isidore Falk, therefore developed an “incremental approach” to accomplish their (as of 2019, still unrealized) goal of universal coverage. “The idea [of incrementalism] was to bring about the passage of a modest program of insurance for a small number of people, and then gradually to expand that program until it covered the entire population.”15 The Kerr-Mills Act had tremendous bipartisan support, as opposed to Sen. John F. Kennedy’s alternative proposal for compulsory health insurance, to be financed with an increase in Social Security taxes. While an increasing number of retirees had health insurance (31 percent in 1952, 44 percent in 1956, and 53 percent in 1959),16 the Kerr-Mills Act depended upon states agreeing to participate and upon the efficiency of federal administration, two factors that limited its effectiveness.

Incrementalism is a fruitful tactic if a restricted or pilot program thrives; alternatively, incrementalism can also be a winning strategy if the maiden program doesn’t succeed, since proponents can argue that the limitation itself hamstrung their pet project. The Kerr-Mills Act presented universal coverage agitators with both options: they could point to spotty coverage for seniors across the country while showing that many were still left out in the cold. Within months of its signing by President Dwight Eisenhower, the act was challenged by newly inaugurated President John F. Kennedy in his 1961 State of the Union address, when he called for a federal Social Security–linked program to provide hospital insurance to the fourteen million Americans over age sixty-five.

The King-Anderson bill was introduced soon after Kennedy’s speech, proposing compulsory coverage of hospital and nursing home care for seniors. Dubbed “Medicare,” the King-Anderson bill enjoyed support among unions and liberals, but was opposed by the AMA, business groups, and conservatives. The co-sponsor of the Kerr-Mills Act was Wilbur Mills, a Democrat from Arkansas who had risen to the powerful position of chair of the House Ways and Means Committee during the Kennedy administration. At the time, there were still many “conservative Democrats” in the House and Senate, and Chairman Mills was one of them. From the beginning, the King-Anderson bill faced an uphill battle, starting with the slim margin of victory that Kennedy had secured in his defeat of Richard Nixon in 1960. Representative Mills worked for years to modify the legislation and secure the necessary votes to get it out of committee, but roadblocks by the AMA and other lobbies stalled the bill (to say nothing of Mills’s own recalcitrance).

Less than three years into his term, President Kennedy was assassinated on November 22, 1963. “Two days before Kennedy’s death, Washington Post columnists Rowland Evans and Robert Novak wrote, ‘As long as Mills keeps opposing health care financed through the Social Security system, President Kennedy’s plan is doomed in the Ways and Means Committee.’”17 In the 1964 election, Democrats enjoyed landslide victories in the House and Senate, and President Johnson vowed to make civil rights and Medicare priorities of his “Great Society”; he knew that the emotional time was ripe during the “honeymoon” first session of his presidency.

The passage of the Civil Rights Act (1964) occurred in the “twilight of a New Deal dispensation that stretched back thirty years,”18 yet it could not have happened during the Truman or Eisenhower (or even FDR) administrations. The Democratic Party had dominated national politics for decades, having won seven of the last nine presidential elections while averaging “a whopping 424 Electoral College votes compared to Republicans’ 101.”19 The question arises, then: why the struggle to pass the proposals of Truman or Kennedy? The answer is the ferociously cohesive Southern Democrats, who “protected segregation, fought unions, and subverted most social reform, [combined with the] seniority rules and the South’s pattern of reelecting its members.”20

The Democratic Party had two main coalitions during the New Deal era: Northern liberals, who were fond of crafting socially innovative proposals, like “Social Security expansion, national insurance, robust labor protections, child welfare programs, and so on …”21 and Southern conservatives, who functionally ruled Congress. Political scientist Ira Katznelson has described it as “a coalition of Swedish welfare state and South African apartheid, dominated by the latter.”22 When the Civil Rights Act did pass, it did so only after a filibuster of more than two months was broken by a cloture vote, in which twenty-seven of the thirty-three Republican senators joined forty-four Democrats to break the Southern resistance.

In a similar vein, many political pundits believe that the Civil Rights Act facilitated, even enabled, Medicare’s passage. Republicans sensed that Medicare was a fait accompli, and sponsored an alternative bill nicknamed “Bettercare,” a voluntary insurance program that would cover doctors’ fees, financed, in part, by general tax revenues. For its part, the AMA recommended a different plan, entitled “Eldercare,” which would have functioned as an expansion of the Kerr-Mills program, covering doctor visits, nursing home care, and prescription drugs. In essence, Eldercare was the forerunner to Medicaid.

The competing proposals of Medicare, Bettercare, and Eldercare were simultaneously contradictory and complementary. Hospital coverage, physician office insurance, and expanded indigent care were three pressing needs that had never been considered together by Congress, not even by those in favor of comprehensive hospital insurance for seniors. It would have seemed impossible for all three proposals to pass at once, adding a huge obligation to the people and their government. Actually, that’s exactly what happened.

Chairman Mills pulled off one of the greatest legislative coups in history when, on March 3, 1965, he proposed combining the main aspects of all three bills. During a meeting of the House Ways and Means Committee, Mills turned to Johnson’s representative, Wilbur Cohen, and asked why they “could not put together a plan that included the Administration’s Medicare hospital plan with a broader voluntary plan covering physicians and other services?”23 Cohen later recalled, “The federal government was moving into a major area of medical care with practically no review of alternatives, options, trade-offs, or costs.”24 After a few months of deliberations, both the House and the Senate passed the bill, known colloquially as “Mills’s three-layered cake,” but formally as Titles XVIII and XIX of the Social Security Act. Title XVIII comprised two parts, Parts A and B, outlining hospital and supplementary medical coverage (like physician office visits), respectively. Medicaid was established through passage of Title XIX, but has never been referred to as “Part C”; that designation would come thirty years later with the passage of the Balanced Budget Act of 1997, which formalized managed capitated-fee health plans, initially called “Medicare+Choice” and later “Medicare Advantage.”

Understanding the history of the “baking of the cake” explains many of the confounding details of Medicare and Medicaid. For instance, why is hospital coverage defined as “Part A”? Because hospital coverage was defined under Part A of the Medicare Act (which had originally been the King-Anderson bill). Why is physician coverage reimbursed under “Part B”? Because doctor visits are processed through Part B of the Act (officially Part B of Title XVIII of the Social Security Act). Why is Part A funded through Social Security taxes? Because the bill was passed, from the beginning, as an add-on to Social Security, which explains why Part A costs are reimbursed from the Social Security bucket. Conversely, Part B payments come from general tax revenues, as was originally proposed.

After years of wrangling, a full half-century after Theodore Roosevelt had proposed universal coverage, President Lyndon Johnson signed Medicare into law on July 30, 1965, at the Truman Library in Independence, Missouri. Sitting next to him was the first-ever recipient of Medicare, Harry S Truman, who received his official Medicare card that day.

Civil rights activism had fundamentally reshaped the way America thought about health care for the poor, unemployed, and seniors. “Medicare, the result of a landslide election propelled by the passage of the Civil Rights Act and the civil rights movement that shadowed its implementation, was a gift of that movement.”25 Within a decade, “[h]ospitals became the most racially and economically integrated private institutions in the nation … all but four or five of the once more than 500 black hospitals had either been closed or converted to other purposes.”26, 27 What is more difficult to comprehend: that there were still segregated hospitals in America in the 1960s (like Eaton Ward at my alma mater), or that the Medicare Act helped close those hospitals?

It would take a full year for Medicare to “go live,” and in that year, one would guess that the behemoth US federal bureaucracy, like a titanic ocean vessel, would have steered and tilted its way toward a new horizon, exercising control over the thousands of physicians and hospitals that served the tens of millions of newly insured lives. Shockingly, no such dominion materialized, as Wilbur Cohen, one of the chief architects of Medicare, later lamented: “The sponsors of Medicare, myself included, had to concede in 1965 that there would be no real control over hospitals and physicians. I was required to promise … that the Federal Agency … would exercise no control.”28

When Medicare passed, legislators codified the then-ruling payment policies of the Blues, whose not-for-profit Blue Cross hospital insurance plans had functioned since the 1930s as a “stable conduit of money to the [hospital industry].”29 Critically, the state-based Blues plans typically reimbursed hospitals for costs incurred while treating patients—no matter the cost—so that there was little incentive to constrain costs at a time when healthcare was entering an explosive growth phase. Uwe Reinhardt of Princeton University has argued that this orientation around “reimbursement,” rather than “payment” under which “hospitals would have to manage their line-item costs against external constraints,”30 fostered an inherently inflationary system. The state-based Blue Shield plans (for physician reimbursement) paid physicians their “usual, customary, and reasonable” (dubbed “UCR”) fees, again exhibiting almost no cost control.

The inflationary arrangement did not stop at hospital and physician reimbursement; “Medicare was required to reimburse each individual hospital retrospectively a pro rata share for all the money that the individual facility reported to have spent on capital investments in structures and medical equipment … and a pro rata share for whatever its operating costs might be.”31 With guaranteed reimbursements securing a rate of return, it is little wonder that investor-owned hospital growth took off. Medicare adopted the Blue Shield UCR-style physician reimbursement, paying physicians according to “customary, prevailing, and reasonable (called ‘CPR’)” fees, with only slightly more rigid restrictions.

“In effect, then, in return for acquiescing in the passage of Medicare into law in 1965, healthcare providers extracted the key to the U.S. Treasury from Congress,” argues Reinhardt.32 With reimbursement—and not payment—the watchword, annual Medicare outlays immediately, and then year after year like a metronome, vastly surpassed predictions. Ironically, it would be stalwart Republican presidents (Nixon, Ford, Reagan, and George H. W. Bush) who “sought to bring the hospital and physician sectors to their knees, in ways that Democrats would never dare to do.”33 In the late 1970s, the Carter Administration agreed to the hospital industry’s “Voluntary Effort” to control costs, but this naïve promise failed to make an impact.

Twenty years of “reimbursement” came to an end during the Reagan Administration, when Medicare rules were changed to a more business-oriented methodology. “The very idea of retrospective full-cost reimbursement, an approach that would look strange to anyone accustomed to normal business principles, particularly vexed the administration.”34 Researchers and policymakers therefore arranged medical conditions into slightly over five hundred “diagnosis-related groups,” or “DRGs,” permitting the remuneration of hospitals based upon a preset fixed sum per case, allowing a “fair profit.” This was truly revolutionary, and the approach has been copied by countries around the world and even by private insurers in the United States.
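For readers who think in code, the shift is easy to caricature. Here is a minimal sketch, in Python, of the difference between retrospective cost reimbursement and prospective, per-case DRG payment; the codes and dollar figures are invented for illustration, and the real DRG rate tables published by Medicare are far more elaborate.

```python
# Illustrative sketch only: invented codes and figures, not actual Medicare rates.

# Retrospective "reimbursement": the payer covers whatever the hospital
# reports having spent, so there is no incentive to economize.
def retrospective_reimbursement(reported_costs: float) -> float:
    return reported_costs  # paid in full, whatever the line items total

# Prospective DRG payment: each case maps to a preset fixed sum, and the
# hospital keeps any surplus or absorbs any overrun.
DRG_RATES = {  # hypothetical diagnosis-related groups and flat rates
    "470": 12_000.00,  # a joint-replacement-style case
    "291": 9_500.00,   # a heart-failure-style case
}

def drg_payment(drg_code: str) -> float:
    return DRG_RATES[drg_code]  # same sum whether costs ran high or low

# A hospital spending $14,000 on a "470" case is paid $12,000 and absorbs
# the $2,000 overrun; the incentive to control costs is built into the rules.
print(retrospective_reimbursement(14_000.00))  # 14000.0
print(drg_payment("470"))                      # 12000.0
```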

Reimbursing hospitals by a DRG case–based accounting system was the opening salvo by the federal government in ending the decades-old approach of hospitals and physicians charging (and receiving) limitless sums of money for guideline-free “customary” care. With economists and statisticians gaining power, Medicare funded a major study on the “relative costs” of providing various physician services, aiming to identify the time, skill, and risk involved in treating a large number of distinct medical vignettes.35 This led to the “resource-based relative value scale” (RBRVS), paving the way for 1989 legislation that formalized a physician fee schedule based on RBRVS (with “geographic adjustors for variations in labor costs, malpractice premiums, and the [cost of office space]”).36 Forever gone would be the old notion of fat cat hospitals and doctors tugging on Uncle Sam’s purse strings, whimsically charging usual or customary fees for services.
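The resulting fee-schedule arithmetic is simple to sketch: each service carries relative value units (RVUs) for physician work, practice expense, and malpractice risk, each scaled by its geographic adjustor, with the total converted to dollars by a single conversion factor. Every number below is invented for illustration; the actual RVUs, adjustors, and conversion factor are published by Medicare.

```python
# Illustrative RBRVS-style fee computation; all numbers are invented.
def rbrvs_fee(work_rvu: float, pe_rvu: float, mp_rvu: float,
              work_gpci: float, pe_gpci: float, mp_gpci: float,
              conversion_factor: float) -> float:
    """Sum the three resource components (work, practice expense,
    malpractice), each scaled by its geographic adjustor, then convert
    the relative value units into dollars."""
    total_rvus = (work_rvu * work_gpci
                  + pe_rvu * pe_gpci
                  + mp_rvu * mp_gpci)
    return total_rvus * conversion_factor

# A hypothetical office visit in a slightly costly locale:
# (1.5*1.02 + 1.2*1.05 + 0.1*1.00) * $36.00 = $104.04
print(round(rbrvs_fee(1.5, 1.2, 0.1, 1.02, 1.05, 1.00, 36.00), 2))
```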

Medicare-initiated innovations like DRGs and RBRVS have been adopted by the private insurance sector; further, when hospitals and physicians negotiate with not-for-profit and for-profit carriers, their rates are based upon a particular Medicare fee schedule year (a physician might say, “our new Cigna contract is 135% of 2015 Medicare”). Although Medicare accounts for “about 20% of total national health spending, at $572.5 billion of a total of $2.793 trillion,”37 it is the power broker in payment reform for both public and private healthcare spending. More recently, Congress has attempted to corral physician costs on a global scale, setting a target for overall Medicare spending linked to the growth of the Gross Domestic Product (GDP). Utilizing these calculated “Volume Performance Standards” (VPS), the program stipulates a reduction in physician fees in the present year if the previous year witnessed a budget excess. While this Draconian arrangement, the Sustainable Growth Rate (SGR) system, should be easy (and powerful) to implement, physician fees are one of the last frontiers where doctors have exercised legislative influence, and the scheduled cuts have rarely been activated, thanks to the annual “doc fix.” Disregarding the SGR has almost become an annual congressional rite of passage in DC, and some have criticized the near-perpetual state of abeyance of the SGR as a contributing factor in the “unsustainable” increase in healthcare spending.
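Both habits reduce to simple arithmetic. The sketch below is a loose caricature with invented multipliers, targets, and outlays (the statutory SGR formula is considerably more involved); it shows how a commercial rate is anchored to a Medicare fee, and how an SGR-style update would trim fees after a year of overspending, the very cut the annual “doc fix” has routinely waived.

```python
# Illustrative only; the multipliers, targets, and outlays are invented.

def commercial_rate(medicare_fee: float, multiplier: float = 1.35) -> float:
    """E.g., 'our new Cigna contract is 135% of 2015 Medicare.'"""
    return medicare_fee * multiplier

def sgr_style_update(current_fee: float, target_spending: float,
                     actual_spending: float, base_update: float = 0.01) -> float:
    """If last year's outlays exceeded the GDP-linked target, schedule a
    proportional fee cut; otherwise grant a modest base update. (In
    practice, Congress's annual 'doc fix' overrode the scheduled cuts.)"""
    if actual_spending > target_spending:
        overage = (actual_spending - target_spending) / target_spending
        return current_fee * (1 - overage)
    return current_fee * (1 + base_update)

print(commercial_rate(100.00))                          # 135.0
print(round(sgr_style_update(100.00, 95e9, 100e9), 2))  # cut to 94.74
```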

Medicare accounts for roughly one-seventh of total federal spending: about $588 billion of the $3.9 trillion budget funded Medicare in 2016.38 In its fifty years, the percentage of the federal budget spent on Medicare has steadily increased, topping 15 percent in 2016 (and 3.2 percent of GDP), and it is projected to surpass 16 percent of the budget and 3.6 percent of GDP by 2024. Between 2010 and 2050, the population over sixty-five will double, from about forty million to eighty-four million people, and a large tranche of that group will be seniors over eighty, who are typically very expensive to care for. Uwe Reinhardt posits, “The current debate on US fiscal policy clings to the notion that the fraction of overall government spending as a percentage of GDP must be kept at or below a given percentage—regardless of the funds needed … This is the idea that is unsustainable.”39

Skeptics can rightly criticize the poor value Americans receive for their healthcare dollars when compared to most Western countries (so admits this surgeon-author). We pay too much for drugs, implants, and procedures, but in this new era of cost-consciousness and outcomes tracking, Americans will see an improvement in “getting what we pay for.” Even so, there is simply no other place in the world where most economists, actuarial scientists, policymakers, and physicians themselves would rather be cared for when suffering from a heart attack, cancer, or trauma, though improved cost-control initiatives will need to be nurtured.

Understanding the genesis of the FDA and Medicare is essential to understanding the “perfect storm” behind the explosion of implants. Improved materials science, the discovery of antibiotics, the supervision of implants by the FDA, the government-facilitated launch of thousands of new hospitals following World War II, the invention of health insurance, and the formation of Medicare all coalesced within a few decades. Patients needed health insurance to pay for the expensive new operations; hospitals, physicians, and implant manufacturers needed a reliable flow of insured patients. In 1965, who could have guessed, in their wildest dreams, what was about to happen? Of course, Medicare costs have always exceeded budgeted predictions, but Wilbur Mills and his colleagues cannot be blamed for not reading the tea leaves when the three-layered cake was baked. Revolutions are tricky things to predict.