In the quarter-century since the dawn of the biotechnology revolution, hundreds of research scientists at the nation’s elite medical schools have decamped from their tenured sinecures to join pharmaceutical firms or biotechnology start-ups. Most have set up shop near the institutions that trained them: in Boston, San Francisco, San Diego, or Research Triangle Park in North Carolina. Others have gravitated to the outskirts of Washington, D.C., to be near the National Institutes of Health (NIH), the funding colossus of the biomedical world. Most see themselves as dedicated scientists in the mold of Martin Arrowsmith, the fictional physician in Sinclair Lewis’s novel, whose passion to make a mark in the world of research was always leavened by his abiding concern for the health of mankind. But virtually all have another goal lurking somewhere in the corners of their minds. They want to start the next Amgen.
Amgen Inc., however, did not spring from any of the intellectual command posts of the biotechnology revolution. It began in an office park in Thousand Oaks, a skateboard haven about an hour’s drive north of Los Angeles, far enough from downtown that local inhabitants sometimes refer to it as Thousand Miles. In that small city of cookie-cutter ranch homes and enclosed shopping malls, a handful of scientists trained at the University of California at Los Angeles and skilled in the new art of recombinant engineering began in 1980 what would eventually become the largest, fastest-growing, and most profitable biotechnology company in the world.
One of the first companies to bring a biotechnology product to market, Amgen has grown to more than eight thousand people working in forty buildings sprawled across its industrial park–like campus. Though Amgen attracted only a handful of top-notch scientists to move there over the years, the company registered nearly $5 billion in sales in 2002 and declared almost a third of that in profit.1 According to Forbes magazine, investors who plunked down one hundred dollars for stock in the struggling start-up in the mid-1980s would have held shares worth more than $1.5 million by 2001, making Amgen one of the business world’s most extraordinary growth stories in the decades when such stories were commonplace.
Yet for all of Amgen’s spectacular success, virtually all of the company’s revenue came from the sale of just two drugs. Both gained approval from the Food and Drug Administration during George H. W. Bush’s administration. Both were considered the low-hanging fruit of the biotech revolution, easy targets for the new technology of recombinant engineering. Amgen’s big sellers are artificial versions of naturally occurring human proteins that had been identified and isolated well before the company began developing them.
Amgen’s first approved drug and its biggest seller is Epogen. It is the recombinant-engineered version of erythropoietin, the hormone produced in the kidney that signals bone marrow to manufacture red blood cells. The largest patient population in need of erythropoietin is the more than three hundred thousand Americans on dialysis. Their failing kidneys no longer produce it in sufficient quantities to prevent anemia. The federal government picks up the tab for most dialysis patients through the Medicare program, meaning Amgen’s financial success has largely come at taxpayer expense. Amgen’s other big seller is Neupogen, an artificial version of granulocyte colony-stimulating factor, the protein that tells the bone marrow to produce infection-fighting white blood cells. This drug is a godsend to cancer patients undergoing chemotherapy, whose suppressed bone marrow is in need of extra stimulation.
The health benefits derived from these drugs have come at a hefty price. They are among the most expensive on the market. This is not because the drugs are costly to make. The techniques of recombinant engineering, invented in the mid-1970s, are now rather commonplace and can be carried out by intelligent college students working with lab equipment ordered over the Internet. Nor is the high price justified by the original investment in research and development. Amgen earned back the cost of developing these drugs within a year or two of their arrival on the market.
Rather, as Amgen’s extensive advertising on National Public Radio, in magazines, and in the medical literature puts it, there is only one justification for the high price tags on its drugs: they pay for the scientists and technicians squirreled away in Thousand Oaks, who are busily searching for the next generation of wonder drugs. It costs more than $800 million to discover a new drug, industry officials have said, drawing their figure from a single, frequently cited study from Tufts University. Companies have to put a high price on yesterday’s discoveries if they are going to fund tomorrow’s research. One set of recent Amgen ads featured a clinician clad in a lab coat peering into a microscope. The caption claimed the company was searching for therapies capable of “dramatically improving people’s lives.”
Half of Amgen’s employees, and one of every five dollars the company earned over the past decade, were in fact devoted to what the company called research and development. Yet Amgen’s labs were notoriously unproductive in the decade after its first drugs were approved. “It’s been a while since a major clinical advance has come out of Thousand Oaks,” said Mark Brand, a marketing professor at Johnson and Wales University in Denver who used to be the company’s top public relations man. “Their offerings to physicians have not been major league.”2
In late 2001, after a decade of disappointments, the company’s labs finally produced a new drug capable of generating a billion dollars in sales—the financial holy grail of pharmaceutical industry managers and investors. The drug is called Aranesp. The company touted Aranesp as its most significant medical advance since the arrival of its first two drugs. But Aranesp, like Epogen, was for anemia. Was it a dramatic new treatment for the debilitating condition? Company officials said it was. “We believe Aranesp simplifies the treatment of anemia associated with chronic renal failure, with potentially fewer office visits and less disruption to patients’ lives,” Kevin Sharer, the chief executive officer of Amgen, said.3
It is dramatic health claims like Amgen’s, and the assertion that only industry can produce such benefits, that are used to justify the high cost of drugs in the United States. North America’s pharmaceutical and biotechnology companies have become the primary source of new drugs for physicians looking for new weapons to fight disease. The hopes of millions of Americans for cures for cancer, Alzheimer’s disease, and the other debilitating conditions of aging rest on the tireless efforts going on in the drug industry’s labs. But there’s a catch. Industry can produce those results, the argument goes, only if the American people continue to pay the highest prices in the world for drugs.
This book challenges that assertion by pulling away the curtain that has long shrouded pharmaceutical innovation. It asks two simple questions. Where do new drugs come from? What do they really cost to invent? To answer these questions, I take readers inside the arcane process of drug development for a representative sample of relatively recent discoveries—from their beginnings in academic and government labs to their final approval by the Food and Drug Administration. By viewing the entire process of drug development, I offer an alternative picture to the one painted during the heated debate in Washington over a prescription drug benefit for Medicare, a debate in which politicians and drug industry officials, echoing the Amgen ads, argue that drug prices have to remain high in order to fund innovation.
I first became intrigued by that assertion in 1999 while attending hearings on Capitol Hill devoted to the crisis in Medicare funding. As an economics correspondent for the Chicago Tribune, I was tugged in many directions that winter: Alan Greenspan and interest rates; the aftermath of the Asian financial crisis; the budget battles between the Republican Congress and a beleaguered President Bill Clinton. And, of course, there was the story that, viewed with the hindsight of our post-Enron world, seems almost laughable: Would the president’s impeachment trial destroy investor confidence?
But something curious happened to me that busy news year. Whenever I wrote about these stories, I received no mail. No e-mails jammed my computer’s inbox. No readers searched out my telephone number. It was as if my dispatches about economic events at the peak of the bubble had disappeared into a black hole. When I mentioned this to my colleagues, they scoffed at my naiveté. Wasn’t it obvious? The American public, enjoying the fruits of a raging bull market, was too busy watching the presidential soap opera known as Monica.
Yet when I wrote about the Medicare reform debate—the National Bipartisan Commission on the Future of Medicare was concluding its deliberations about the same time that Kenneth Starr was concluding his—I received a completely different response. Letters to the editor began appearing in the paper. Senior citizens began sending me handwritten notes. Many came on personal stationery, a touching reminder of a fading era when people penned longhand notes to their representatives about matters that deeply concerned them.
The letters all spoke to the issue that bedeviled and eventually stalemated the Medicare commission. Why, my elderly readers wanted to know, were the prices of drugs so high? Why couldn’t the government do something about it? And why couldn’t the government provide a prescription drug benefit for senior citizens? While 62 percent of Americans take no drugs at all over the course of an average year, three-fourths of the elderly do, and half of them take two or more drugs that require a daily regimen—usually for chronic conditions like high blood pressure, diabetes, or arthritis. In 2002 the nation’s total prescription drug spending soared past $160 billion and was rising at an annual rate of 18 percent. Americans spent more on prescription drugs than on telephones, radios, televisions, and cell phones combined. Well over half of that came out of seniors’ pockets. In the richest nation on earth, some elderly Americans were hobbling onto buses to cross into Canada to buy cheaper medicines, while others sawed pills in half or did without basic necessities to get to the end of the month.4
There were no significant differences between Republican and Democratic appointees to the commission on the need for adding prescription drugs to Medicare. Everyone agreed that pharmaceuticals had become a key component of modern health care, just as hospital stays and doctor visits had been the main concern when President Lyndon B. Johnson signed Medicare into law in 1965. Yet, with the price of drugs skyrocketing year after year, millions of seniors were forced to choose between paying for their medicine and paying for the other necessities of life. Leaving Medicare without a drug benefit would turn a program that was designed to provide Americans with medical security in their old age into a mirror image of the nation’s health insurance market. A substantial minority of the population would be forced to go without.
Yet the appointees to the bipartisan Medicare commission faced a conundrum in trying to add a prescription drug benefit to the system. How could the government afford to add drugs to a program that was already headed for bankruptcy? Although the state of the economy over this decade will determine Medicare’s ultimate date with insolvency, the government’s actuaries predict the program will begin running chronic deficits just about the time the baby boom generation begins retiring in 2010.
Liberals and progressive Democrats in Congress offered one possible solution to this dilemma. They proposed a Medicare drug program that would act like any other large buyer in the marketplace. Pharmacy benefit managers, who operate drug plans for major corporations, negotiate steep discounts on the prices they pay for drugs. Why couldn’t the government do the same? Wielding seniors’ collective buying power was one way to hold down costs.
Some experts also proposed limiting the choice of drugs that Medicare recipients could buy. In the jargon of the trade, such preapproved drug lists are called formularies. Government agencies like the Veterans Administration and some private-sector benefit managers already use them. Proponents of formularies argued that it makes no sense for the government to pay for lifestyle drugs like, say, Viagra, which is prescribed for erectile dysfunction but is widely used for sexual enhancement. And when there are two drugs on the market for a condition and both work about equally well, the government should not pay for the more expensive brand name. Instead, it should reimburse people only if they buy the cheaper generic. A government formulary could sort through the morass of the modern drug marketplace on behalf of senior citizens.
Conservative Democrats and most Republicans on the panel recoiled in horror at these proposals. They took their cue from industry officials such as Alan F. Holmer, the president of the Pharmaceutical Research and Manufacturers of America, the industry’s main lobbying group in Washington. Holmer was not one of Washington’s more imposing figures. He did not have the golden tan and silver locks of the movie industry’s Jack Valenti, nor the technical expertise and insider savvy of PricewaterhouseCoopers’s Kenneth Kies, the master architect of corporate tax breaks. Holmer often stumbled over his words when giving testimony and took a long time to formulate his responses to questions. Yet he wielded enormous clout on drug issues, and his testimony was always the centerpiece of any hearing devoted to the topic. This was driven in part by the industry’s large campaign contributions. According to the Center for Responsive Politics, the pharmaceutical industry gave $26 million in political campaign contributions in the two years before the 2000 election, and in the 2002 cycle it was the tenth largest donor among all industries, up from thirteenth in 2000 and twenty-seventh in 1990. The industry also deployed more than six hundred paid lobbyists on Capitol Hill, more than one for every senator and representative.5
But Holmer’s influence did not depend solely on this largesse. It also rested on a powerful and compelling argument. The pharmaceutical industry’s top official said that without high prices, the innovation that led to new medicine would dry up. It was an argument that provoked fear and consternation among the health-conscious public and their representatives. Americans fervently believe in the power of modern medicine and do not want anything to jeopardize the promising treatments for cancer, heart disease, and Alzheimer’s that the media routinely suggest are just around the corner.
Yet few Americans grasped the argument’s startling departure from the norms of modern business practice. Most industries view research and development as something they must do to stay one step ahead of the competition, just as they must reduce the cost of production to maintain profit margins. If they fail to innovate, they risk obsolescence and decline. “If we don’t spend our money on research and development, we will die,” I’ve heard more than one chief executive officer say to stockholders at an annual meeting. The drug industry stood this corporate mantra on its head. “If we don’t get your money to spend on research and development, you’ll die.”
The industry was not shy about deploying this argument on Capitol Hill. Testifying before the Senate Finance Committee in May 2000, Holmer warned legislators that a disaster would befall the American people if the government tampered with the prescription drug market. He told the committee that adopting a senior citizen drug benefit that imposed any kind of restriction on prices or relied on a formulary would dry up the revenue stream needed for innovation. The U.S. industry, he said, was responsible for 370 new drugs and vaccines in the 1990s, half of all pharmaceutical innovation in the world. An industry-sponsored study released a year later put the average price tag for developing a new drug at more than $800 million. To raise that kind of money for research, the industry needed every dime it collected from the American people. To limit the price of drugs, or to limit the number of drugs a plan might buy, would jeopardize the industry’s ability to come up with new breakthroughs. “Government price controls are unacceptable to the industry because they would inevitably harm our ability to bring new medicines to patients,” Holmer testified.6
Public interest groups, insurance companies, and health care advocates cried foul, claiming the drug companies’ reasoning was nothing more than a scare tactic. They wielded studies that tried to poke holes in estimates of the research-and-development cost of a single new drug. They complained bitterly about the industry’s wasteful search for drugs that mimic those already on the market. They attacked its marketing practices, including the expensive advertising sprees that encouraged patients to ask their doctors for new medicines that were no better than ones just coming off patent. And they pointed to the outsized profits racked up by the industry. Yet most legislators refused to do anything to hold down prescription drug prices when they passed a Medicare bill in November 2003 because they accepted the industry’s core assertion that its financial health was the key to innovation.
This book challenges that assertion by delving into the process by which drugs are actually developed. By recounting the history of several of the most significant new drugs of the past two decades, it shows that the origins of the drugs that have truly made a difference in recent years, and of those that will make a difference in the twenty-first century, can almost always be traced to the vast biomedical research enterprise funded by the federal government. Taxpayer-financed medical research, whether in NIH labs or through government grants to academic and nonprofit medical centers, reached $27 billion in 2003, almost equal to industry spending. But a dollar comparison does not begin to describe the critical nature of the taxpayers’ role. Over the years, NIH-funded research played not only the key role in virtually all of the basic scientific breakthroughs that underpin modern medicine but also a central role in applying those findings to the search for many new therapies. In some cases, government-funded researchers not only conducted the basic research but went on to identify the new drugs and test them in animals and humans, thereby completing the vital tasks required for regulatory approval.
None of this is to deny that the fifty thousand scientists, technicians, and office personnel working in industry labs play a crucial role in the successful development of new drugs. Significant advances in medicine require a complex interaction between scientists in the public arena and scientists in industry. The most successful drug companies maintain sophisticated in-house staffs capable of keeping up with the latest breakthroughs in public research. The companies also house scientists who can rapidly synthesize new chemicals that may become new drugs and who can develop new tools of high-throughput screening and rational drug design, and they employ physicians adept at designing and monitoring the clinical trials that test them. But at the same time that drug companies and their biotechnology cousins are deploying these skills to commercialize important new medicines, a sizable portion of the industry’s $30 billion research budget—perhaps as much as half—is spent on drugs that add nothing significant to physicians’ armamentarium for fighting disease.
Moreover, big pharmaceutical firms increasingly farm out many critical tasks to highly specialized firms willing to do that work for anyone. A growing number of biotech companies, specialty chemical companies, and clinical research firms are willing and able to design drugs, screen chemicals, and conduct animal and human studies for researchers who think they have identified a new way to combat a disease. Anyone can take advantage of their services, including research organizations in the public and nonprofit domains.
Yet in the twenty-first century, the breakthroughs that lead to pharmaceutical innovation will take place long before those firms are employed. As the economist Alfonso Gambardella pointed out in Science and Innovation, a recent academic review of U.S. pharmaceutical research in the 1980s, “The generation of new drugs depends in large measure on activities that occur at the outset of the research-and-development process. Early research stages play a more meaningful role than in other industries, and they are the most creative steps of the drug innovation cycle.”7
Over the past two decades, the federally funded research establishment in government, universities, and medical schools has developed an extremely efficient conveyor belt for moving the patented products and processes of these “most creative steps” into the private sector. Virtually the entire biotechnology industry is made up of firms begun when an individual investigator or group of investigators decided to try to get rich from the patents they took out on their government-funded inventions. There’s nothing wrong with that. Indeed, it’s the American way. The technology commercialization conveyor belt is the product of a deliberate government policy, adopted in 1980, to foster innovation in medicine as it has in other high-technology fields.
But when the senior citizen medical insurance system is headed for bankruptcy; when the cost of health care, largely driven by the high and rising price of drugs, is taking up a greater and greater share of the overall economy; and when a growing number of Americans cannot afford the fruits of the pharmaceutical innovation system they funded—then the public has the right to ask how rich the commercial side of the partnership needs to be to ensure its continued participation in the system. The drug industry consistently reports profit margins approaching 30 percent of revenue. And while the industry also spends slightly more than 20 percent of its revenue on research and development, this book shows that nearly half of that research is more properly categorized either as a marketing expense or as work of minor medical significance, aimed only at coming up with drugs that replicate the action of those already on the market. Indeed, the financial press is filled with dire accounts of a looming industry crisis precisely because the industry’s vaunted research and development pipelines have not generated the medical breakthroughs promised to investors and consumers alike.
In the immediate aftermath of World War II, George W. Merck, the patrician head of the most research-oriented firm in the industry, laid out a credo for his scientists. “We try never to forget that medicine is for the people. It is not for the profits. The profits follow, and if we have remembered that, they have never failed to appear. The better we have remembered that, the larger they have been.”8 The company still puts his words in its annual report, and makes them the centerpiece of its displays at medical conventions and scientific meetings.
But a half-century later, the former head of global research and development at Hoffmann–La Roche, Inc., after surveying the pharmaceutical industry’s research landscape, reported just how far the industry had drifted from Merck’s ideal. Jürgen Drews raised the specter of large pharmaceutical companies disappearing from the face of the earth, like the dinosaurs. “There can be no doubt that drugs could be discovered and developed outside the pharmaceutical industry,” Drews concluded in his 1999 book In Quest of Tomorrow’s Medicines. He suggested that public institutions such as the NIH, Britain’s Medical Research Council, or the German state-funded institutes could pick up the mantle of drug commercialization, relying on the same contract organizations that industry now uses for many of its research tasks. “An industry that becomes disconnected from its true purpose will gradually become replaceable,” he said.9
Amgen’s brief history is a good place to start in understanding how the drug industry got into this fix. But to understand its early successes and more recent disappointments, one must first travel to Chicago, where a bullheaded scientist working at the dawn of the biotechnology era made a discovery that would provide hope, energy, and extended life for millions of people, and from which he would never earn a dime in royalties.