Five

Industrial Education

The legislation that would end up helping generations of young Americans escape lives of plowing, planting, and picking originally grew out of a movement to establish the nation’s first agricultural colleges. The progenitor of this movement was a professor named Jonathan Baldwin Turner, who had moved to Jacksonville, Illinois, in 1833, after earning a degree in classical literature at Yale. Turner was a deeply religious man who hoped to become a missionary out on the great frontier of the American West, among the Native Americans. But after seeing Potawatomi Indians suffering with cholera they’d contracted from white men like himself, he turned his attention to another of his era’s great race-based injustices, becoming the editor of a Jacksonville abolitionist newspaper, an operative on the underground railroad, and, in his classroom at Illinois College, an outspoken critic of slavery.

Turner was keenly interested in agriculture. In the late 1830s, he spent much of his time outside of the classroom searching for a plant that might be used as a hedge to divide large expanses of open prairie for tilling and cultivation. Eventually he settled on the Osage orange plant, which worked so well that he began selling the seeds to Illinois farmers, who used it widely as fencing, a precursor to the barbed wire that wouldn’t be developed until a few decades later. Turner’s interest in agriculture deepened after 1848, when he was pressured to resign from his teaching post by colleagues and school administrators who disapproved of his abolitionist activism. It was then that he established the Illinois Industrial League, an organization dedicated to advocating for a publicly funded system of higher education aimed at the “industrial,” or working, classes.

Turner unveiled his vision of a public university at the Putnam County Farmers’ Convention, held at the Granville Presbyterian Church on November 18, 1851, where he had been invited to speak by a group of local reformers called the Buel Institute. The group wanted Turner to discuss taking “steps toward the establishment of an agricultural university,” but for him it was a chance to publicize his new ideas about education, and he arrived with carefully prepared remarks on the establishment of colleges focused on agriculture and industry and open to all. As he became a recognizable figure, associated once more with what some considered to be radical ideas, Turner again paid a steep personal cost for his activism: in 1853, his farm was burned to the ground by opponents who felt America’s educational system was better left alone, to carry on in the model of religious colleges and the aristocratic academic institutions of Europe.

Turner was not without influence, however, and a good number of powerful people admired his ideas. In February of 1853, the Illinois legislature adopted a resolution, drafted by Turner, calling for the state’s congressmen to push for a land-grant bill to fund a national system of industrial and agricultural colleges, with one in each state. Illinois Senator Lyman Trumbull thought Turner’s idea was too important to fail and too radical to succeed if not properly executed. It stood a much better chance, he thought, if it were introduced by a congressman from an Eastern state, rather than one from the West, which was still an untamed and uncivilized frontier in the minds of many Americans. He convinced Justin Smith Morrill of Vermont to introduce the bill.

Morrill did not share Turner’s vision of an equal grant for each state to start its own industrial college. Instead, he opted for a bill that allocated land based on the number of senators and representatives each state had in Congress, giving an advantage to more populous Eastern states like his own. It took years for the Morrill Act to wind its way through the corridors of Washington politics, and in the meantime Michigan Governor Kinsley Scott Bingham seized the initiative in 1855, signing a bill establishing the Agricultural College of the State of Michigan, now known as Michigan State University, as the first so-called “industrial college” in the nation.

The Morrill Act was finally introduced to Congress in 1857, and it passed in 1859, but was vetoed by President James Buchanan, a northern Democrat who sided with Southern Democrats on the issue of education, which, along with slavery, they contended was a state matter, not a federal one. It was reintroduced in 1862, with amendments guaranteeing that public colleges would teach military tactics, engineering, and agriculture, but more than any of these amendments, the new iteration of the bill was helped by the fact that the states opposing its passage most vehemently had seceded from the Union by the time it crossed Lincoln’s desk on July 2, 1862. It was the last of three laws he signed that day, after one banning polygamy in the territories and another requiring a loyalty oath for government officials, and it followed just a day after he scrawled his signature on the land-grant bill that funded the transcontinental railroads.

As much as the railroads, America’s public university system generated immense wealth and economic opportunity. Very much unlike the railroads, which reproduced inequality by generating wealth for the wealthy first, public universities helped transform the poor, serf-like taxpayer into a better educated, more empowered citizen. And so these universities would also help pave the way for a growing American middle class, creating the country’s most stable and enduring model for social mobility simply by building colleges meant for people from all walks of life.

By 1867, twenty-two states had accepted the federal grant of thirty thousand acres of land per congressman for the endowment of a public college built “upon a sure and perpetual foundation, accessible to all, but especially to sons of toil.” These schools, focused on “agriculture and the mechanic arts,” included Kansas State Agricultural College, Illinois Industrial University at Urbana–Champaign, the Agricultural College of Minnesota, and Cornell University. Existing schools, like Brown University, gained agricultural or industrial programs because of the Morrill Act.

Engineering and agriculture were, at the time, cutting-edge fields, making land-grant universities as trailblazing as the first colleges to offer computer science courses. They were also radical in terms of admissions: not only did they educate people who previously didn’t go to college, but many of them admitted women on an equal basis with men, though they would not be required to do so until many decades later. And by admitting nearly any student who earned good marks in state schools, they established a pipeline that allowed non-elites and the nonreligious to think constructively about an education beyond their immediate circumstances.

President Franklin D. Roosevelt invested anew in the possibility of an egalitarian and meritocratic higher education system, first with New Deal programs giving public universities the funds to build six hundred new campus buildings, then with the Servicemen’s Readjustment Act of 1944, better known as the G.I. Bill. This bill offered Americans returning from the Second World War benefits like cut-rate mortgages, health care, and money for college, which proved to be such an effective tool for rebuilding American society in the aftermath of the war that it led to more government programs aimed at building up the country’s higher education system. The U.S. government got into the student loan business with the National Defense Education Act of 1958 and the Higher Education Act of 1965, which extended student loan programs while adding work-study programs and federal grants to the aid options available to students. These programs especially benefited non-legacy and first-generation college students, who enrolled in far greater numbers after all land-grant colleges were at last forced to become racially integrated.

In the wake of the Second World War, the public university became a primary instrument for reconnecting American society with the idea that democracy was firmly rooted in a kind of meritocracy that flourished with greater access to higher education. The movements for civil rights and women’s liberation added to this rising pressure for a truly equal university system, giving more Americans access to the “germ of power” described by Tocqueville.

With the Cold War, however, came a new set of imperatives—from various branches of the U.S. government, its military, and its intelligence apparatus—for funding America’s public universities. Transformed into the world’s best research institutions by millions of dollars in government funding, some of it covert, America’s universities became instruments of empire. In the process, a terrifying truth came to light: America’s public universities are fundamentally weak at an institutional level, open to virtually any kind of corruption and influence from outside forces. Throughout the Cold War, at the level of the institution, the administration, the academic department, and the individual scholar, America’s universities proved themselves willing to do virtually anything for money, especially when they were allowed to operate in the shadows.


The Central Intelligence Agency (CIA) began partnering with American universities as early as 1950, with a series of psychological experiments that were eventually brought under the umbrella of one many-tendrilled program called MK-Ultra. The MK-Ultra program is largely remembered for the many failures of its first phase, which lasted from 1953 to 1956: a bizarre hodgepodge of esoteric experiments involving hallucinogenic drugs, hypnosis, and suggestion, with results that ran the gamut from the hilarious to the morbid—from CIA experiments at Stanford that administered LSD to Ken Kesey to former tennis pro Harold Blauer dying at the age of forty-two in a New York State psychiatric ward after being used, in the words of a federal judge, “as a guinea pig in an experiment to test potential chemical warfare agents for the U.S. Army.”

This first phase of MK-Ultra was also memorable for its immeasurable cruelty. When experiments called for what CIA Director Allen Dulles deemed “extraordinary techniques,” unwitting or unwilling human subjects were obtained using a vast array of methods ranging from random to exceptionally strategic. North Korean prisoners were used as unconsenting test subjects for injectable drugs, for instance, while hallucinogens, which could be more easily administered to unsuspecting targets, were used on private U.S. citizens ranging from a summer camp full of children to men who had been lured into a CIA safe house by prostitutes who had been paid to dose them with LSD.

What is often forgotten about MK-Ultra is that its second, less spectacular phase, which ran from 1956 until 1963, became the basis for the CIA’s first manual for interrogation and torture. This was, in many ways, a logical progression considering the fact that an MK-Ultra precursor, called Project Bluebird, used as its starting point Nazi interrogation techniques. In fact, the CIA might have arrived at the study of torture sooner if not for an almost pathological early obsession with LSD experiments, which one MK-Ultra scientist later testified was based on rumors that the Soviet Union had secured a massive supply of the drug. When the second phase of MK-Ultra did arrive, the psychology and human behavior research developed from it turned out to be far more consequential than the dark comedy of the program’s first act.

The CIA’s accomplice in its MK-Ultra research was America’s university system. Between 1953 and 1963, MK-Ultra and related programs paid out $25 million for human experiments performed at eighty different institutions, including twelve hospitals and forty-four universities. The academics and medical researchers who participated in MK-Ultra experiments, and who received covert CIA funding for their efforts, seemed to know what kind of devil’s bargain they were making: Richard Helms, who oversaw the project, convinced CIA director Dulles that the Agency’s university contractors should be allowed to work without signing formal contracts.

Helms recognized that academics and researchers were “most reluctant to enter into signed agreements of any sort which connect them with this activity since such a connection would jeopardize their professional reputations.”

Some of the results from these experiments were also secret, known only to the government agencies that funded them. Experiments conducted at Columbia University, for example, produced both secret results that were reported only to the CIA, and public results that were published widely in medical journals, with key facts about methodology obfuscated for obvious reasons. Consequently, it’s anyone’s guess how reliable any research study funded by MK-Ultra might prove to be if repeated under intellectually honest conditions.

When the CIA’s years-long obsession with LSD and other hallucinogens came to an end, the agency began building more formal relationships with universities, often through intermediaries like the Office of Naval Research (ONR), which had become an early patron of experimental psychology starting in 1950. Within just a few years, the ONR was sponsoring psychological research at fifty-eight universities, in close collaboration with the CIA, with an end goal of developing more effective methods of interrogation and torture. This research was often funded through a system of philanthropic money-laundering that funneled CIA funds through private foundations, like the Ford Foundation and the Rockefeller Foundation. In this way, somewhere between $7 million and $13 million flowed into universities each year for research on behavioral science. The Agency also funded this kind of research through slightly more direct channels. In 1950, the U.S. government established the Bureau of Social Science Research (BSSR) at American University, which was used to underwrite studies on the effects of torture on prisoners of war, under the guise of social-psychology research. Among the BSSR studies conducted at American University was an Air Force–funded experiment aimed at determining the relative utility of drugs and violence for the purposes of interrogating prisoners.

Some of the most important psychological studies of the century were conducted at research universities funded by the CIA, including Yale psychologist Stanley Milgram’s controversial experiments, which concluded that virtually any person could be made to torture another human being. The eminent psychologist Albert Biderman, whose experiments were funded by the CIA both directly and indirectly, accidentally revealed the sinister aim of his research in a widely published study on isolation. Psychological torture, Biderman concluded, was “the ideal way of ‘breaking down’ a prisoner,” and isolation produced a similar effect to being beaten, starved, or forced to endure sleep deprivation.

University psychology departments were especially easy targets for government intelligence agencies to infiltrate because so many psychology professors had served in the military during the war. Even those who hadn’t were likely to have done contract research for the Pentagon, which had spent broadly on research aimed at understanding what might happen to its soldiers in various phases of combat. The CIA sent operatives each year to the American Psychological Association (APA) conference, with orders to build relationships and report back on people who were doing particularly relevant or interesting research. Such close ties were developed with so many professionals in the field that the CIA sometimes used government aircraft or chartered planes to transport groups of psychology researchers and professors to conferences. And however outlandish that may sound, these were far from the boldest or most visible partnerships between the U.S. government and the public universities it sought to co-opt through investing in research projects.

In April 1965, a University of Pittsburgh anthropology professor named Hugo Nutini traveled to Chile to recruit scholars for a research project focused on counterinsurgency measures and the process of political revolutions in the developing world. Nutini was born in Italy but grew up in Chile. He began working at the University of Pittsburgh in 1963, just as the university was expanding its social science programs, like many other schools around the country. Behind this rapid expansion of interdisciplinary university programs was a U.S. Department of Defense (DOD) program called Project Camelot.

Launched in 1963, Project Camelot sought to predict and control revolutionary political movements in third-world nations through the use of advanced research on group psychology and behavior—the winning of “hearts and minds” through psychological warfare, marketing, and manipulation. The project emerged from the Army Office of Research and Development’s growing worries over “wars of national liberation” in countries like the Belgian Congo, Cuba, and Yemen. But the program itself spiraled out of control in the spring of 1965, when Nutini began telling Chilean scholars about his research project, which he claimed was funded by civilian agencies like the National Science Foundation.

What Nutini didn’t know was that the scholars he was trying to recruit knew that he was working as an advisor for the U.S. government, and that his research projects were, in reality, being funded by Project Camelot. The secret program had been quietly undone by a Norwegian sociologist named Johan Galtung, who had seen Project Camelot’s preliminary research design, including its military sponsorship, and leaked the information to his Chilean colleagues. The leak exposed Nutini, who was not only disgraced but banned from ever returning to the country where he was raised. Project Camelot was canceled by Secretary of Defense Robert McNamara on July 8, 1965.

The widespread media coverage that exposed Project Camelot forever changed attitudes toward U.S. academics in the developing world, where professors could now face credible accusations of collaborating with the CIA or other U.S. government agencies. It also marked a shift in attitudes at home: these alliances were not anything like Princeton University’s Listening Center, which had been established in 1939 to monitor Axis broadcasts before this task was handed over to the Federal Communications Commission (FCC); there was no war effort to support and there was no clear division between civilian academic consultants like Nutini and CIA or DOD operatives. Project Camelot was a dubious exercise in empire building, crafted by architects with no regard for academic objectivity or the sovereignty of other nations, and undertaken by a mix of covert professionals, covert amateurs, and unwitting academic accomplices.

Government funding for projects like Camelot continued despite the controversy. And just two years after the operation’s scandalous end, Arthur H. Brayfield, the director of the APA, once again rubber-stamped defense research of the sort that had dragged the field into such murky territory through operations like MK-Ultra and Project Camelot.

“I think the military should be free to use all reasonable, ethical, and competent tools at its command to help carry out its mission,” Brayfield said. “And I would say strongly that the use of behavioral science and behavioral scientists is one of those useful tools.”

An important shift came in 1967, when federal expenditures on what the government called the “psychological sciences” peaked at $158 million; after nearly two decades of covert, unethical, and occasionally incompetent employment of behavioral scientists, the DOD sought to improve its image and repair its damaged relationships with academics by openly spreading funds for social science research through Project THEMIS. In its first year, Project THEMIS distributed $20 million to schools around the United States, and two years later its budget had swelled to nearly $60 million—about one-third of the total annual budget for Pentagon programs funneling cash to American universities. One reason behind the success of Project THEMIS, which came along just two years after the public-relations nightmare of the Camelot scandal, was a calculated adjustment the DOD made to the types of academic institutions it funded. Instead of Yale, where Milgram studied the psychology of torturers, or MIT, where the DOD held early conferences to recruit behavioral scientists, THEMIS specifically targeted schools and departments that were underdeveloped and underfunded—a key insight that would later work to the advantage of corporations once the U.S. government began abandoning the universities that had by then reshaped entire departments and disciplines to suit the research appetites of the Cold War.


The social sciences flourished in the years after the end of the Second World War, but the intense existential competition of the Cold War also produced incredible advances in physics, chemistry, biology, and engineering, as government grants funded the development of bacterial weapons, ballistics, computers and communications, and radioactivity. Universities expanded and enrollments increased an incredible 122 percent between 1960 and 1970, the year when subsidies and grants finally began to slow as the government started relying more on think tanks than colleges. University administrators were forced to look elsewhere for patrons, and it wasn’t long before they found them in America’s corporate boardrooms.

This shift took place at an ideal moment for the American tobacco industry, which was then grappling with growing scientific evidence that smoking could cause cancer and other serious health problems. In a 1969 memo, an executive from tobacco giant Brown & Williamson spelled out the industry’s strategy for weakening the scientific consensus over whether smoking was harmful to your health.

“Doubt is our product, since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public,” the memo read. “It is also the means of establishing controversy.”

The tobacco industry’s methodology was later identified by researcher Lisa Bero, who outlined the industry template:

  1. Fund research that supports the interest group position.
  2. Hide industry involvement in research.
  3. Publish research that supports the interest group position.
  4. Suppress research that does not support the interest group position.
  5. Criticize research that does not support the interest group position.
  6. Change scientific standards.
  7. Disseminate interest group data or interpretation of risk in the lay press.
  8. Disseminate interest group data or interpretation of risk directly to policy makers.

This methodology, according to Bero’s research, was dangerously effective. Studies on secondhand smoke, when conducted by researchers with ties to the tobacco industry, were nearly 90 percent more likely to find no harm. By funding research, tobacco companies were able to control the research agenda, influence the design and methodology of studies, and build, through financial patronage, the kind of relationships necessary to help suppress research that might damage cigarette sales. Tobacco industry lawyers, in particular, were deeply involved in the design and execution of studies funded through the tobacco industry’s philanthropic bodies and its trade association. In some cases, industry-funded research was made to appear independent from tobacco companies by funneling payments through law firms like Covington & Burling, which represented Philip Morris. (Covington & Burling’s other clients included Halliburton, Blackwater Worldwide, and Chiquita Brands International, the last of which became the first U.S. corporation convicted of financing terrorism for paying protection money to the United Self-Defense Forces of Colombia, a paramilitary organization.) At tobacco companies like Brown & Williamson, lawyers were even responsible for disseminating the results of any scientific research and studies funded or carried out by the company.

One of the tobacco industry’s main vehicles for undermining the scientific consensus on smoking’s harmful effects was the Council for Tobacco Research (CTR), which was founded in 1954 as part of an industry-wide public relations effort. In the 1970s, as universities and other research institutions began to feel the effects of shrinking government investment, the function of the CTR evolved into something much more proactive, according to an internal 1978 Brown & Williamson memo.

“Recently it has been suggested that CTR or industry research should enable us to give quick responses to new developments in the propaganda of the avid anti-smoking groups,” Brown & Williamson vice president Ernest Pepples wrote. “The industry research effort has included special projects designed to find scientists and medical doctors who might serve as industry witnesses in lawsuits or in a legislative forum.”

Another tobacco industry group, called the Center for Indoor Air Research (CIAR), was founded in 1988 specifically to combat the growing scientific consensus that secondhand smoke was harmful. Between 1989 and 1993, the group spent more than $11 million funding mainstream peer-reviewed research projects, most of which examined indoor pollutants other than tobacco smoke. This served not only to divert attention away from tobacco but also to build up the reputation of the CIAR by associating it with respected research universities and peer-reviewed results. During those same years, the group quietly spent $4 million on non-peer-reviewed research projects that relied instead on “special-review” by a board of directors consisting of tobacco company executives. These studies focused mostly on undermining research on the harmful effects of secondhand smoke, and while they were less rigorous and relied on methodologies crafted by industry lawyers rather than scientists, they benefitted by association from the fact that CIAR funded so much high-quality peer-reviewed research. This means that even universities and professors who accepted tobacco industry funding only on the condition that it went toward conducting sound research unwittingly aided the industry’s efforts to undermine good science.

Tobacco companies were far from the only beneficiaries of the research-funding model that the Cold War helped create. In the 1970s, the University of Wisconsin made an agreement with the Kimberly-Clark paper company and the manufacturing corporation Kohler, among others, to conduct research that was either “too basic or costly for the companies to conduct themselves,” in exchange for financial backing from those companies. Similarly, Harvard Medical School received $23.5 million from Monsanto to perform cancer research. The Hammermill Paper Company not only funded but designed courses in economics at Gannon University in Pennsylvania and Springfield College in Massachusetts. They weren’t ordinary economics classes, but training courses for high school teachers, meaning that the curriculum designed by Hammermill Paper would trickle down to teenagers throughout the region. The aim of the course was to present a pro-business view of economics. Throughout the 1980s and 1990s, arrangements like these gave corporations and their broader networks undue influence on any number of academic institutions. They also gave conservative activists the opportunity to colonize young minds.

In the fall of 1993, Republican congressman Newt Gingrich taught a course called “Renewing American Civilization” at Kennesaw State University in Georgia, for which he solicited contributions from Republican donors and corporations. His goal, he wrote in letters soliciting funds for the course, was to use class lectures as a means of recruiting Republican activists ahead of the 1996 presidential campaign. The House ethics committee later found Gingrich in violation of campaign financing laws because the courses were aimed at indoctrination rather than education, and had been funded using tax-deductible donations from wealthy Republicans and corporations like Hewlett-Packard. In exchange for Hewlett-Packard’s financial support, Gingrich described the corporation in lectures as “one of the great companies in American history.” Corporate donors were also given the opportunity to present promotional videos for viewing in class.

Other aspects of corporate intrusion into the American classroom have taken hold so deeply that it requires some effort to imagine a time when things were different. There was a time, for instance, before the recession of the 1990s, when American universities did not have exclusive beverage partnerships with Coca-Cola or Pepsi, and did not host campus events like the University of Minnesota’s Diet Coke Classic volleyball tournament.

The 1996 Minnesota–Coke deal, which was among the earliest of its kind, was heralded by university administrators as “a national model for beverage partnering in higher education.” And it was only appropriate that Coke and Pepsi should head to college, because they had already graduated from America’s public elementary schools, middle schools, and high schools.


It was no accident of history that the first corporations to advertise inside America’s public schools did so in Colorado Springs. In 1989, when Kenneth Burnley took over as the superintendent of the city’s largest school district, the dozens of schools he was charged with overseeing were running a collective deficit of about $12 million. Classes were overcrowded, teachers were overworked, and facilities were rotting away faster each year as the district fell deeper into the red. By 1993, it had been nearly two decades since voters approved a tax increase to help fund public schools in Colorado Springs, and Superintendent Burnley knew that wasn’t likely to change in the near future. He decided he’d need to turn to someone other than the voters if he wanted to get more money for the schools in his district.

“This was the first school district in the nation to offer advertising opportunities,” he boasted. “Our taxpayers have challenged us to be more creative and businesslike in how we finance the schools, so we decided to take a page out of business’s book. I realized we could sell for cash something we always had, but never knew we had.”

What Burnley had was access to students, whose undivided attention he sold to some fifty different corporate clients for rates ranging from $1,500 to $12,000—advertisements on school benches and buses, loudspeaker announcements at football and basketball games, and, most important, an exclusive contract with Coca-Cola paying the school district $8.4 million over the course of ten years. The decade-long contract with Coke meant that District 11 students who entered the first grade in 1993 would make it halfway through high school before the deal expired, spending several hours of each day, for nine months out of each year, surrounded by Coke products and advertisements. Many of them would have no idea that schools had not always been this way, or that other schools were any different. It was not only a grand marketing experiment, but an experiment in sales techniques, with contractual incentives intended to push district officials to see that their schools sold as many Coke products as possible. These incentives worked.

During the fall of 1998, John Bushey, who oversaw the school district’s contract with Coke, wrote a letter to school administrators urging them to move vending machines to areas where they would be “accessible to the students all day.”

“Research shows that vendor purchases are closely linked to availability,” he wrote.

Bushey also pressed teachers to allow students to drink soda in the classroom, perhaps because the district’s contract offered nearly $3 million in bonuses if the schools sold more than 70,000 cases of Coke products each year. The arrangement proved to be so successful that Coca-Cola and its competitor, Pepsi, began vying to sign similar deals with school districts around the country. It was a valuable investment for these corporations: in 1997, children aged four to twelve spent more than $24 billion, while twelve- to nineteen-year-olds accounted for another $141 billion in total spending.

Dan DeRose, a middle-aged former professional football player who helped broker corporate advertising deals for the school district in Colorado Springs, sensed that this was a profitable niche market waiting to be cornered. DeRose started his own company, DD Marketing, and began brokering million-dollar deals between soda companies and public schools and universities all over the country.

“Our philosophy,” he said, “is if you’re going to allow corporate America into your schools, maximize your return.”

One day in May of 1999, DeRose was giving a presentation on the question of how to maximize returns for his latest client, Newark Public Schools, which had become so underfunded and mismanaged that the state of New Jersey had taken control of them a year earlier.

“Let’s just say everyone drinks one product a day, and let’s just count the students,” he said. “At 45,000 [students] times 180 days of school, that’s 8.1 million cans. At seventy-five cents apiece, that’s six million dollars walking out the front door of your school every year in quarters and dollars.”

His audience was a gathering of school principals, athletic directors, parent-teacher association officials, and a single student from Newark Public Schools, which was seeking bidders on a contract to supply the district’s eighty-two school cafeterias and countless vending machines with soda, water, and other beverages. The buyer would need to be either Coke or Pepsi, DeRose advised, because these companies paid a premium for the opportunity to imprint brand loyalty on so many young minds. He used his own daughter, Anna, as an example, since he had negotiated the Coca-Cola contract with the Pueblo, Colorado, school where she attended first grade.

“From now until she’s graduated, all she’ll drink is Coke,” DeRose said. “She goes out for pizza and we ask, ‘What do you want to drink, honey?’ ‘Coke’; she doesn’t even know how to spell Pepsi.”

Between Colorado and Newark, DeRose negotiated a $5.3 million deal in Kansas City, earning himself a commission of between 25 and 35 percent, just like the other sixty-two deals he’d negotiated.

When he landed his sixty-third deal with Coke in Newark, the first thing he did was draw up a plan to install eight of the soda company’s vending machines in each of the district’s thirteen high schools, and four at each of its sixty-nine elementary and middle schools.

For the schools, the appeal of corporate sponsorship was simple: most states were following Oregon’s lead with education funding cuts that forced public elementary schools, middle schools, and high schools to compete with state universities for access to dwindling money from a state general fund. Corporate contracts worth millions of dollars were an attractive alternative to the traditional methods of raising funds for public schools.

“The thought of generating that kind of revenue stream, where you didn’t have to raise property taxes, you didn’t have to increase the sales tax, and you didn’t have to do battle with the county commissioners, was very attractive,” said Barry Gaskins, a school district official in Greenville, North Carolina.

By the late 1990s, Coca-Cola had signed deals affecting more than a thousand schools across the country, including those in Washington, D.C., where a school official called the contract “a godsend.” In exchange for the exclusive right to sell Coke products in D.C. public schools, the district received 65 percent of the sales from Coca-Cola, incentivizing schools to sell more soda to their students. This sometimes totaled as much as $50,000 per month for the school district.

Penny McConnell, a former president of the American School Food Service Association, opposed these kinds of contracts vehemently.

“I think it would be a real negative, a bad situation, for a district like ours that’s working so hard to promote nutrition to then profit from ripping off kids with nonnutritious foods,” McConnell said. “If I’m going to teach nutrition, I’m going to serve nutrition.”

The National Soft Drink Association called her argument “an insult to consumer intelligence,” which suggests that corporations thought of children in precisely the same terms as any adult demographic market.

Oxon Hill High School, in nearby Prince George’s County, Maryland, signed a three-year contract with Pepsi, which principal David S. Stofa said paid for everything from landscaping to a scoreboard for athletic events.

“The kids know all that Pepsi has done for the school,” he said. “And they really appreciate it.”

At Greenbrier High School in Evans, Georgia, two students were suspended for wearing Pepsi T-shirts on the school’s official Coke Day, which was established after the school won a district-wide contest called Team Up with Coca-Cola. In exchange for Coca-Cola’s modest $500 prize, principal Gloria Hamilton induced her students to distribute Coke coupons near the school, posed them for an aerial photograph wearing color-coordinated shirts that spelled out “Coke,” and, during class time, made them listen to speeches from Coca-Cola executives. When two students stripped off their sweaters to reveal Pepsi T-shirts underneath, she suspended them and accused them of “trying to destroy the school picture.”

“It really would have been acceptable if it had just been in-house,” she said. “But we had the regional president [of Coca-Cola] here and people flew in from Atlanta to do us the honor of being resource speakers.”

In time, lack of state funds for public schools made K–12 education so dependent on contracts with soda corporations that a school in Ohio defied a state order that it comply with federal laws against selling soda and candy during the lunch break. And in Maryland, school districts became open allies with beverage-industry lobbyists and vending machine companies as they banded together to defeat legislation introduced by Senator Paul G. Pinsky, which sought to limit commercialism in public schools.

“The lobbyists kicked my ass,” Pinsky said.


Most industry partnerships with university researchers began much later, and far less malevolently, than those undertaken by the big tobacco companies. For many industries, the relationship began simply enough, in the 1970s, as corporations outsourced certain research to universities desperate to bring in cash during the waning years of the Cold War, when the military slowed its seeding of college research projects. The shift in funding was dramatic: in 1965, the federal government was responsible for financing more than 60 percent of all the research and development taking place in the United States, while four decades later the reverse would be true, with 65 percent of this same work being funded by various private interests. Initially, serious researchers, especially in medicine, sought to maintain “arm’s-length relationships with their corporate sponsors,” according to Marcia Angell, a former editor of the New England Journal of Medicine. But in 1980, a landmark piece of legislation called the Bayh–Dole Act changed the nature of research-based scientific inquiry at American universities.

The Bayh–Dole Act gave universities and professors automatic ownership of the federally funded research they produced, as well as the right to sell and market that research. More than anything else, the Bayh–Dole Act changed the avenues that academic research took on its way to industry. Prior to 1980, academic knowledge was monetized openly, through publications, conferences, and consulting that was more or less transparent. Since Bayh–Dole, university research has been quietly patented and licensed, creating more direct, less transparent financial ties between universities, professors, and the corporations to which they license their patented discoveries. And when it comes to trials and studies, there is evidence that researchers tend to reach the conclusions most desired by the interest groups funding their work. Studies comparing different cholesterol medications, for instance, are twenty times more likely to favor the drug owned by the company that funded the study over its competitor’s. And one analysis in the British Medical Journal found that a drug was four times more likely to be looked on favorably in trials if the research had been funded by the pharmaceutical company that owned the patent to that drug.

Two decades after Bayh–Dole’s passage, universities were generating ten times the number of patents they had prior to the landmark legislation, and corporate funding of university research surpassed $2 billion annually. Relationships between pharmaceutical corporations and medical researchers, meanwhile, became so cozy that the editors of the New England Journal of Medicine, the Lancet, and other leading medical journals were forced to come up with policies to determine whether an author could vouch completely for the claims they were advancing, and whether they even had full access to their own trial data. Drummond Rennie, an editor at the Journal of the American Medical Association, said that past a certain point, all they could really do was print a correction or notice when they found that they’d printed some falsehood advanced by a researcher on behalf of their corporate benefactor. “Usually they’re lying on their way to the bank,” he said of such characters. Over time, the bank opened up more branches, until conflicts of interest spread from university researchers up to administrators. One startling example of this could be found at Stanford University’s medical school, where, by 2006, one-third of all the school’s administrators and department heads had reported financial conflicts of interest related to their own research. Seven out of ten members of the committee tasked with policing conflicts of interest at the school also reported conflicts, which included consulting fees, stock options, and shared or licensed patents.

This kind of anything-goes attitude toward research funding eventually found its way into government agencies, and it didn’t need to use the back door to get in—it was, in fact, an invited guest. During the presidential administration of George W. Bush, the Environmental Protection Agency, which is tasked with enforcing the laws meant to protect our environment, responded to budget pressures by outsourcing certain studies to the very groups meant to be subject to those laws and accountable to the agency. Consequently, the American Chemistry Council paid $2 million toward studying the effects of pesticides and household chemicals on young children in 2004; and in 2007, it was America’s livestock producers, not its government or taxpayers, who paid for the first nationwide study to determine whether large-scale animal production had any detrimental effects on air quality.

Today, America’s public universities are a veritable banquet on which corporate interests dine at their discretion. At the University of California, Davis, which grew out of the Morrill Act, Turner’s idea of “industrial education” has taken on new meaning. Pharmaceutical representatives visit the campus regularly for roundtables, meet-and-greets, and coffee hours with professors, who must often sign nondisclosure agreements once introductions have been made and business cards have been exchanged. This is not particularly unusual. In fact, only America’s original agricultural college, Michigan State University, maintains a public database showing how its research is funded. Virtually all other public universities go to extraordinary lengths to avoid disclosing which corporations are funding them and how much money they’ve been given, often by receiving tax-deductible donations through university foundations, which are exempted from public records laws in many states. The corporations themselves are usually more forthcoming about the universities they fund—the Dow Chemical Company lists on its Web site thirty-four universities it considers “academic partners,” including Stanford, MIT, Cornell, and state schools like the University of Minnesota and Texas A&M. The corporations have philanthropic credit to gain by broadcasting these relationships. The universities, however, stand to lose academic credit.

At Purdue University, it’s food and chemical companies who send representatives to visit the campus, at a rate of nearly one company per day, in search of partnerships in the school’s department of food science. Companies like Conagra, Pepsi, and Nestlé pay Purdue $5,000 a year for a vast array of privileges that include prepublication review of relevant research, influence over the department’s curriculum, and access to students. Some of the school’s professors even take periods of academic leave to collaborate directly with Purdue’s corporate partners. Such partnerships typically have little to do with developing the next big snack cake.

In 2014, as it struggled to combat a slump in sales of sugary drinks, Coca-Cola helped put together a nonprofit group called the Global Energy Balance Network (GEBN), staffed with industry-friendly professors from the University of Colorado School of Medicine and the University of South Carolina (USC), and armed with $1.5 million from the soda company. The goal, according to a leaked email from a professor to a top Coke executive, was to “help your company avoid the image of being a problem in peoples’ lives and back to being a company that brings important and fun things to them.” When The New York Times revealed that Coke was funding the group’s research, which emphasized exercise over diet in combating obesity, the University of Colorado said it would return the $1 million it received from the soda company, not because of ethical concerns but because it was creating a distraction. The University of South Carolina kept the $500,000 it received from Coke, saying that there was really no way to distance itself from the controversy since USC professor Steven N. Blair, who has received millions of dollars from Coke over the years, remains one of the leaders of the GEBN.

A year later, the International Life Sciences Institute (ILSI), which is funded by Coca-Cola, Hershey, and Red Bull, released a research paper suggesting that there is weak evidence behind dietary recommendations on limiting sugar intake—a claim so outrageous that the candy company Mars, Inc., which also funds the ILSI, disavowed the paper. Annals of Internal Medicine, which published the study, eventually issued a correction noting that one of the study’s designers, Joanne Slavin, failed to disclose a $25,000 grant she’d received from Coca-Cola for research she’d done at the University of Minnesota in 2014. She called it an oversight, something that is bound to happen from time to time, and promised to include the Coca-Cola grant in a new disclosure she needed to file for an oatmeal study funded by Quaker Oats, which is owned by Coke’s chief competitor, Pepsi.

It’s easy to understand how a professor could forget one or two of their corporate benefactors these days: Iowa State University’s athletics department is now brought to you by Dow and Monsanto; the University of Michigan receives financial consideration from various automobile and telecommunications companies; and the University of Washington has Amazon Catalyst, which provides grants across all disciplines and departments for any idea that grapples with a pressing societal issue.

It’s been a long time coming: at a 2005 panel that included the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine, former Food and Drug Administration (FDA) commissioner Jane E. Henney said of university-level researchers: “It’s getting more difficult to get that pure person with no conflicts at all.”


There are arguments to be made for the absolute efficiency of markets, but it’s worth remembering that the atom bomb, the Internet, and a great number of lifesaving medicines were created by government-funded research, long before the passage of the Bayh–Dole Act. It’s also worth considering some of the ways in which markets fail to be efficient. One example is a drug called sofosbuvir, which was discovered in 2007 by medical researchers who then sold the patent to Gilead, a U.S. pharmaceutical company, giving the company exclusive rights to market the drug for twenty years.

Under the trade name Sovaldi, the drug was approved for sale in the United States as a hepatitis C treatment in 2013, and the profit potential was immediately clear to Gilead, which earned $10.3 billion from the drug in its first full year on the market. These massive revenues were due in part to the fact that a full twelve-week treatment with Sovaldi costs an American patient $84,000—a price so steep that in states like Louisiana, Medicaid would not help cover the drug except in cases where the patient’s liver had become severely damaged by the disease. Then, in 2015, the company hit a potentially disastrous snag with its international sales when India’s patent office rejected Gilead’s application to recognize its patent on sofosbuvir. This decision was eventually overturned, and an Indian patent was granted, but not before dozens of Indian firms began producing generic versions of the drug. In the face of crippling losses to counterfeiters, Gilead chose to license the drug to thirteen Indian drug manufacturers, allowing them to sell generic versions to Indian consumers at a dramatic discount—Cipla’s licensed, legitimate generic brand of sofosbuvir costs $14 per pill, compared with Gilead’s Sovaldi, which costs Americans $1,000 per pill; the full course of treatment recommended for hepatitis C sufferers, which costs $84,000 in the United States, can be had for $1,200 in India, a discount significant enough to boost India’s thriving medical tourism industry.

The Bayh–Dole Act has, in other words, created a market system so inefficient that U.S. consumers are better off seeking this lifesaving treatment in a foreign country, where criminal counterfeiting syndicates and government bureaucrats collaborate in order to strong-arm the patent holder into licensing generic versions of the drug for sale at humane prices. It has also made possible a market in which there may be low-cost, equally effective alternatives to drugs like sofosbuvir that consumers will never hear about due to contracts between researchers and their corporate benefactors.

In the 1990s, a British pharmaceutical corporation called Boots UK gave clinical pharmacist Betty Dong $250,000 to conduct a study at the University of California, San Francisco, comparing its own hypothyroidism medication with lower-cost alternatives made by its competitors. But instead of demonstrating its drug’s superiority, the study showed that hypothyroid patients could save $356 million annually if they used alternatives to Boots UK’s Synthroid, which dominated the $600-million market for synthetic thyroid hormones. Boots barred Dong from publishing her blind-reviewed study in the Journal of the American Medical Association, citing fine print in her contract stating the results “were not to be published or otherwise released without written consent” from Boots UK. When word of the study’s results leaked and ended up in the pages of the Wall Street Journal, the pharmaceutical company attacked the very research it had funded. Dong, who was still contractually bound to the agreement she’d signed, was unable to defend herself from the company’s attacks on her work.

There’s reason to believe that this kind of suppression continues unabated. In 2006, a Cleveland cardiologist named Steven Nissen was searching the Internet when he stumbled across forty-two clinical trials for GlaxoSmithKline’s top-selling diabetes drug, Avandia, which caught his attention because he knew only fifteen of the clinical trials had ever been published. Nissen had been concerned for some time about possible health risks associated with Avandia, but had been unable to convince the pharmaceutical giant to release any research on the drug. The clinical trials he found online were there, it turned out, due to the outcome of a 2004 lawsuit alleging that GlaxoSmithKline had concealed negative trial data associated with another one of its drugs, the antidepressant Paxil. An independent analysis of the Paxil trial data showed that children who took the drug were twice as likely to experience suicidal ideation as those given a placebo, and the ensuing settlement forced the pharmaceutical company to post the results of all of its clinical trials online, where Nissen found the data necessary to satisfy his curiosity about Avandia. He published what he found in the New England Journal of Medicine, reporting that Avandia raised the risk of heart attack by 43 percent, prompting the FDA to place its most severe warning on the diabetes medication’s label. An independent analysis carried out by the FDA came to a remarkably similar conclusion, and yet Nissen soon found himself the subject of attacks from colleagues, including Valentin Fuster, who wrote a critique of his work in Nature Clinical Practice Cardiovascular Medicine. Fuster, it turned out, served as the chairman for GlaxoSmithKline’s Research and Education Foundation, and received funding from the pharmaceutical giant.


In 2007, the University of California, Berkeley, entered into a historic and highly unusual partnership with the British oil company BP: the petroleum giant would spend $350 million building an Energy Biosciences Institute (EBI) at the Berkeley campus, and in exchange it would reap the benefits of the school’s research into biofuels. More controversially, BP would be allowed to lease commercial research space on campus, where its employees could work side by side with students and professors. And while the school’s own research would be publishable, anything done specifically for BP would be proprietary, meaning that research funded in part by taxpayers, and carried out in publicly owned research facilities, could be kept secret forever under the arrangement. It was a controversy of the sort UC Berkeley had seen before.

In The Uses of the University, UC Berkeley’s first chancellor, Clark Kerr, described a postwar “multiversity” faced with balancing the interests of the government, the military, and private industry, as well as the interests of students and professors. A year after its 1963 publication, Kerr’s book came to represent, for student protestors, a tacit endorsement of military and corporate incursions into university life and the university mission. Nearly four decades later, it was difficult to see these concerns as anything but prescient; Kerr himself was forced to admit that his notion of a “multiversity” was increasingly vulnerable to corruption by private interests.

“The university ought to remain a neutral agency,” Kerr said. “Devoted to the public welfare, not private welfare.”

Kerr’s concern over the university’s waning dedication to the public welfare was prompted by UC Berkeley’s 1998 partnership with the Swiss pharmaceutical giant Novartis. Over the course of five years, Novartis, which also produced genetically engineered crops, would fund $25 million worth of research at UC Berkeley’s Department of Plant and Microbial Biology. In exchange, the school granted Novartis the right to negotiate licenses on about one-third of the department’s discoveries, including those which were funded by state and federal taxpayers. Most controversially, the agreement gave Novartis two of five seats on the department’s research committee, which was tasked with making crucial decisions about how the department would spend its money; the other three committee seats were held by professors who had received, in total, more than half a million dollars in research funding from Novartis.

The backlash was immediate: students protested, the media descended, and faculty in the Department of Plant and Microbial Biology were isolated from their colleagues, who avoided discussing interdisciplinary research with them for fear that it would end up in the hands of Novartis. One UC Berkeley professor, Ignacio Chapela, criticized the partnership despite the fact that he had previously worked for Novartis.

“I’m not opposed to individual professors serving as consultants to industry,” Chapela said. “If something goes wrong, it’s their reputation that’s at stake. But this is different. This deal institutionalizes the university’s relationship with one company, whose interest is profit. Our role should be to serve the public good.”

In the spring of 2000, the controversial partnership became the subject of tense hearings in California’s state senate, where some Berkeley faculty learned for the first time that their contract with Novartis was not in every way what it had seemed. Access to genomics data owned by Novartis, which had been a key selling point for many UC Berkeley researchers, was contingent on signing a confidentiality provision with the Swiss company. This provision, Senator Steve Peace noted, would shrink the amount of agricultural knowledge available to the public while accelerating the monopolization of basic biological research tools necessary for certain kinds of plant breeding—important aspects of the global food system, increasingly under the control of a few large corporations. Four years later, after considerable rancor, an independent review conducted by a team from Michigan State University concluded what had long been obvious to critics of the deal: the Berkeley–Novartis partnership was a failed venture that should not be repeated.

And yet it was repeated, just a few years later, when Berkeley announced its partnership with BP. Many of the details were familiar: Berkeley’s corporate partner would help choose the institute’s director, while other high-level positions would be filled by appointees or employees of the oil giant; BP could end up co-owning some of the institute’s intellectual property; and despite the fact that Berkeley banned classified military research on its campus, on the basis that scientific breakthroughs are meant to be published and shared, the school’s agreement with BP allowed for a “no obligation to publish” clause. These ethically cloudy waters grew murkier still after BP’s offshore drilling rig, Deepwater Horizon, exploded in April of 2010 and began spilling oil into the Gulf of Mexico, sparking fresh concerns about the seventy projects Berkeley researchers were working on with the oil company.

Five years later, with oil prices plummeting and $100 million of BP’s promised funds unspent, the oil company exercised its option to pull a third of its funding in 2015 and in each of the following two years. The school, meanwhile, remained on the hook for everything it had spent on the endeavor, including money raised through tuition increases, even though BP’s thirst for biofuel research had been sated.

“All renewables are going to have to weather the storm,” institute director Chris R. Somerville said. “We may have to wait it out until oil becomes expensive again.”

Market conditions, in other words, forced the Berkeley institute to stop its research work on converting plants to fuel as it considered what other industries might benefit most from its laboratories and the expertise of its researchers—renewable lubricants, for instance, or perhaps converting sugarcane into jet fuel.

“All the big companies in renewable liquid fuels and chemicals” were on the table as potential replacements for BP, Somerville said, as though Berkeley were a NASCAR team in need of a new title sponsor.

The Bayh–Dole Act, which was meant to incentivize universities and professors to do significant, groundbreaking work, has instead forced one of America’s most impressive universities into a kind of mercenary science; in a case of the tail wagging the dog, the laboratory will let its patron lead the way—toward the next important breakthrough in science, technology, or medicine, perhaps, or maybe toward whatever will get the company’s executives through their next shareholder meeting unscathed. Berkeley’s disastrous partnerships with Novartis and BP were cautionary examples of all that could go wrong when public universities allowed private interests to lease vast swaths of their campus for years at a time. Phil Knight and the University of Oregon, meanwhile, moved toward a venture that seemed to be premised on a different takeaway from Berkeley’s woes: don’t lease what you can buy.