CHAPTER TWENTY-TWO

The Evil Twin

THE END WAS NEAR, and Warburg was preparing. In 1966, he sat down before a video camera and, speaking haltingly, read a short statement that highlighted what he saw as his greatest scientific accomplishments. “Since every real discovery in science means a revolution where there are victors and vanquished, every one of our discoveries has evoked long and bitter fights,” Warburg said. “All were eventually decided in our favor.”1

That the statement reads like an obituary was no accident. Warburg had not forgotten the jarring moment in 1938 when he had been mixed up with the other Otto Warburg and found his own obituary in an English newspaper. The other Otto Warburg, a botanist, did not look much like Warburg, but from a literary perspective, he fit the role of the doppelgänger. In literature, the doppelgänger is typically not only a twin but an evil twin, a dark alter ego who maintains a mysterious connection to the story’s protagonist. The doppelgänger is simultaneously alike and yet radically different, a mirror that exposes even as it reflects. The other Otto Warburg, driven by moral conviction, had stepped away from his science to campaign for the Jewish people. He was Warburg, only turned inside out.

THE WORD “DOPPELGÄNGER” first appeared in German author Jean Paul’s late eighteenth-century novel Siebenkäs at the moment when the twins in Paul’s story depart from one another. In the molecular doppelgänger tale of glucose and fructose, the two molecules that form sugar, the drama begins with a separation. Once sugar makes its way to the small intestine, it encounters an enzyme that separates its glucose and fructose halves.
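To make the chemistry concrete (a simple illustration; the intestinal enzyme responsible is sucrase), the separation is a hydrolysis: with the help of a molecule of water, each molecule of sucrose is split into one molecule of glucose and one of fructose, twins with the same chemical formula but different shapes:

$$\mathrm{C_{12}H_{22}O_{11}}\ (\text{sucrose}) \;+\; \mathrm{H_2O} \;\longrightarrow\; \mathrm{C_6H_{12}O_6}\ (\text{glucose}) \;+\; \mathrm{C_6H_{12}O_6}\ (\text{fructose})$$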

Released from their embrace, both glucose and fructose will travel from the small intestine to the liver, by way of the bloodstream. For glucose, which can provide energy for cells throughout the body, the journey might only be beginning. But most of the fructose will be metabolized in the liver, which will take the arrival of the fructose as a signal to turn carbohydrates into fat. (Fats, like carbohydrates, are assembled out of carbon, hydrogen, and oxygen.)

To diabetes doctors busily tracking glucose levels in the blood and urine, the fructose departing to the liver seemed inconsequential. Fructose appeared so benign that by 1979, the American Diabetes Association was encouraging diabetes patients to use it as a sweetener in place of sugar. Glucose had been singled out as the evil twin.2

Among those who noted the new fructose recommendation was Gerald Reaven, the Stanford University endocrinologist who had become the world’s leading authority on insulin resistance. Reaven grew interested in studying insulin in 1960 after reading about Rosalyn Yalow and Solomon Berson’s new method of measuring insulin levels. Over the next 20 years, he assembled a remarkable body of research that connected eating carbohydrates to insulin resistance, and insulin resistance to a host of metabolic problems, including high blood pressure, elevated triglycerides (fat in the blood), and low levels of HDL cholesterol.

Reaven was primarily focused on type 2 diabetes and cardiovascular disease. But given our current understanding of the connection between insulin and the pathways activated in cancer cells, what Reaven discovered when he turned his attention to fructose might be every bit as relevant to cancer: in mice, at least, fructose did not look benign at all. Eating large quantities of fructose seemed to be a direct route to insulin resistance and, by extension, elevated insulin. As Reaven later put it to the New York Times Magazine, the effect of fructose was “very obvious, very dramatic.”3

Reaven’s findings on fructose have been confirmed by many other researchers over the last 40 years, in both animals and humans. In one study, Kimber Stanhope, of the University of California, Davis, fed overweight and obese adults a quarter of their calories either as fructose-sweetened drinks or as glucose-sweetened drinks. The people drinking the fructose developed clear signs of insulin resistance; the people drinking the glucose did not. After reviewing her data, as Stanhope told CBS News, she “started drinking and eating a whole lot less sugar.”4
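For a sense of scale, a rough calculation (assuming a typical 2,000-calorie diet and the standard 4 calories per gram of sugar) shows what a quarter of one’s calories from sweetened drinks amounts to:

$$2{,}000\ \text{kcal} \times 0.25 = 500\ \text{kcal}, \qquad \frac{500\ \text{kcal}}{4\ \text{kcal/g}} = 125\ \text{grams of sugar per day}$$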

Such feeding studies can only tell us so much. The subjects in them eat more fructose than they would likely ever eat during the course of their normal lives. And the studies last only weeks or months, whereas the chronic diseases driven by insulin resistance develop over many years. Even so, the results make clear that fructose can send the liver into a frenzy of fat production. Recent data suggest that fructose will directly turn on the genes responsible for making fat in the liver.5

That fructose, a carbohydrate, appears to make us add fat more readily than the fat we eat is counterintuitive. But the basic concept—that carbohydrates can fatten the body—has been well understood for centuries. It was known in ancient Rome that to get the best foie gras (fatty liver), you first needed to feed your geese dates, which happen to be an excellent source of fructose. And it is unlikely that people in the ancient world found this method of fattening geese surprising. Galen had noticed that slaves who worked in the fields would grow fatter as the grapes and figs ripened. Centuries later, the same observation would be made about plantation slaves working to extract sucrose from sugarcane.

The famed German chemist Justus von Liebig described the phenomenon of carbohydrate-driven fattening in the early 1840s. The “herbs and roots consumed by the cow contain no butter,” Liebig noted, and “no hog’s lard can be found in the potato refuse given to swine.” Liebig, who also studied how bees turned fructose from honey into wax (a fat), thought the fattening effect of carbohydrates was self-evident. One “can hardly entertain a doubt,” he concluded, “that such food, in its various forms of starch, sugar, etc., is closely connected with the production of fat.” The effect was “undeniable.”6

THOUGH RESEARCHERS ARE still working out the specific cellular mechanisms involved, the path from a fattening liver to insulin resistance is now broadly understood. As fat accumulates in the liver, some will be stored locally, and some will be shipped off to fat tissues around the body for storage. When the fat can be safely tucked away in the fat tissue under our skin, it appears to do little metabolic harm. But as the supply of fat grows, storage capacity runs out. Now the fat will end up in places it never should. The liver itself will become marbled. The fat will make its way into the pancreas and even into our muscles.

This misplaced fat may be invisible from the outside, but it is far from benign. It drives inflammation and interferes with how our cells respond to insulin. To overcome that interference, or resistance, the pancreas has no choice but to secrete more insulin, and a dangerous cycle takes off. The additional insulin will likely make us fatter, which in turn can lead to still more misplaced fat and greater insulin resistance.

This particular model of insulin resistance, known as the fat “overload hypothesis,” helps explain why some people who are overweight do not develop metabolic abnormalities and why many people at supposedly “healthy” weights do. In a sense, fat beneath the skin is protective in that it offers a safe place for storage. Scientists have genetically engineered mice that can keep expanding their fat tissue and thus their safe storage capacity for fat. Such mice are said to be equivalent to 800-pound humans, but their metabolism remains healthy. Conversely, people with lipodystrophy—a genetic disorder that leaves them with very few fat cells—become insulin resistant while remaining extremely thin.

Sugar is not the entire story of insulin resistance. Any carbohydrate that is rapidly digested—beer, or bread, pasta, and cereals made with refined flour—will also spike glucose and insulin levels. If fat is eaten together with these carbohydrates, the insulin spike will lead that fat to be stored rather than burned, and that dietary fat, too, will contribute to the “overload” problem. But nothing appears to drive the first stages of the process quite like refined sugar. Drinking sugar—in soda or fruit juice—is thought to be worst of all.

Lewis Cantley, the scientist who pioneered the study of how insulin activates the pathways linked to cancer, is among the researchers who have grown alarmed about sugar. Cantley does not write popular books or articles. He has received several of the highest honors in his field, including the 2015 Canada Gairdner International Award, often a prelude to a Nobel Prize. Cantley, in other words, could hardly be more different from the diet doctors on TV or on the cover of magazines in the grocery store. But he has stopped eating sugar himself for a simple reason. His research has led him to the conclusion that today’s “high consumption of sugar” is “almost certainly responsible for the increased rates of a variety of cancers in the developed world.”

Cantley reached this conclusion based on the evidence connecting refined sugar to elevated insulin, and elevated insulin to cancer. But in Cantley’s mouse model of colorectal cancer, he found an even more direct relationship between sugar, in the form of high-fructose corn syrup, and cancer. Mice that were genetically engineered with mutations associated with colorectal cancer and then given a daily serving of sugar equivalent to the amount in a can of soda developed cancers that grew faster and bigger than those in mice that did not consume the sugar. The fructose in the sugar, Cantley saw, could both turn on the Warburg effect and provide the building blocks for fat molecules that the cancers use to grow. “The evidence,” Cantley said, “really suggests that if you have cancer, the sugar you’re eating may be making it grow faster.”7

While most of the fructose in sugar ends up in the liver, Richard Johnson, a fructose expert at the University of Colorado, explained that fructose can be metabolized by other tissues as well. He said that fructose appears to be able to directly fuel not only colon cancer but also cancers of the breast, lung, and pancreas. Though Johnson does not argue, as Warburg did, that cancer stems from a problem of cellular breathing, he believes that fructose is the “perfect food” for a growing cancer precisely because it helps cancer cells survive in low-oxygen environments. This can be especially important when a cancer spreads to a new location and doesn’t yet have a reliable blood supply to provide oxygen. “If you want to make a cancer happy,” Johnson said, “feed it fructose.”

According to Johnson, the same phenomenon can be seen in animals, like the naked mole rat, that live in low-oxygen environments. In their underground homes, the mole rats have less oxygen than most animals could ever tolerate. They manage by converting a portion of their meals into fructose, which in turn increases fermentation and makes the mole rats less dependent on respiration. For the mole rats, fructose is necessary for survival. And that we crave the taste of fructose suggests that our primate ancestors benefited from it as well. Johnson, in collaboration with the anthropologist Peter Andrews, has put forward a hypothesis that we evolved from prehistoric apes that migrated from Africa to Europe and back. In Europe, these apes initially managed to find fruit for much of the year, but around 12 million years ago, Johnson and Andrews suggest, a cooling period set in that left the apes facing long winters with very little food.

As the apes starved, the capacity to store even a little extra fat was the difference between life and death. Both genetic and fossil evidence suggest that it was during this period of starvation that a mutation arose in the gene coding for the enzyme uricase. While fructose could already be converted to fat in these apes, the uricase mutation made fructose all the more fattening. During a long period without food, survival of the fittest, Johnson wrote, became “survival of the fattest.”8

CANTLEY AND JOHNSON are only two of a growing number of researchers who are alarmed about sugar and cancer. Michael Pollak, who runs the Division of Cancer Prevention at McGill, is another. “Glucose- or fructose-based drinks are really among the most unhealthy foods that you could imagine,” Pollak said. It’s okay to have a little sugar, but it should be consumed like a condiment, “in the same way we have pepper.”

Precisely how much sugar is too much may be different for each person, depending on one’s genes and age and exercise habits and capacity to store fat safely. But the path from the refined sugar added to our diets to insulin resistance, and from insulin resistance to cancer, is now well understood and based on widely accepted science. For that reason, the science journalist and author Gary Taubes believes that sugar can be thought of as “a primary cause” of cancer and other diseases linked to insulin resistance and elevated insulin.9

Taubes, who studied physics at Harvard and engineering at Stanford, has spent the last 20 years researching and writing about the links between insulin, obesity, and the chronic diseases associated with the modern Western diet. He does not claim with absolute certainty that eating lots of sugar leads to these diseases, only that it is the simplest answer that fits with all of the available evidence and so, according to the principle of Occam’s razor, should be considered the most likely explanation.

“Too much sugar” might be the simplest explanation for the many obesity-linked cancers, but it is not a simple explanation. It is an idea built upon more than a century of science. Scientists had to figure out, among many other things, how fructose is converted to fat; how fat in our muscles and liver and other organs interferes with insulin signaling; how the pancreas responds by pumping out more insulin; how elevated insulin activates the Warburg effect and other molecular pathways within cancer cells; and how those pathways keep fledgling cancers alive and well nourished.

Warburg’s devotion to “simple” explanations was unrivaled. He liked to cite the wisdom of William Bayliss, an English physiologist who wrote in 1915 that the “truth is more likely to come out of error, if this is clear and definite, than out of confusion” and also that “it is better to hold a well-understood and intelligible opinion, even if it should turn out to be wrong, than to be content with a muddle-headed mixture of conflicting views, sometimes miscalled impartiality, and often no better than no opinion at all.”10

Otto Warburg did not live long enough to see the most convincing evidence linking sugar to the strange metabolism of cancer cells that he discovered. Even if he had, it is unlikely he would have budged from his own oxygen-focused explanation. Warburg, alas, did not absorb another lesson from Bayliss that appears at the bottom of the very same page of his 1915 book: “It is not going too far to say that the greatness of a scientific investigator does not rest on the fact of his having never made a mistake, but rather on his readiness to admit that he has done so, whenever the contrary evidence is cogent enough.”

In the case of sugar and metabolic diseases, there is contrary evidence. Skeptics have argued that if sugar is truly the driving force behind America’s obesity and diabetes epidemics, the rates of these conditions should have gone down in recent years, given that sugar consumption has declined of late in response to warnings about the danger it poses. Whether this argument is “cogent enough” is worth exploring, but as Taubes has pointed out, Americans today are still eating sugar in quantities that would have been unthinkable a century ago—a time when Haven Emerson was already pointing out that Americans were eating vastly more sugar than their grandparents ever had. As Taubes sees it, to conclude from current trends that sugar is not responsible for metabolic diseases would be like cutting back from 20 cigarettes a day to 17 and then, when lung cancer rates failed to fall, concluding that smoking could not cause lung cancer.

The smoking parallel might also explain why the connection between sugar and cancer can be so hard to accept. Although the definitive studies linking smoking to lung cancer were only carried out in the 1950s, smoking should have been the most obvious suspect long before then. A British physician had connected inhaling snuff to cancer of the nose by 1761. By the end of the eighteenth century, it was known that chimney sweeps developed cancer of the scrotum and that cancer of the lip was more common among pipe smokers. Lung cancer, meanwhile, had become far more common after cigarette smoking had become far more common, and it was understood that cigarette smokers inhaled carcinogens more deeply into their lungs than pipe smokers did.

As a medical student, Richard Doll himself had become interested in the connection between pipe smoking and oral cancers. And yet Doll—like most British and American doctors of the mid-twentieth century—couldn’t fathom that cigarettes were behind the emerging lung cancer epidemic until his own studies finally provided overwhelming evidence. As Doll once explained, the problem was that “cigarette smoking was such a normal thing and had been for such a long time.” It “was difficult to think it could be associated with any disease.”

In 2016, Cantley said that we may someday “view this era of massive addiction to sugar in America in the same way that we now view the period of massive addiction to tobacco.” In the meantime, almost everyone agrees that we need far more research on the health effects of sugar. Nearly 100 years ago, Elliott Joslin pointed out that if Americans were dying of infectious diseases like typhoid or scarlet fever at the rates they were dying of diabetes, there would be a rapid government response “to discover the source of the outbreak.” The same point could have been made about cancer.11

Many different debilitating conditions appear together with insulin resistance. Our numbness to the suffering they cause might, in the end, be the most debilitating condition of all.

SUGAR HAS LONG OCCUPIED a troubling place in human history and society, even setting aside what happens when we ingest it. The sugar industry, through its reliance on the slave trade, once had a long-standing relationship with evil. In the 1930s, that relationship was renewed when the Nazis took up the cause of the German beet sugar industry, which had been decimated by the Great Depression. To the Nazis, support for beet sugar was politically expedient, a chance to win the votes of poor rural laborers. “No other German industry,” wrote the historian John Perkins, “was so closely integrated as beet-sugar production with the agricultural sector and the rural life that Nazi ideologues, with their ‘blood and soil’ (Blut und Boden) outlook, sought to protect, eulogize and enhance.”

In the early 1940s, the beet sugar industry was integrated into the Nazi agenda in an additional way. After sucrose was extracted from beets, it left behind a brown sludge known as schlempe. In 1898, the German chemist Julius Bueb had discovered that if he heated the schlempe in a closed chamber, he could form a cyanide gas. That gas would become known as Zyklon B. The supply that made its way into the Nazi killing chambers, via a third-party distributor, was created by a company known as Dessau Works for the Sugar and Chemical Industry.12

After the war started, sugar was rationed in Nazi Germany along with other foods, but some were too hooked to cut back. Hitler was a man of addictions, but his addiction to sugar may have been more intense than any other. When Baldur von Schirach, the head of the Nazi Students’ Association, sat down for a meeting with Hitler in 1928, he looked on in amazement. “At tea time, I couldn’t believe my eyes,” Schirach recalled. “He put so many lumps of sugar in his cup that there was hardly any room for the tea, which he then slurped down noisily. He also ate three or four pieces of cream pie.”13

Sugar, of course, cannot be blamed for Nazism or for turning Hitler into a madman. But as his madness grew, so, too, did his taste for sweets. It wasn’t only his cherished Viennese pastries that he longed for. On any given day, Hitler might consume two full pounds of chocolates or two pounds of pralines. He even added sugar to his wine. Hitler’s valet, Heinz Linge, recalled that while planning for the invasion of Norway, Hitler kept running out of the room for more sweets. Linge asked Hitler if he was hungry. “For me, sweets are the best food for the nerves,” Hitler replied.14

Hitler was aware that his binging was causing him to gain weight, and he tried to slow down. He asked his personal chef, Constanze Manziarly, to serve him only grated apples for dessert. But it was no use. Manziarly recalled that Hitler would “lose control” and attack her cakes. “I bake a lot every day, often for hours,” Manziarly wrote in a letter to her sister, “but in the evening everything is always gone.”

During his final days in his bunker in Berlin, Hitler was so weak he could barely move around. His cake had to be crumbled for him so he could slurp down the crumbs between his rotting teeth. As Norman Ohler asserts in Blitzed, an examination of substance abuse in Nazi Germany, with Hitler’s supply of medicines and stimulants now cut off, sugar “was the final drug.”15

Hitler was always extreme but never original. He stood out not because he was anti-Semitic but because he was fanatically anti-Semitic. The same was true of Hitler’s hunger for sugar and dread of cancer. Many, if not most, Germans were developing a taste for sugar and a fear of cancer during Hitler’s lifetime. Hitler was merely a grotesque caricature of the Germans of his day. That his sugar addiction may have left him more susceptible to the disease that terrified him most was only one more example of how Hitler got everything exactly wrong.

Hitler was born in 1889, the year cancer was first characterized as a problem of “seed and soil.” The metaphor might be the best way to understand Nazi Germany itself. Hitler was the seed, but seeds do not grow without the proper nourishment. Had the German soil not already been so well fertilized with hatred, Hitler could never have metastasized.

IN THE LATE 1960s, Warburg chose Birgit Vennesland, of the University of Chicago, to succeed him at his Institute for Cell Physiology. Vennesland was a natural for the position. Though she had not been auditioning for the job, she had demonstrated the key qualification earlier in the decade: she had become an outspoken champion of Warburg’s thinking on photosynthesis.

Vennesland saw a side of Warburg few others ever would. During a visit to his institute, she asked Warburg whether he believed he had ever made any mistakes. It was the right question for Warburg, and it’s a testament to Vennesland that she was one of the few people—if not the only person—to ask it.

Warburg let the question sit for a moment before answering. “Of course, I have made mistakes—many of them,” he said. “The only way to avoid making any mistakes is never to do anything at all. My biggest mistake . . .” Warburg stopped, an even longer pause this time. “My biggest mistake was to get much too much involved in controversy. Never get involved in controversy. It’s a waste of time. It isn’t that controversy in itself is wrong. No, it can even be stimulating. But controversy takes too much time and energy. That’s what’s wrong about it. I have wasted my time and my energy in controversy, when I should have been going on doing new experiments.”

It was an extraordinary statement. Warburg’s entire life had been spent in controversy. Warburg liked, and even framed, Max Planck’s famous line that a new scientific truth triumphs only when “its opponents eventually die.” But waiting for his opponents to die was not truly the Warburg way. “It is not true that time helps truth to victory,” he once wrote to Dean Burk. Victory required accumulating more and more facts until “nobody can stop the stream.”16

Vennesland wasn’t done with the challenging questions. In the same conversation, she asked why Warburg held so much animosity for the great physicist James Franck. It is likely that Warburg envied Franck, who had been a prized student of Warburg’s father. But Warburg offered another explanation: Franck had once said that Warburg couldn’t measure light. “He was a theoretician,” Warburg told Vennesland. “By himself he couldn’t measure anything, and he said I—I couldn’t—measure. . . .”

“Something curious was happening,” Vennesland later wrote. “Warburg was getting a little incoherent. In the course of telling me about why he got angry, Otto Warburg got angry all over again. He got as angry as he would have been if James Franck had been sitting there in the same room, now, telling him how sorry he (Franck) was, that his (Warburg’s) measurements couldn’t possibly be right.”

Vennesland had seen Warburg for who he was, but she had failed to appreciate that she, too, would eventually face Warburg’s wrath. Though the details are unclear, Vennesland, while working with Warburg in preparation to take over the institute, appears to have committed the unforgivable sin of questioning his photosynthesis findings. Warburg responded by locking her out of the institute.17

In a letter that he sent to multiple colleagues in the first months of 1970, Warburg claimed that Vennesland had a “severe mental disorder” and had experienced a breakdown. As Warburg’s story went, she had shouted “that she had to take the lead immediately because everything we were doing was wrong.” Vennesland, Warburg continued, then “attacked my co-workers violently” and “ran screaming into the street” before being arrested.

Though it would have been understandable if Vennesland, or anyone else, had responded to Warburg by running through the streets screaming, Warburg’s account was almost certainly a fantasy. In 1965, Warburg told Hans Krebs that after their recent meeting in England, he had been on his way to the airport when he spotted a black stallion that had escaped its stable. Warburg said that he had stopped his journey to rescue the horse before it was hit by a car. Krebs, however, was able to confirm that the stable in question had no black stallions. “Was the incident,” Krebs wondered, “merely a dream?”18

IN NOVEMBER 1968, Otto Warburg climbed up the ladder in his library to retrieve a book from a high shelf. He had recently celebrated his 85th birthday, and he was in good spirits. “In many respects,” Warburg had written to a friend earlier that year, it was better to be old than young. “The struggle for existence”—the same Darwinian phrase he used to explain how competing cells turn cancerous—“is over, and, if one possesses luck and reason, one can still live for many years.”

Whatever regrets he had expressed to Vennesland about spending his life picking fights were now long forgotten. In a letter to von Ardenne the previous year, Warburg encouraged his friend to continue his cancer studies in the face of criticism, just as he himself had: “The more resistance I found, the more I attacked and the better my weapons became.”

Warburg’s one known ailment, heart palpitations, had first been diagnosed while he was a medical student. (One of Warburg’s professors, the famed Ludolf Krehl, made the diagnosis himself and told Warburg to focus on lab work: “No one has yet died from biochemistry,” Krehl pointed out.) The palpitations returned in the last months of 1956, and Warburg feared the problem might somehow be related to a blood clot. A doctor suggested cutting back on coffee but assured Warburg that he had no reason to be concerned.19

At 85, Warburg might have lived a while longer. But on the third step of the ladder, his foot slipped. He fell backward, crashing to the floor of his library, where he lay with a fractured hip, helpless beneath Pasteur’s cold gaze. Because the door to the library was closed—a sign that Warburg did not want to be bothered—no one immediately came to his rescue.20

The fall appeared to be a cinematic ending for Warburg: the Faustian hunger for knowledge and the preordained punishment encapsulated in a single act. In the film, the camera, looking down on him from above, would pull away slowly. Warburg, motionless on the floor of his Dahlem palace, would become smaller and smaller: a breathing speck, a single isolated cell.

Warburg survived the fall, but was no longer the same. In the last week of July 1970, he felt a pain in his leg and was diagnosed with the blood clot he had long feared. Late in the evening of August 1, the clot broke free and traveled up his body until it came to rest in a narrow passage between his heart and lungs.

Although Warburg did not live to see the revival of his research, he never doubted that he would eventually be proved correct about cancer. And everything else.


Otto Warburg, date unknown.