CHAPTER 2
White Mice and Windowless Rooms
It’s just a lot of little guys in tweed suits cutting up frogs on foundation grants.
—WOODY ALLEN AS MILES MONROE IN SLEEPER (1973)
In 2008, John Porter, a Washington, DC, lawyer and former Republican member of Congress, stood in front of a group of scientists at a meeting of the American Association for the Advancement of Science (AAAS). Channeling General George S. Patton, Porter issued a challenge. “You can sit on your fingers or you can go outside your comfort zone and get into the game and make a difference for science,” he said. “Neither we, nor the AAAS, nor any other group can do it for you. Science needs you. Your country needs you. America needs you…fighting for science!” According to Porter, the time had come for scientists to spread out across the country to explain what they were doing and why they were doing it—to make their case to the media and to the people.
Although scientists are probably in the best position to explain science to the public, several factors are working against them—and they’re not trivial: specifically, their training, their personalities, and how they and their work are perceived by the public.
••••
IN 2003, WHILE STUDYING FOR HIS ROLE AS A GEOPHYSICIST IN THE movie The Core, the actor Aaron Eckhart spent time with several real geologists. To his surprise, they didn’t seem much different from anyone else. Eckhart noted that scientists were “just as concerned as you or I about everyday things.” Nonetheless, the stereotypical images that many people have of scientists—as portrayed on television shows like The Big Bang Theory and movies like Back to the Future—aren’t that far from the truth. Scientists are often shy, quiet, introverted, and thoughtful—far more comfortable working in isolation than carousing in public. You don’t see scientists doing stand-up comedy, appearing on reality television shows, or screaming shirtless in subfreezing weather at football games.
The public’s perception of scientists is consistent and ingrained.
In 1957, the anthropologist Margaret Mead asked thirty-five thousand American high school students to complete the following sentence: “When I think of a scientist, I think of…” They wrote, “The scientist is a man who wears a white coat and works in a laboratory. He is elderly or middle-aged and wears glasses. He may wear a beard. He may be unshaven and unkempt. He may be stooped and tired. He is surrounded by equipment: test tubes, Bunsen burners, flasks and bottles, and weird machines with dials. He spends his days doing experiments. He pours chemicals from one test tube to another. He scans the heavens through a telescope. He peers raptly through a microscope. He experiments with plants and animals, cutting them apart. He injects serum into animals. He writes neatly in black notebooks.”
Twenty-five years later, in 1982, an Australian educator named David Chambers asked forty-eight hundred elementary school students to draw a scientist. In each case, the scientist wore a white lab coat, had unkempt, tousled hair, peered out from behind thick, dark-rimmed glasses, and was male.
Why were these children so uniform in their responses? Where do these images come from? In his book Mad, Bad, and Dangerous? The Scientist and the Cinema, Christopher Frayling explains their origins.
The wild hair, Frayling argues, comes from the world’s most famous scientist: Albert Einstein, the human symbol of genius. Although other iconic scientists like Archimedes, Marie Curie, Charles Darwin, Galileo, Isaac Newton, Louis Pasteur, Linus Pauling, James Watson, or Francis Crick could equally have been revered, Einstein’s universal popularity lies in his simple and easy-to-remember formula: E = mc². Albert Einstein is so famous that his face has become a cultural icon. His wise and sympathetic eyes appear on E.T. in E.T. the Extra-Terrestrial (1982), his forehead on Yoda in Star Wars (1977), and his wild hair on Dr. Emmett Brown (played by the actor Christopher Lloyd) in Back to the Future (1985).
The white lab coat, writes Frayling, is a “symbol of neutrality, cleanliness, separation from the rest of the world, and standards—usually male—of professionalism.” Frayling also proposes a more ominous meaning. In the mid-1960s, the social psychologist Stanley Milgram—attempting to understand the horrors of Nazi Germany—found that people were more likely to submit to authority when the person running the experiment was wearing a white lab coat. (When I am filmed in my laboratory, producers invariably ask me to put on a white lab coat, which I never wear. This, presumably, is to make me look like I actually know what I’m talking about.)
Thick, black glasses are also part of the uniform. Dr. Clayton Forrester (Gene Barry) in the original War of the Worlds (1953), the biologist Diane Farrow (Sandra Bullock) in Love Potion Number 9 (1992), and the MIT graduate David Levinson (Jeff Goldblum) in Independence Day (1996) all wear thick, black glasses. Coke-bottle glasses, according to Frayling, “are often an outward and visible sign of the scientist’s perceived incompleteness as a human being, a shortsightedness that cuts him or her off from the mainstream.”
In short, we perceive scientists as brilliant in the laboratory but unfit to navigate the real world. In Independence Day, the wild-eyed, strange-haired Dr. Brakish Okun (Brent Spiner), the director of research in Area 51, meets President Thomas J. Whitmore (Bill Pullman). “Mr. President!” he says. “Wow! This is…what a pleasure…. As you can imagine, they…they don’t let us out much.” In I.Q. (1994), Albert Einstein (Walter Matthau) and two other real-life scientists portrayed by actors pal around in a Three-Stooges–like buddy movie for geniuses. Collectively, they can’t drive a car or retrieve a badminton birdie from a tree. “Three of the greatest minds in the twentieth century,” notes a friend, “and amongst them they can’t change a light bulb.” Most pathetic is the pick-up line of the Princeton mathematician John Nash (Russell Crowe) in A Beautiful Mind (2001): “I don’t exactly know what I’m required to say in order for you to have intercourse with me.”
In the worst case, the scientist is seen as someone who creates monsters, either literally, like Frankenstein’s monster, or figuratively, like genetically modified organisms (GMOs). “In these images of our popular culture,” wrote the historian Theodore Roszak, “resides a legitimate public fear of the scientist’s stripped down, depersonalized conception of knowledge—a fear that our scientists, well-intentioned and decent men and women all, will go on being titans who create monsters…the child of power without spiritual intelligence.”
Because only 0.3 percent of Americans are professional scientists, most people have probably never met one. And so the stereotypes persist. We have no idea who scientists are, what they do, or why they do it. One story, and it is no doubt apocryphal, involves Albert Einstein traveling on a train from New York City to his home in Princeton, New Jersey. Einstein is explaining his theory of relativity to a group of journalists gathered around him. An elderly man sitting across the aisle listens carefully to Einstein’s descriptions. When the train reaches Princeton, the man sidles up to Einstein and says, “So tell me, Mr. Einstein. From this you make a living?”
I’m fairly typical of most scientists. I first started working in a scientific laboratory in 1981, studying a virus called rotavirus: a common cause of fever, vomiting, and diarrhea in young children. At the time, every year in the United States about four million children were infected with the virus, seventy thousand were hospitalized with severe dehydration, and sixty died from the disease. Because rotavirus had only recently been shown to be a cause of human disease, not much was known about how to prevent it. Our laboratory was the first to develop a small-animal model using mice to study this infection. For the next twenty years, every morning I walked into a small, concrete-block, windowless room in the animal facility at the Wistar Institute in Philadelphia to inoculate mice and collect their blood, breast milk, and feces. Listening to classical music, I would spend several hours a day, seven days a week, in the “mouse house.” As you can imagine, talking to mice every morning alone in a tiny room wasn’t the best way to prepare for appearances on The Colbert Report.
Indeed, nothing about my job requires me to be good with people. On the contrary, it selects for someone who is perfectly comfortable being apart from people—or at most, working next to them like a child engaged in parallel play. No schmoozing. No backslapping. No gathering around the coffee machine to tell interesting stories from the night before. The opposite of a “people person.”
In fact, scientists are so reluctant to appear in public that they are often appalled when other scientists do it. They feel that these “celebrity” scientists, by pandering to the media, are selling out; that, by simplifying their work for the public, they’re lessening its importance. Perhaps no two people have been punished more for their frequent media appearances than Carl Sagan, whose award-winning 1980 television series, Cosmos, sparked an interest in astronomy among thousands of young people, and Jonas Salk, the inventor of the first polio vaccine and one of the first scientists to appear on television. Members of the National Academy of Sciences—one of the most prestigious scientific organizations in the world—refused admission to both Sagan and Salk because of their celebrity. Surely, no real scientists would prostitute themselves by doing what they had done.
Another force working against scientists is their training. Early on, scientists learn that the scientific method doesn’t allow for absolute certainty. When scientists formulate a hypothesis, it’s always framed in the negative; this is known as the null hypothesis. When communicating science to the public, the null hypothesis can be a problem.
I’ll give you an example. Suppose you want to know whether the measles–mumps–rubella (MMR) vaccine causes autism. The null hypothesis would be “the MMR vaccine does not cause autism.” Studies designed to answer this question can result in two possible outcomes. Findings can reject the null hypothesis, meaning that autism following the MMR vaccine occurs at a level greater than would be expected by chance alone. Or findings can fail to reject the null hypothesis, meaning that autism following the MMR vaccine occurs at a level expected by chance alone. The temptation in the first case would be to say that the MMR vaccine causes autism and in the second that it doesn’t. But scientists can’t make either of those statements. They can only say that one thing is associated with another at a certain level of statistical probability.
Also, scientists can never accept the null hypothesis; said another way, they can never prove never. Brian Strom, formerly the head of the Center for Clinical Epidemiology and Biostatistics at the University of Pennsylvania, used to call it “the P word.” He wouldn’t let his trainees say prove because epidemiological studies don’t prove anything. When trying to reassure people that a particular health scare is ill-founded, the scientific method can handcuff scientists.
Here are some practical examples of Brian Strom’s “P-word” problem. When I was a little boy, I watched the television show Adventures of Superman, starring George Reeves. One thing that any child watching that show knew to be true was that Superman could fly. When you’re five years old, television does not lie. I believed that if I walked into my backyard, tied a towel around my neck (to simulate Superman’s cape), and jumped from a chair, I could fly. After several attempts (spoiler alert), I found that I couldn’t. But this didn’t prove that I couldn’t fly. I could have tried a million times, and that still wouldn’t have proved that I couldn’t fly. It would only have made it all the more statistically unlikely. You can’t prove that weapons of mass destruction weren’t hidden somewhere in Iraq; you can only say that they weren’t anywhere that you looked. You can’t prove that I’ve never been to Juneau, Alaska (even though I’ve never been to Juneau, Alaska); you can only show a series of pictures of buildings in Juneau with me not standing next to them. Scientists know that you can never prove never. The point is that, unlike mathematical theorems, studies designed to determine whether one thing causes another offer no formal proofs—only statistical associations of various strengths.
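The null-hypothesis logic described above can be sketched in a few lines of code. This is purely illustrative: the numbers are invented, not drawn from any actual vaccine study, and the test shown (a simple two-proportion z-test) is one of several a real epidemiologist might use.

```python
import math

def two_proportion_test(cases_a, n_a, cases_b, n_b):
    """Two-proportion z-test: returns the z statistic and a two-sided p-value.

    Under the null hypothesis ("the rate of the outcome is the same in both
    groups"), z should be small. A large |z| lets us *reject* the null;
    a small |z| never lets us *accept* it -- we can only fail to reject.
    """
    p_a = cases_a / n_a
    p_b = cases_b / n_b
    pooled = (cases_a + cases_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical, made-up counts: diagnoses in vaccinated vs. unvaccinated
# cohorts of 10,000 children each.
z, p = two_proportion_test(150, 10_000, 148, 10_000)
print(f"z = {z:.3f}, p = {p:.3f}")
```

With these made-up numbers the p-value comes out large, so the study fails to reject the null hypothesis. Notice what the code cannot do: no value of p, however large, lets it output “the vaccine does not cause the outcome.” That is the “P-word” problem in miniature.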
One example of how the scientific method can enslave scientists occurred in front of the House of Representatives Committee on Government Reform. On April 6, 2000, a Republican member of Congress from Indiana, Dan Burton, certain that the MMR vaccine had caused his grandson’s autism, held a hearing to air his ill-founded belief. At the time, one study had already shown that children who had received the MMR vaccine had the same risk of autism as those who hadn’t received it. (Since that hearing, sixteen additional studies have found the same thing.) The scientists who testified at the hearing, however, knew that no scientific study could ever prove that the MMR vaccine does not cause autism. They knew they could never say, “The MMR vaccine doesn’t cause autism.” So they didn’t. Rather, they said things like, “All the evidence to date doesn’t support the hypothesis that the MMR vaccine causes autism.” To Dan Burton, this sounded like a door was being left open—like the scientists were waffling or worse, covering something up. “You put out a report to the people of this country that [the MMR vaccine] doesn’t cause autism, and then you’ve got an out in the back of the thing,” he screamed. “You can’t tell me under oath that there is no causal link, because you just don’t know, do you?”
Another force working against scientists is the difficulty of reducing complex scientific issues into simple sound bites. A scientist’s instinct is to fully explain an issue—including the nuances and complicated parts—so that ambiguity can be reduced or eliminated. Trying to condense a difficult concept into a sentence or two not only feels intellectually dishonest, it is intellectually dishonest.
The “sound-bite” problem is impossible to avoid. For example, in the late 1990s parents became concerned that a mercury-containing preservative in vaccines called thimerosal might cause autism. At the time, thimerosal was present in several vaccines given to infants and young children. The preservative was used in multi-dose vials (which typically contain ten doses) to prevent contamination with bacteria or fungi that might have been inadvertently injected into the vial while removing the first few doses. (Before preservatives were added to vaccines, these bacteria and fungi occasionally caused severe or fatal infections.) Exercising caution, the Public Health Service urged vaccine makers to take thimerosal out of vaccines and switch to single-dose vials, which would eliminate the need for a preservative. This would make vaccines more expensive—about 60 percent of the cost of a vaccine is its packaging—but better safe than sorry.
This wasn’t a trivial issue. The firestorm created by the Public Health Service’s directive to remove thimerosal from vaccines drew international media attention and gave birth to at least three anti-vaccine groups: Generation Rescue, which believed that children with autism could be cured by removing mercury from their bodies; SafeMinds, which believed that the symptoms of autism were identical to those of mercury poisoning; and Moms Against Mercury, which advocated for mercury-free vaccines. Mercury is never going to sound good. There will never be an advocacy group called the National Association for the Appreciation of Heavy Metals standing up in defense of mercury. Frankly, any parent could reasonably conclude that if large quantities of mercury can damage the brain (which they can), then even small amounts like those once contained in several vaccines should be avoided.
The media had questions. Why was mercury being injected into babies? And why had the federal government allowed it to happen? Unfortunately, few scientists stepped up to provide answers. And it’s easy to understand why. Trying to explain briefly that thimerosal in vaccines had never caused a problem—and why it never would have caused a problem—was nearly impossible. To fully explain this issue, a scientist would have had to have made the following points: (1) Thimerosal, the preservative in vaccines, is ethylmercury; (2) environmental mercury, which is the kind of mercury that can be harmful, is methylmercury; (3) ethylmercury and methylmercury are different (at this point, you’ve already lost the audience; prefixes like ethyl and methyl are meaningless; it sounds like you’re talking about gasoline or alcohol, which also seem like things that shouldn’t be injected into babies); (4) ethylmercury (thimerosal) is eliminated from the body ten times faster than methylmercury, which is one of the reasons that thimerosal doesn’t cause harm; (5) methylmercury is present in everything made from water on this planet, including infant formula and breast milk; (6) the quantity of methylmercury in infant formula and breast milk is much greater than the trace quantities of ethylmercury that were contained in vaccines; (7) because methylmercury is ubiquitous in the environment, all children, including newborn babies, have methylmercury in their bloodstreams; (8) children injected with ethylmercury (thimerosal)-containing vaccines have levels of mercury in their bloodstreams well within those considered to be safe; and (9) seven studies have shown that children who had received thimerosal-containing vaccines were no more likely to develop mercury poisoning, autism, or other developmental problems than children who had received the same vaccines without thimerosal. Try to reduce that into a ten- or fifteen-second sound bite for television.
I struggled mightily with the thimerosal issue when speaking with the media. I would say things like, “Babies are receiving much greater quantities of mercury from breast milk or infant formula than from vaccines.” This wasn’t particularly reassuring. Now parents were scared of everything they were putting into their babies—trapped in an environmental hell. Or I would say, “Studies have shown that children who received thimerosal-containing vaccines are at no greater risk of autism than children who received the same vaccines without thimerosal.” This, also, wasn’t particularly reassuring. From the parents’ standpoint, mercury is bad, so it shouldn’t have been in vaccines in the first place. At one point, I had to testify in front of a hearing during which a member of Congress said, “I have zero tolerance for mercury.” What I really wanted to say was, “Mercury is part of the Earth’s crust. If you have zero tolerance for mercury, you should move to another planet.” (If you’ve ever testified in front of Congress, you would understand why having representatives move to another planet is not the worst idea.)
In the end, thimerosal was removed as a preservative from virtually all vaccines given to young children, and, because single-dose vials replaced multi-dose vials, the cost of vaccines increased, with no benefit. Advocates perceived vaccines as being safer even though they weren’t. They were just more expensive. Scientists who stood up for the science on this issue sounded like they didn’t care about children—like they were perfectly willing to stand back and watch babies get injected with a heavy metal. In truth, scientists who stood up for the science of thimerosal were standing up for children. Because vaccines were now more expensive, they became less available, putting children at needless risk, especially in the developing world.
Another factor working against scientists is their commitment to precise language; as a consequence, they’re intolerant of even the slightest inaccuracies in how science is portrayed to the public. This, in combination with the inevitably inverse relationship between the popularity of a scientific movie or television show and its accuracy, can make for problems. The most dramatic example of this issue is the astrophysicist Neil deGrasse Tyson’s reaction to the movie Titanic.
Released in 1997, Titanic, starring Kate Winslet and Leonardo DiCaprio, was one of the most popular movies ever made, seen by an estimated four hundred million people. And its animated depiction of how and why the RMS Titanic sank was perfect. Although people didn’t watch the movie because they wanted to learn about the physics of the RMS Titanic’s sinking, they learned anyway. (Come for the love story. Stay for the science.) Neil deGrasse Tyson, however, was appalled at the inaccuracy of one scene, so he sent a message to James Cameron, the director. “Neil deGrasse Tyson sent me quite a snarky email saying that, at that time of the year, in that position in the Atlantic in 1912, when Rose is lying on the piece of driftwood and staring up at the stars, that is not the star field she would have seen,” said Cameron. To his credit, Cameron revised the scene for the 3D version of the film. One can only imagine that of the hundreds of millions of people who saw Titanic, Neil deGrasse Tyson was the only one who watched that scene and thought, “That’s not what the sky in the North Atlantic would have looked like at that time of the year in 1912! What the hell was James Cameron thinking?”
My favorite movie mistake is in The Wizard of Oz (1939). After the Wizard gives him a diploma, the scarecrow recites the Pythagorean theorem: “The sum of the square roots of any two sides of an isosceles triangle is equal to the square of the remaining side. Oh joy, rapture, I’ve got a brain!” He gets it wrong twice over: the theorem concerns right triangles, not isosceles ones, and it involves squares, not square roots. He should have said, “The square of the longest side of a right triangle is equal to the sum of the squares of the other two sides.” (My children argue that it’s not fair to complain about the scientific accuracy of a movie that includes a scene in which an entire house taken up by a tornado in Kansas kills a wicked witch in Munchkinland.)
Given all of these issues, scientists might appear to be the last group able to effectively communicate science to the public. They do, however, have one thing going for them. And it’s probably going to surprise you. Scientists are good storytellers. They have to be. Federal funds to support science are limited, and it’s hard to get scientific papers published in well-respected journals. To survive, scientists must convince people listening to their talks or reading their articles or reviewing their grant proposals that what they are doing is important. Otherwise, they won’t be able to do it much longer.
I’ll give you a personal example. Many years ago, I published a paper in the Journal of Virology titled “Molecular Basis of Rotavirus Virulence: Role of Gene Segment 4.” Admittedly, this title won’t grab people in the same way that, say, “Disco Bloodbath” would; nonetheless, there was a story to tell. In the first few paragraphs of this and my other scientific papers, the reader would learn that rotaviruses killed about two thousand children a day in the world; that when children died, they died from shock caused by severe dehydration; that the World Health Organization, in an attempt to treat a disease that killed more infants and young children than any other single infection, started a program to provide developing-world countries with oral rehydration fluids; that the program was entirely unsuccessful because it’s hard to get children to hold down fluids when they’re vomiting; that, because of this, the best way to prevent these deaths was with a vaccine; and that our team at the Children’s Hospital of Philadelphia had taken a first step in understanding at least one part of this virus that was making children sick. This was a good story, a sort of quest. It wasn’t a Homeric quest—no Minotaurs, Trojan Horses, Sirens, or Golden Fleeces. But a quest, nonetheless.
Here’s another personal example of how scientists are compelled by story. In March 2016, I was asked to speak at the Graduate School of Public Health at the University of Pittsburgh. For me, this was a homecoming, as I had done my pediatric training at the Children’s Hospital of Pittsburgh in the late 1970s. When I was an intern, I saw a little girl die from a rotavirus infection. The mother, from rural Western Pennsylvania, was fiercely devoted to her daughter. When the little girl developed fever and began vomiting, she called her doctor, who told her to give the child small sips of water containing salt and sugar. Despite her attempts, the child couldn’t hold anything down; the vomiting was just too severe. Later that night, less than twelve hours after her illness had begun, the child was listless. So the mother brought her little girl, only nine months old, into the emergency department of our hospital. The minute the mother walked through the door, a nurse whisked the child to one of the treatment rooms so that we could insert a catheter into a vein in her arm to give her the fluids she so desperately needed. Unfortunately, the child was so dehydrated that we couldn’t find a vein. So we called a surgeon to come to the emergency room to thread a catheter into a vein in her neck. While waiting, we placed a large-bore hollow needle (typically used for removing bone marrow) into the middle of a bone just below her knee. We hoped to get enough fluid into her bone, which would then enter her bloodstream, to prevent the impending shock. But it was too late. Before the surgeon had arrived, the child’s heart had stopped beating. We tried to resuscitate her, but couldn’t.
The next moment was worse than anything you can imagine. Now, we had to walk out of the treatment room, into the waiting room, and tell a young mother that her nine-month-old daughter, who was perfectly healthy two days ago, was dead. That’s the story I told when I began my talk in Pittsburgh about our work on a rotavirus vaccine. And it’s the story that was always in the back of my mind during the twenty-six years that we developed that vaccine. We are, all of us, compelled by story.
Indeed, scientists have always been good storytellers.
For example, Galileo was the first person to observe the phases of Venus, the moons around Jupiter, and hundreds of previously unseen stars in the Milky Way. On March 13, 1610, he published his observations under the title The Starry Messenger. What a great title. He could have chosen something far more scientific sounding, like Planetary Observations, but he didn’t. The Starry Messenger suggested not only that he had been observing the heavenly bodies, but that they had been observing him. Later, Galileo angered the Church by claiming that the Earth revolved around the sun, not the other way around. His observation contradicted the biblical statement that the Earth was “the center of the firmament.” Galileo was tried by a Church inquisition, found guilty of heresy, and placed under house arrest. Fortunately, we don’t do that anymore. We don’t arrest scientists and put them in jail when they say things we don’t like. Now we just send them threatening emails.
Historically, scientists have also been good entertainers. Electricity, for example, was popularized at county fairs and expositions; people were amazed when they received small shocks, when their hair stood on end, or when they saw a dead frog’s leg twitch. Probably no one was more impressed by the frog-leg-twitching demonstration than Mary Shelley, who, at the age of eighteen, wrote a book about how electricity brought a monster to life: Frankenstein. (It’s alive!)
What Galileo did for astronomy and Mary Shelley did for electricity, Louis Pasteur did for vaccines. In May 1881, Pasteur separated sheep, goats, and cows into two groups. To one group, he gave two shots of what he hoped was a vaccine to prevent anthrax, a common cause of disease in animals and a bane to farmers. To the other group, he gave nothing. Thirty days after the injections, he infected both groups with live anthrax bacteria. All of the animals in the unvaccinated group died, whereas those in the vaccinated group survived. The public was amazed by the result, especially when two of the unvaccinated animals dropped dead in front of them.
Many scientists, past and present, have been wonderful storytellers. Biochemists like Isaac Asimov, oceanographers like Jacques Cousteau, physicians like Siddhartha Mukherjee and Atul Gawande, evolutionary biologists like Stephen Jay Gould and Richard Dawkins, mechanical engineers like Bill Nye, and theoretical physicists like Albert Einstein and Stephen Hawking have all found a way to make difficult concepts accessible and fun.
Scientists willing to enter the world of science communication should also be heartened by the fact that most people actually trust scientists and value what they do. A recent poll asked participants to rate various fields on how much they contribute to society. Topping the list was the military; 84 percent of those polled felt that members of the military made important contributions. Next were teachers at 77 percent, scientists at 70 percent, medical doctors at 69 percent, engineers at 64 percent, the clergy at 40 percent, journalists at 38 percent, artists at 31 percent, and lawyers at 23 percent. Trailing the field were business executives at 21 percent.
So, as John Porter has urged, scientists need to get in the game. If they read a story in a magazine or newspaper or watch a television program or hear a radio show in which the science is inaccurate, they should call the writer or producer and educate them about their field. It won’t change the story that has already been published or aired, but at least those journalists might think again before writing a similar story. And, best-case scenario, scientists might be called upon for their comments the next time the subject is raised. Also, no venue is too small. Scientists should speak at elementary, junior high, and high schools about their areas of expertise—they should become an army of science advocates out to educate the country. Because science is losing its rightful status as a source of truth, now is the time.
Regarding speaking at elementary schools, however, I offer only one piece of advice: Avoid the following scenario.
In 2007, when my daughter was in the eighth grade, her biology teacher asked me to speak to her class about vaccines. I was thrilled. My daughter was mortified. During the drive to school that morning, she spent the entire time trying to convince me not to tell jokes. “Don’t tell jokes, Dad. Kids my age won’t think you’re funny. They don’t get old people’s jokes. Trust me on this.” My daughter attended an all-girls school in suburban Philadelphia. During the talk, the twenty or so girls in her class seemed to be enjoying it. My daughter, on the other hand, stared grimly forward. Never moved. A look on her face that said, “Don’t embarrass me in front of my friends. I’m not kidding.” It was, without question, the most harrowing talk I’ve ever given. And I’m including talks in front of congressional subcommittees, talks that have been protested by people marching outside, and talks in front of live national television audiences. The worst-case scenario in those situations is that people will scream at me, send me hate mail, lobby my hospital’s CEO to fire me, or try to get my medical license revoked. But embarrass your daughter in front of her eighth-grade friends and you’re a dead man. A walking dead man. (Just for the record, I haven’t always been scared of my daughter. It’s only been since she learned to speak in complete sentences.)