The twentieth century was filled with ideas, discoveries, and inventions based on the benefits of freeing humans from bacterial, viral, and parasitic contamination. Wonderful scientific discoveries significantly reduced infant mortality, lengthened life spans, and drove medical technologies. However, the fundamental approach to human biology behind these advances unintentionally ushered in an epidemic of diseases afflicting humanity in the twenty-first century.
The two fatefully mistaken fundamental concepts were:
1. That humans can and should be purged of microbes, achieving a kind of biological purity.
2. That human health, and the future of medicine, rests solely on the human mammalian genome.
In the first section of this book I demonstrate how and why these two misguided principles came to underpin our understanding of medical science, producing a flawed paradigm that had untold adverse effects for the long-term health of our species. The desire for a biological purity that doesn’t exist and the dream of a future for medicine built solely on the human mammalian genome have led us astray. The perspective I present flies in the face of much medical history and contradicts the thoughts of many brilliant Nobel Prize–winning scientists and educators. Nevertheless, our understanding of who and what humans truly are is undergoing a profound shift. It is a shift more and more researchers in the scientific community are recognizing.
In 1890 Robert Koch, a German physician and microbiologist, presented what later became known as Koch’s postulates. These notions drove the infectious-disease paradigm of human medicine. They are simply four criteria used to establish a causal relationship between microbe and disease:
1. The microorganism must be found in abundance in all organisms suffering from the disease, but not in healthy organisms.
2. The microorganism must be isolated from a diseased organism and grown in pure culture.
3. The cultured microorganism should cause disease when introduced into a healthy organism.
4. The microorganism must be reisolated from the inoculated, diseased host and identified as identical to the original causative agent.
Using Koch’s criteria, specific microbes were rapidly found to cause many of the killer diseases of the early twentieth century, including typhoid fever, cholera, tuberculosis, and influenza. It soon became obvious that if you could kill the microbes causing disease and keep humans free of these pathogenic bacteria or, alternatively, produce protective immunity against some viruses using vaccines, you could reduce the burden of the killer infectious diseases.
And so we happily entered the era of antibiotics for bacterial diseases and vaccines for certain viral diseases. Penicillin was a game changer during World War II. Previously, wounded soldiers often died of subsequent bacterial infections. The only available drugs were quite toxic. But large-scale production of penicillin allowed field treatment of soldiers as well as treatment connected with surgeries. This helped to prevent death from gangrene as well as septicemia (blood poisoning). Some have called penicillin the greatest weapon developed during World War II. Ironically, it was a weapon against pathogens and thus a lifesaver.
Antibiotics helped to control cholera and typhoid fever. They largely supplanted what had been the only strategy for dealing with deadly bacterial diseases: separating and isolating patients until they died. Two diseases that killed countless victims and tore families apart were tuberculosis, also known as the White Death, and leprosy. Tuberculosis (TB) was known to the Egyptians and Greeks and killed an estimated one billion people during the nineteenth and twentieth centuries. For the first half of the twentieth century we sent TB patients to special hospitals called sanatoriums, which dotted the US landscape and were essentially places to keep patients comfortable as they waited to die. In 1919, an unincorporated town was even established in Texas bearing the postmark “Sanatorium, Texas.” Relative control of the disease was not possible until the development of a second generation of antibiotics (streptomycin) beyond penicillin. Patients with leprosy, an ancient and disfiguring bacterial disease, were sent to special isolated colonies. The leper colony on the Hawaiian island of Molokai is thought to have housed at least 8,000 individuals forcibly relocated there between roughly the 1860s and 1960s, where they lived together until death. Antibiotic treatments saved lives and allowed families to stay together.
Viruses were every bit as feared, and this spurred the push for vaccines. Polio is a virus-induced illness that attacks the nerves in the spine, creating a debilitating, neuromuscular condition that can be fatal. Because children were more likely than adults to get the disease, it struck terror into the hearts of parents for several decades of the twentieth century. Though crippling to many, polio was responsible for only 6 percent of deaths in children who were five to nine years old in the early 1950s.
Perhaps the most prominent polio victim was President Franklin D. Roosevelt (FDR). Due to his own experiences and struggles with this disease, FDR became a medical philanthropist. This began with his trip to the mineral baths in Warm Springs, Georgia, to experience their healing properties. He was so impressed that he bought the site, created a foundation in 1927, and persuaded his law partner Basil O’Connor to run it. In 1933 FDR and O’Connor engaged in some early crowdfunding when O’Connor began coordinating a Birthday Ball each January on FDR’s birthday to raise money for polio-patient care. The balls were such a success that in 1938 they were merged into the national organization that eventually became the March of Dimes.
Most important, FDR’s polio led him to initiate a major research effort to find a way to eradicate the disease. In 1954 the largest, most expensive medical experiment of that time was conducted. It involved using a killed-virus vaccine developed by the University of Pittsburgh's Dr. Jonas Salk. More than one million young children received either Salk’s killed-virus vaccine or saline in a randomized double-blind study (neither the children nor their doctors knew which they were getting) that cost more than $5 million. When the study was completed, the National Foundation for Infantile Paralysis (NFIP) approved Salk’s vaccine, and the specter of polio has rarely reared its head since.
It is human nature, at least in Western civilization, to identify a culprit when something goes wrong. We prefer that to considering how a complex biological system might be nudged toward better health. But perhaps, unlike with polio, no single factor can deliver better health. That kind of problem is harder, and when it comes to human health, it is no longer avoidable.
Unknown in the golden age of the infectious-disease medical paradigm, these new treatments had a deadly side effect. Penicillin didn’t just destroy the bacteria that sickened and killed so many; it was indiscriminate about which bacteria it killed, destroying friendly bacteria right alongside deadly ones. The us-versus-them mentality viewed purging the microbes and creating a biologically pure human as the ideal outcome. This was the guiding path throughout the twentieth century in response to tuberculosis, typhoid fever, influenza, leprosy, and polio, and it has been hard to shake that war against microbes in the face of today’s threats, from drug-resistant infections such as MRSA (methicillin-resistant Staphylococcus aureus) to HIV, mad cow disease, and Ebola.
The idea was logical in the face of mortal epidemics and twentieth-century biology. But things have changed. We have met the microbes, and they are us.
The big problem with striving for human mammalian purity and spending much of the twentieth century obsessed with killing microbes is that it goes against our very nature. We are, as whole healthy humans, composed of thousands of microbial species and about 100 trillion cells, and the majority of those cells are microbial: recent estimates of bacterial cells alone range from a low of 57 percent to a high of about 90 percent of our total cells. If we indiscriminately wage war on microbes, we wage war on ourselves.
In humans there are more than 10,000 different microbial species in residence, although no single person carries them all. In one person with a healthy microbiome you are likely to find approximately 1,000 different gut bacterial species, with another 300 species in the mouth, 850 on the skin, and tens to hundreds in the urogenital tract. That is not counting the viruses, fungi, and parasites that also make up our microbiome. One square inch of our skin can contain up to six billion microorganisms, and you have about 3,000 square inches. You wear billions of microbes every day of your life. Different body locations vary widely in the specific microbes taking up residence. For example, if your feet have fewer bacterial species than your forearm, they more than make up for it given the fungi that absolutely love to live on your sweaty toes.
We are not just mammals. Not by a long shot. We need our microbial co-partner species. They have been with us for millennia, helping to support our ancestors. It is only recently that we have unintentionally cut them out of our lives in our modernized world of antibiotic-treated, formula-fed, cesarean-delivered babies growing up in urban environments, surrounded by hand sanitizers and antibacterial soaps. In doing so, we have compromised our own health. A new biology is emerging, demanding a different way of thinking about what it means to be human, to be whole, and to pursue a healthy life on earth for ourselves and our children.
In integrative medicine, practitioners talk about caring for and treating the whole human. It is a useful approach. The challenge now is that treating “the whole human” has usually meant considering all the physiological, psychological, and spiritual systems together in approaching nutrition and medical treatment strategies. We must move far beyond those familiar notions to consider what is good for our microbes. The first revolutionary step in embracing the new biology is to start thinking of yourself as more than simply a mammal.
This flies in the face of some very basic biological principles you probably learned early on in grade school, principles first set out by the brilliant eighteenth-century Swedish biologist Carl Linnaeus. Linnaeus founded the field of taxonomy, in which the identity and relatedness of biological organisms could be shown. He brought order to what seemed like biological chaos, and his work guided generations of evolutionary biologists and especially inspired Harvard’s Stephen Jay Gould. While taxonomy remains vitally important even in analyzing the microbiome, the problem lies with the idea of species separation. The assumption has been that we are a single species. Using the old taxonomy of Linnaeus, we would be categorized as Homo sapiens, a type of mammal, but operationally that would be largely wrong. It would be wrong not just in how our bodies are composed but also in the genes we transmit to our next generation. The old taxonomy labels each of us as a single species, the human mammal. In reality, each of us is a superorganism made up of thousands of species, biologically diverse. Be proud.
Human diseases of the twenty-first century present new challenges. They are what we used to call chronic diseases and now refer to as noncommunicable diseases, abbreviated as NCDs. They were originally called chronic diseases because they persist in the individual. Unlike a cold produced by a virus, these diseases do not go away in a week. In fact, once you get them, you often have them for life. We don’t transfer them by sneezing or coughing, but they disable and kill just the same.
NCDs include allergies, cancer, heart disease, obesity, and even psychological disorders such as depression. They look nothing like what our ancestors encountered even a century ago. These twenty-first-century diseases seem to have emerged out of nowhere. They have changed not only when we die and what we die of but also how we live: the quality of life, limitations, and challenges we face while we are alive. These new diseases constitute an epidemic, one we are as yet mostly unprepared to deal with.
The present, growing epidemic is deadlier and costlier than influenza, measles, and Ebola combined. In fact, according to the World Health Organization, NCDs kill almost three times as many people (68 percent of all deaths) as infectious diseases (23 percent). Yet NCDs remain a hidden epidemic. We have government organizations and academic departments in place to fight infectious diseases, but not, as a whole, NCDs. The efforts that do exist are usually partitioned into piecemeal programs directed at a single type of NCD such as cancer, obesity, heart disease, autism, or Alzheimer’s disease. Comprehensive efforts to address NCDs have lagged far behind the epidemic.
The NCD epidemic is not restricted to any one culture, socioeconomic class, or geographic area. Almost three-quarters of deaths due to NCDs occur in low- and middle-income countries, although, proportionally, the rate is greater in more affluent countries, where NCDs cause up to 87 percent of all deaths. Distressingly, the epidemic only promises to get significantly worse in the years to come. But have you heard anything about it on the news—CNN, Fox News, or the Huffington Post? Is it part of your Facebook feed or Yahoo Alerts? Is it trending on Twitter? No? If the epidemic is worldwide, why not? Why the silence?
Unlike influenza, measles, and Ebola, the agent creating this epidemic is noncommunicable. You can’t pass it to your family, friends, and neighbors by coughing, sneezing, or shaking hands. You can’t see it spread. There is nothing to immunize against, and quarantines would be useless. Without the ability to prevent, vaccinate against, or cure these diseases, health practitioners are usually reduced to the helpless state of medically managing symptoms. In turn, this dramatically impacts personal productivity, quality of life, and socioeconomic viability. Individuals are reduced to a lifetime of drug management, which oftentimes creates a whole new layer of complications. Most drugs can have side effects, and as these side effects arise, they are often managed by prescriptions of yet more drugs. Our lives can become a series of alarms going off each day to remind us of the ever-increasing numbers of drugs we must take. Is this the life you planned for yourself? Is it what you want your children to experience?
The NCD epidemic is hard to pin down. We are used to chasing bacteria, viruses, or other pathogens as the cause of human disease, and until recently many people were still trying to do that for NCDs such as cancer. But this is different. This epidemic involves an ecological system out of balance. Instead of being homogeneous and tied to one disease or one specific pathogen, such as a virus producing the flu, it comprises a myriad of different illnesses, each targeting different organs in the body and involving different medical treatments. That diversity has made the epidemic more difficult to identify as a whole and easier for health professionals and politicians alike to dismiss. Seeing it requires a new perspective, a new paradigm of human biology.
The new face of disease is there to see in our lives every day, anywhere people and the environment interact. Affected individuals struggle to breathe the air, to eat the food of their parents, to move about, or in some cases to gather in crowds. They have to be increasingly cautious of their surroundings and how they interact with them. Those interactions are now dangerous for an ever-increasing percentage of people who are growing up sick, often isolated and seemingly ill matched for today’s world through no fault of their own.
Welcome to the twenty-first-century epidemic of NCDs. In addition to causing 68 percent of all deaths, NCDs are the number one cause of disabilities and a massive drain on our economies: they are estimated to cost the global economy a cumulative $47 trillion over the two decades ending in 2030. They are already a global crisis requiring the highest level of attention of the World Health Organization and the United Nations, because the severity of NCDs appears to be increasing.
The NCDs are all-too-familiar ailments: autism and the autism spectrum disorders, food allergies, Alzheimer’s disease, arthritis, asthma, cancer (all of them!), heart disease, celiac disease, type 1 and type 2 diabetes, inflammatory bowel disease, lupus, metabolic syndrome, osteoarthritis, sarcoidosis, thyroiditis (both Hashimoto’s and Graves’), hypo- and hyperthyroidism, and the list goes on. From mental health, to what we can eat, to our very bones, this amazing and frightening spread of diseases targets just about every place in the body.
Beyond premature death, the NCD epidemic takes a toll on our everyday lives. How hard is it now to cater to your six-year-old’s birthday party guests? Mystified parents who feel they are doing everything right for their families are witnessing a loss of health and function in their children, while the medical community seems incapable of a timely, effective response. We have been moving toward an increasingly disabled society, with more and more children unable to experience life as their parents knew it and, in many cases, facing uncertain futures as adults. Will we still be able to associate with one another in the ways most meaningful for human families, communities, and societies?
Before we attempt to understand the rise of noncommunicable diseases, let us survey the consequences of this raging epidemic. To keep things simple, consider just one aspect of modern life. Since John Denver wrote “Leaving on a Jet Plane” in 1966 and Peter, Paul and Mary made the song famous in 1969, air travel has burgeoned. It has become an essential part of our economy, of many work lives, and often of our leisure time. I remember my grandfather, a city councilman in Texas, taking the first commercial jet flight from San Antonio to Dallas’s Love Field. This 287-mile trip was one I recalled from childhood as a long, hot, grueling drive taking hours and hours. Yet my grandfather made the flight to Dallas in just forty-five minutes, had photos taken, then returned immediately; the entire trip took just ninety minutes of flying time. I was so incredulous that this was possible, I kept asking my granddad, “You were really there?” Jet travel changed everything for long-distance travel. But with the rise of NCDs, air travel is changing again, and not for the better.
In August 2014, blond-haired four-year-old Fae Platten from Essex, England, boarded a plane with her parents on their way home from Tenerife in the Canary Islands. She had a severe peanut allergy, which her mother had alerted the airline to, and flight attendants announced three separate times that no one should open peanuts during the flight. At 30,000 feet a man four rows away opened a packet of peanuts and disaster struck. Fae’s mouth immediately swelled; her lips blistered; she struggled to breathe and finally passed out. Only an injection of adrenaline from an EpiPen saved her life. All of this horror was due to small particles of peanut dust recycled through the plane’s air-conditioning system. News reports called the man who opened the peanuts “incredibly selfish.” But was he? Perhaps he was simply absentminded, as any of us might be from time to time. Or maybe it was something else. His misfortune was to be living during the time of the noncommunicable disease plague.
For someone with diabetes, a severe drop in blood sugar can be as life-threatening as a peanut allergy was to little Fae Platten. Let’s suppose the man had insulin-dependent type 2 diabetes. The Mayo Clinic has a long list of guidelines for diabetics when they travel, particularly outside their country of origin.
First, diabetics are encouraged to carry a supply of insulin for the entire time they expect to be out of their home country, along with a doctor’s letter. The insulin must be the exact same brand and type they have been using regularly, since any change may alter their blood sugar levels, and it must be kept in a cooled container. Diabetics must also take into account changing time zones, altitudes, and diets, testing their blood sugar levels more frequently for unusual changes and adjusting accordingly. A significant drop in blood sugar can cause a diabetic to lose consciousness and, without a quick dose of sugar, to pass into a coma and die.
On top of that, the Mayo Clinic recommends that diabetics keep food with them at all times. One of the leading foods listed is peanut butter, because it both raises and stabilizes blood sugar levels.
Now I have to wonder: Is it possible that the man four rows from Fae Platten was an insulin-dependent diabetic? Perhaps he followed medical guidelines and tested his blood sugar levels when the plane reached the cruising altitude of 30,000 feet. Was it possible that he noticed his blood sugar levels were plummeting and now had his own medical emergency on his hands? What a dilemma. To open the packet of peanuts could jeopardize a little girl’s life. To not open the packet of peanuts could jeopardize his own life. Was this the scenario that day on the plane? Probably not, but it is a very real possibility. In fact, the likelihood of this exact scenario is increasing every day.
But a diabetic on a plane needing a blood sugar boost is not the only problem. What if the person had celiac disease? Another dilemma. The free snacks provided on flights are typically peanuts, pretzels, and cookies. For the person with celiac disease, even leftover traces of wheat on a baking pan can cause a severe reaction. So for that person, pretzels and cookies are out, and the peanuts are the only safe snack.
While my wife and I were visiting Texas, we went with an acquaintance to a craft village. There we entered a small bakery where a woman was selling fudge. The treat looked wonderful to all of us; however, my acquaintance has celiac disease, so he used caution and quizzed the woman about the ingredients she had used in making it. She brought out the box the mix had come in, and we all scrutinized the ingredient list carefully; no wheat or gluten was in sight. Relieved and happy, we purchased several flavors of fudge and indulged in a few squares on the drive back to his house. By the time he got home, his gut was in agony, and he spent the rest of the night locked away in the bathroom. The next morning he returned to the craft village to question the fudge maker further. It turned out she had laid the fudge out on trays that had previously been used to bake cookies. Though the trays had been washed, enough of a trace of gluten remained to make him violently ill for hours.
These are three NCDs that could easily all be present in people on the same plane, the same cruise ship, or even in the same school cafeteria. The issue has become so prevalent and so challenging that a court settlement handed down in 2009 brought celiac disease and food allergies under the umbrella of the Americans with Disabilities Act, requiring that accommodations be made, particularly in schools and colleges. And those are just the challenges of food allergies, diabetes, and celiac disease. What about NCDs like autism or inhalant allergies? As recently as 2002, the prevalence of autism spectrum disorders (ASD) was 1 in 150 US children. By 2010, it was 1 in 68. In 2014, it was 1 in 45.
Recently, an Oregon family took their fifteen-year-old daughter with autism to Walt Disney World in Florida. Individuals with autism are frequently very sensitive to temperature and to the way things feel in and on their bodies. This girl absolutely needed to eat her food steaming hot, and she needed to eat very soon after getting hungry; otherwise she would have a meltdown, becoming agitated and scratching herself out of discomfort and frustration at her limited ability to communicate. The girl caused some disruption on the plane as a result of this condition. Her mother convinced a flight attendant to heat some food; the girl ate it, calmed down, and began quietly watching a movie. Still, the plane made an emergency landing in Salt Lake City, police boarded, and the entire family was escorted off because of the earlier disruption. While not life-threatening, this particular NCD created issues for the flight staff and tremendous embarrassment, humiliation, and outright fear for the family. If the mother holds to her word, it will also mean a hefty lawsuit for the airline involved.
Unbeknownst to most people, obesity is also an NCD, not the result of a lack of willpower. It has also become an epidemic in its own right, having more than doubled in prevalence since 1976. At present, more than one-third of the US population is considered obese, with that figure estimated to rise to 42 percent by 2030. The implications for air travel are not trivial.
Given that airlines have resorted to packing more coach seats into the same amount of space, obese passengers face increasing challenges when flying. In the case of a 518-pound man from Wales, an airline required him to book two seats for a round-trip flight to Ireland. However, the airline staff themselves were clueless about the point of the two-ticket policy: for the first flight they gave him two nonadjacent seats, one on the aisle and one at the window, with a seat in between. On the return flight, they booked him seats two rows apart, obviously useless to him. That may sound merely silly, but it is not an isolated case of obesity causing problems we prefer to ignore.
Kevin Chenais, a twenty-two-year-old, 500-pound Frenchman with a serious hormone disorder, flew from France to the United States for medical treatment. A year and a half later, in 2013, he attempted to return home on British Airways, which informed him that they “were unable to safely accommodate the customer on any of [their] aircraft.” Kevin and the family members who had accompanied him were forced to take a train to New York City and cross the Atlantic Ocean by boat to get home.
These kinds of responses are quickly becoming standard policies with air travel. Three US airlines, Southwest, American, and United, require passengers who are too large to fasten a seat belt to purchase two seats. And Samoa Air has implemented a policy of pricing air tickets according to the weight of the passenger.
While barely noticing, we are becoming a society so biologically dysfunctional that our movements near and far are becoming restricted. The 2008 Disney-Pixar animated film WALL-E featured a frightening futuristic vision of superobese people. The movie implied that the obesity was caused by poor eating choices and lack of exercise. But there is another explanation for this social phenomenon.
Beyond air travel, obesity is becoming a political and legal issue. In the spring of 2015, Puerto Rico introduced legislation that would fine parents of obese children and register them as child abusers. Granted, there are parents who don’t or can’t provide healthy food choices for their children. It isn’t hard to find a young mother feeding her baby pieces of French fries in a fast-food restaurant. With a busy, challenging work schedule, it’s far easier to let the television, DVD player, and Xbox become the babysitter—especially now that parents are being cited for “free-range parenting” in which children are allowed to play outdoors without adult supervision.
That Puerto Rican legislation ignores recent scientific evidence suggesting that childhood obesity can result from a dysfunctional microbiome. Parents who obediently follow the recommendations of physicians to have a cesarean section delivery (complete with prophylactic antibiotics), and often accept physician-prescribed rounds of antibiotics for their children’s upper-respiratory and ear infections, unknowingly contribute to problems with their child’s microbiome. These state-of-the-art medical protocols prevent the baby’s microbiome from seeding and maturing as needed. This, in turn, significantly increases the risk of childhood obesity. So what is, in reality, a lack of understanding of human biology by physicians, parents, and politicians could soon result in a charge of child abuse.
This is the tip of the iceberg. There are hundreds of different NCDs, and each one comes with its own particular set of limitations and daily risks.
What if the very environment in which you live and work could kill you? Such is the challenge facing Viscount Jan Simon, a deputy speaker in Britain’s House of Lords and a Labour Party member. He developed asthma triggered by severe allergies to perfumes, tobacco smoke, and chemical fumes. Just a whiff of perfume, aftershave, or cigarette smoke can leave him fighting for breath within twenty seconds. On one occasion, he collapsed and needed oxygen after a baroness who had washed her hair with scented shampoo sat next to him. On another day, while serving as deputy speaker, he was handed a message written on faintly scented paper. With one whiff, Lord Simon was gasping for breath and had to be helped out of the chamber.
By his own report, Lord Simon has not seen a movie in a theater since 1986, and he has been unable to travel by train, bus, or plane for just as long. A casual restaurant meal is a thing of the past. His wife has had to change every personal-care product she has ever used, and her nose is now so acute that she often goes ahead of him to detect scents that could trigger an attack. Even visitors to his home must follow a strict list of instructions so that he can safely entertain guests.
Already schools have peanut-free zones, parents must check menus before parties, and at-risk people on enclosed public transportation can do little to protect themselves from their fellow passengers. Is it possible that twenty years down the road we’ll see the beginning of a new era of segregation, one having nothing to do with race and everything to do with life-threatening NCDs? Imagine schools with entire zones and facilities for people with different food allergies. Many teachers are already specially trained to administer emergency adrenaline for this new kind of disease.
Other societal adaptations are under way in response to how we have changed as humans. The US Centers for Disease Control and Prevention (CDC) established its own scent-free workplace policy in 2009, encouraging others to follow. Additionally, the Canadian Centre for Occupational Health and Safety offers guidance for implementing a fragrance-free workplace policy. We are simply no longer able to tolerate things our ancestors enjoyed. The issue of fragrance allergies is so pervasive that the European Union recently banned three particularly allergenic ingredients used in the formulation of many popular perfumes. Two major perfumes affected are Chanel No. 5 and Dior’s Miss Dior. The former, credited as the world’s most popular perfume based on sales, dates to 1921 and was given a mid-twentieth-century popularity boost by Marilyn Monroe.
How could long-standing, favorite fragrances that were just fine a few decades ago suddenly become poisonous for more and more of the population? The perfume formulations didn’t change; we did. There are more and more people like Lord Simon among us.
For my wife and me, the social restrictions of NCDs aren’t just theoretical and based on lab research. We each have our own NCD challenges that very often include foods and odors. Recently, we both attended a New York conference where I was to give a keynote address and we were to give a daylong workshop together. The conference was intellectually stimulating, and the attendees were wonderful, gifted people. However, food was an immediate issue. What was available from the conference caterers managed to hit all of our food allergies and sensitivities. We were both ill the first night and next morning.
In stark contrast, I had just given an invited lecture at the International Scientific Association for Probiotics and Prebiotics (ISAPP) in Washington, DC. All of the attendees were polled for dietary restrictions ahead of time. And throughout the entire conference, a variety of food options were readily available. Of course this is the new normal—and a nightmare for any group organizing human gatherings. It affects everything from church functions to meetings of Masonic groups to school athletic awards ceremonies. The NCD epidemic seems to be leading us to a world of isolation from our peers, colleagues, and families. We are becoming a race of disabled people.
This book offers an alternative.