Chapter 2


RISK FACTORS

In successive waves through the 1990s, hundreds of thousands of peanut-allergic children arrived for kindergarten at public schools across the United States, Canada, Australia, the United Kingdom, and other Western countries. Critical mass, it seemed, was achieved almost overnight, catching educators off guard and prompting sudden changes in social behavior, shopping, and eating habits. Doctors had watched for years as prevalence of the allergy climbed, and it had become a special concern for children. With this unanticipated acceleration in children, a sense of urgency and desperation marked the medical literature until the early 2000s, when research shifted to allergy treatments. Researchers analyzed any feature, no matter how unlikely, that seemed to distinguish peanut-allergic children from others.

Risk factors for the allergy, such as atopy, maternal age, cesarean birth, socioeconomic status, and heredity, were ultimately seen to contribute to allergic tendency, but none explained the specificity of the peanut or its sudden increase in prevalence. Even geography was a misleading factor, since it had been used as a convenient way to demarcate patterns of food consumption.1 For this allergy, studies of peanut consumption and methods of peanut preparation and cultivation yielded few clues. Geography did hold significance, but for a new pattern of toxicity seemingly specific to the Western lifestyle. A profile of the person most liable to develop a peanut allergy emerged from the known risk factors: a male child (male to female 2:1) born in a Westernized country after about 1990 whose ability to detoxify was challenged.

GEOGRAPHY

Geography was believed to be a primary risk factor for developing the allergy. An acknowledged but puzzling feature of the allergy was that it seemed to exist at first only in certain Western countries—the United Kingdom, parts of Europe, Canada, the United States, and Australia. Doctors were quite convinced that it simply did not exist in developing and Eastern countries such as China, India, or parts of Africa. In fact, explanations for the general rise in allergies, such as the hygiene hypothesis, rested on this East-versus-West observation. Starting in 2005, however, reports of unexpected prevalence of peanut allergy emerged from Hong Kong, Ghana, and Singapore. When word of these new outbreaks reached the medical community, the response was utter silence. No one could explain it.

The first reports regarding these new outbreaks indicated only serologic evidence with limited actual reactivity, but that quickly changed. By 2008, severe peanut reactions in children living in Hong Kong had increased: 1% of children aged two to seven were found to be reactive.

And in Cape Town, South Africa, serologic evidence of peanut sensitivity was again found in 5% of children studied in 2007, although reactivity was limited and nonanaphylactic. The reason for this hyporeactivity was already known: helminths, or intestinal worms, dampened every immune reaction, and these children were heavily infested. However, by 2011 a study of Cape Town children with eczema found that peanut allergy was on the rise. The study concluded that the South African population had joined the food allergy epidemic widespread in Westernized countries. Similarly, a 2013 study reported that 1.5% of children living in Ghana were reacting to peanut, up from 0.53% in 2009.

And in Singapore, where peanut allergy in children was a “worrying trend” in 2007, by 2013 it had become a “common trigger” (see Appendix).

In fact, doctors had failed to notice a trend in the way the allergy suddenly emerged. A similar phenomenon had occurred in the United States. In 1994, 6% of the US population was reported to be sensitized to peanut, although the vast majority was nonreactive. By 1997, 0.4% of American children under eighteen were reactive, but as in Hong Kong, this number quickly rose: to 0.8% by 2002, to 1.4% by 2008, and to 2%–2.8% of children under fourteen by 2011 (see Appendix).

Taking this observation to a logical extreme, if sensitization grew at the same rate as documented reactivity in children, by 2002, 12% of the US population—about 35 million people—would be sensitized to peanut. In 2002, an estimated 1.04% of the US population was reactive to peanuts. Were the numbers then about 1:12, meaning about three million actively peanut-allergic people for every 35 million sensitized? If that was true, whole populations were rapidly being sensitized somehow to their own food.

Using that modest percentage of 1.04% in 2002,2 the total peanut-allergic population for the top five countries in 2002 was more than 4.3 million people: 3,057,600 in the United States; 327,704 in Canada; 613,600 in the United Kingdom; 92,332 in Sweden; and 205,165 in Australia. Although difficult to confirm, some believe this number to have risen by 2009 to an estimated 2% of these populations—7.36 million people.
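The arithmetic behind these totals is straightforward prevalence-times-population multiplication. The sketch below (in Python) simply reproduces it; the national population figures are assumptions, chosen to be roughly consistent with the 2002 totals quoted above, and are not taken from the cited studies.

populations_2002 = {            # approximate national populations, circa 2002 (assumed)
    "United States": 294_000_000,
    "Canada": 31_510_000,
    "United Kingdom": 59_000_000,
    "Sweden": 8_878_000,
    "Australia": 19_727_000,
}

reactive_rate = 0.0104          # 1.04% estimated reactive to peanut in 2002
sensitized_rate = 0.12          # 12% extrapolated sensitization (double the 6% reported for 1994)

total_reactive = 0
for country, population in populations_2002.items():
    reactive = population * reactive_rate
    total_reactive += reactive
    print(f"{country}: {reactive:,.0f} reactive")

print(f"Top-five total (2002): {total_reactive:,.0f}")                   # roughly 4.3 million

us_population = populations_2002["United States"]
print(f"US sensitized at 12%: {us_population * sensitized_rate:,.0f}")   # about 35 million
print(f"Reactive to sensitized, about 1:{sensitized_rate / reactive_rate:.0f}")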

Tracking the growth of peanut allergy from the start was a little like train spotting. Data was generated in small, isolated studies that were then shared obsessively across the Internet and in medical journals. The upward trend in children continued to be monitored in the United Kingdom, the United States, and Australia in particular. And yet in other countries—Norway, Denmark, Germany, and Japan—prevalence of the allergy was low (see Appendix). In Estonia, Lithuania, and Russia, the allergy was of limited or no significance. And in sub-Saharan Africa and India, it seemed virtually nonexistent, although this must be qualified by the paucity of studies and data available. The numbers of peanut-allergic children included the following:3

•     2.3% (2012) up from 1.71% (2007) for Canadian children

•     2% (2011) up from 1.4% (2008) for US children

•     2% (2009) for UK children

•     0.45% or more (2002) for French children

•     1.2% (1999) to 2% or more (1998) for Swedish children

•     0.5% (2005) for Danish children

•     0.6% (2012), up from 0.17% (2002), for Jewish Israeli children aged 13-14; 2.5% (2012) for Arab Israeli children aged 13-14

•     2.5% (2011), with another study reporting 0.53% (2009), for Ghanaian children

•     1.11% (2009) for Australian children living in Tasmania

•     2% (2009) for Australian children living in the Australian Capital Territory

•     3% (2011) for Australian children living in Melbourne

•     1.08%–1.35% (2008) for Singaporean children

•     0.57%–1% (2009) for Chinese children living in Hong Kong

PEANUT CONSUMPTION

Ingestion of peanuts was originally believed to be a primary risk factor for developing the allergy. Researchers looked closely at methods by which peanuts were prepared, where they were grown, and levels and modes of exposure to them before and after birth. They were surprised to learn, ultimately, that the role of peanut consumption in the epidemic was anything but straightforward.

Boiling peanuts was thought to be the reason for the virtual nonexistence of the allergy in children living in China. This method of preparation reduced the peanut’s allergenic proteins. Roasting, on the other hand, a common method of preparation in the United States, tended to enhance allergenicity. However, using this observation to explain the lack of prevalence of the allergy in China proved difficult: intact peanut proteins were still being consumed. Moreover, the allergy was not known to exist in India at this time, where peanuts were prepared in a variety of ways, including roasted. The emergence of the allergy in Hong Kong and Singapore, where boiling peanuts was ostensibly the preferred method of preparation, completely upended the idea. And finally, a 2010 study of one- to two-year-olds from Chongqing, China, showed through oral challenge testing that food allergy had doubled between 1999 and 2009, from 3.5% to 7.7%.4

And farming methods and differences in soil were found to have no bearing on allergenicity of the peanut. Whether grown in Israel, India, or the United States, peanut proteins were the same.5 Note has been made of the fact that peanuts also contain histamine and other substances that may affect allergenicity. Little research has yet been conducted on these in relation to proteins.6

And the commonsense idea that eating a lot of peanuts contributed to the allergy was also problematic. In Sweden, where consumption of peanut was low, prevalence of the allergy in children was the same as it was in the United States.7 The inverse was true for Jewish children living in Israel, where peanut consumption was high and prevalence of the allergy was relatively low at 0.6%. And so, perplexed researchers turned next to analyze the allergic child and the tender age at which he or she first consumed peanut.

The most debated risk factor related to peanut consumption was whether consumption by pregnant and nursing mothers contributed to the prevalence of the allergy. Through the late 1990s, medical opinion on the issue swung from one extreme to the other—should mothers eat peanuts or stay away from them?—without consensus.

Some believed sensitization occurred in utero, before the child was born.8 Others suggested it occurred during breastfeeding or through ingestion of refined peanut oil in baby formula. Some considered that, since IgE antibodies do not cross the placenta, perhaps peanut proteins did, and that fetuses swallowed IgE from amniotic fluid, which then resulted in sensitization.9 Doctors believed they had evidence from aborted fetal tissues showing that from the second trimester onward fetuses were capable of producing an allergic reaction. A French study of fifty-four infants who were less than eleven days of age and seventy-one who were seventeen days to four months of age found that 8% of the babies had a positive skin prick test for peanut.10

Another doctor urged that there was a lack of convincing evidence that manipulation of maternal diet during pregnancy had a lasting effect on the development of a food allergy; lactation was therefore thought to be a more likely route of primary sensitization.11 Yet another study, in 2003, found no association at all between allergy and maternal peanut consumption while pregnant and breastfeeding.12 Not content to leave the issue, researchers in a further 2010 study insisted that “maternal consumption of peanut during pregnancy is associated with peanut sensitization in atopic infants.”13

A provocative university dissertation in 2007 determined that avoidance of peanut reduced the prevalence of the allergy but only in the child’s first year of life.14 Avoidance had no benefit after age one. The author, Ting Liang, admitted that this was difficult to explain and suggested that there could be a “subtle” and as yet undiscovered environmental exposure to sensitizing proteins.

Ultimately, a worried British Department of Health issued a warning to pregnant and nursing mothers who had a history of atopy (other allergies) to avoid peanuts and all nuts in order to prevent peanut allergy. In 1998, the UK Committee on Toxicity of Chemicals in Food, Consumer Products, and the Environment issued a statement of avoidance.15 The American Academy of Pediatrics likewise recommended delayed introduction of peanuts until three years of age for infants with a family history of allergies, and maternal avoidance of peanuts during pregnancy and breastfeeding for mothers of such infants.16 The observation that exposure to peanut oil hidden in vitamin supplements, nipple ointments, or soy formulae may contribute to sensitization was not rigorously examined. However, doctors had found some children were sensitized in this manner, and some even reacted.17

And yet, not only did this avoidance strategy have no effect on reducing prevalence of the allergy, but it also resulted in the highest number of peanut-allergic children yet seen in the United Kingdom.18 One study put the number at 2.8% of children. A report to the UK Parliament in 2007 concluded simply that this seemingly sensible government advice may have made things worse.19

And so, doctors made a complete about-face and asked whether it would be better for pregnant and nursing mothers to embrace peanuts and eat significant quantities of them. Exposure to peanuts during childhood was now thought to be crucial in developing immunological tolerance—it might even prevent the allergy. Conversely, a lack of peanuts could enhance sensitization.

A study in which English and Israeli children were compared concluded that the early introduction of peanuts in an Israeli food called Bamba may have promoted tolerance and prevented peanut allergy.20 In two populations of Jewish children ages four to eighteen in Tel Aviv and London, researchers looked at the roles of timing, frequency, and quantity of peanut consumption in the development of the allergy. The prevalence of the allergy in 2002 was 1.85% in London and 0.17% in Israel. Among children ages four through twelve, the prevalence was 2.05% in England and 0.12% in Israel. When analysis was restricted to those at high risk for peanut allergy, those with confirmed eczema, the prevalence was 6.46% in England and 0.79% in Israel. By nine months of age, 69% of Israeli infants and 10% of English infants were eating peanuts. The main source was peanut butter made from roasted peanuts.

But if reduced exposure to peanut proteins led to allergy, then the protein-reduced boiled peanuts consumed in China should actually have created scores of peanut-allergic kids. The early introduction prevention theory made little sense especially in consideration of the prevalence of sesame allergy in Israel. Sesame allergy in Israeli children was almost as high as peanut allergy was in UK children. Ironically, Israeli studies claimed that the prevalence of sesame allergy in children was the result of their consumption of the food too early in life.21

Ultimately, the LEAP study (the Learning Early about Peanut Allergy study) emerged as a way to settle the issue of when and how to consume peanuts and to determine what was the best dietary strategy for prevention of the allergy.22

The much-anticipated LEAP study, published in February 2015,23 reiterated the theory of oral immunotherapy: what has been known for decades and what had been observed in Israel in relation to peanut allergy. The study indicated that early introduction and regular consumption of peanut in high-risk children (those already with food allergy and/or eczema) resulted in a decreased risk of becoming reactive to peanut. But a discussion of what made the children high risk was not put forward. What caused the allergies and atopy that then made the children vulnerable to peanut and other serious food allergies?

To say that eating more peanut products regularly early in life solves the ‘mystery’ of the epidemic is to forget that peanut allergy became a societal concern only recently and suddenly in the early 1990s—prior to this time, ostensibly, we had been happily consuming peanuts in any amount without concern as to when or how.

One part of the LEAP study included 530 children aged 4 to 11 months with severe eczema, egg allergy, or both who had had a negative skin-prick test result (no wheal) to peanut. From here, the children were randomly assigned either to avoid peanut (limited exposure) or to consume a baseline amount of peanut regularly for 60 months. It was determined that at 60 months of age, 13.7% of the avoidance group and 1.9% of the consumption group were allergic to peanut—as indicated by blood tests.
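Taken at face value, those two outcome percentages imply a large relative difference between the groups. The following is a minimal sketch of that arithmetic in Python; only the 13.7% and 1.9% figures come from the study as quoted above, and the derived quantities are simple calculations rather than results reported here.

avoidance_rate = 0.137     # allergic to peanut at 60 months, avoidance group
consumption_rate = 0.019   # allergic to peanut at 60 months, consumption group

absolute_risk_reduction = avoidance_rate - consumption_rate
relative_risk = consumption_rate / avoidance_rate
relative_risk_reduction = 1 - relative_risk

print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")   # about 11.8 percentage points
print(f"Relative risk:           {relative_risk:.2f}")             # about 0.14
print(f"Relative risk reduction: {relative_risk_reduction:.0%}")   # about 86%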

In both groups, IgE levels increased, although they were higher in the peanut avoidance group. IgG was higher in the consumption group and wheals were smaller—all of this mirroring, again, what is known about oral immunotherapy: small doses of an allergen may be consumed by some people to reduce sensitivity. Whether the children retain this decreased sensitivity is not yet known. It was also suggested that exposure to peanut dust through the skin in children with eczema may contribute to sensitization. Peanut consumption was not possible at all for some children who were just too sensitive.

Again, the LEAP study was inspired by an observation that peanut consumption by children in Israel appeared to reduce prevalence of allergy. And yet, peanut allergy is on the rise in Israel.

In Israel, a study of adrenaline auto-injector dispensing rates from 1997 to 2004 showed a 78% increase.24 According to the Israel Food Allergy Support Network (YAHEL), 8% of Israeli children now have serious food allergy. And topping the list of reactions, according to ER hospitalizations in one Israeli hospital, were nuts and milk.25 By 2012, one study indicated that 0.6% of Jewish Israeli children aged 13-14 were allergic to peanut.26 In 2002, that number was, again, 0.17%.

Allergy in children has been climbing so fast everywhere that, despite the early consumption of any or all foods, the prevalence of anaphylaxis has soared and will continue to soar for the simple fact that no one is looking at causes. Treatments, yes, but causes, no.

The major clue to what is occurring in children, again, is found in the sudden development of anaphylaxis just in children, just in specific Western countries, and at the same time—no one knew that the children had become anaphylactic in such numbers until they showed up for kindergarten in the early 1990s.

There is only one functional mechanism of sensitization with the power to create this condition with such precision. The issue of when and how to consume peanuts was and is an expensive distraction.

In fact, a researcher on this team suggested that eating peanuts may not be the only method of sensitization: “The index allergic reaction usually occurs soon after the first known oral ingestion—which suggests that peanut sensitization does not always occur via the oral route.”27 Some doctors simply admitted that the exact route of primary sensitization was unknown although the gastrointestinal (GI) immune system was likely to play an important role.28 29

The GI system was long understood to play a significant role in allergies. For one, the successful catalytic effects of enzymes on proteins are crucial to limiting the movement of complete proteins into the bloodstream through the tissue of the bowel wall. A balance of specific microbes in the gut also helps maintain its integrity. With a compromised bowel, ill-digested proteins may be allowed to pass through and bind with blood serum, resulting in sensitization. As an explanation for epidemic anaphylaxis, however, scores of children would have had to experience bowel disease suddenly and at the same time in all the affected countries. In the extensive literature specifically on peanut allergy, there is no apparent investigation into a relationship with bowel disease. What would cause a sudden onset of bowel disease in children?

Allergist Kenneth Bock suggested in 2007 that low-grade infection in the GI tract possibly from vaccination would encourage allergic sensitization. Inflammation from such an infection sends out immune cell messengers that trigger even more inflammation in distant parts of the body. It can result in inflammation in joints, the lining of the GI tract, and even the brain. All inflammation, Bock suggested, contributes to allergies.30

In Food Allergy: When Mucosal Immunity Goes Wrong,31 allergist Hugh Sampson suggests that inflammatory bowel conditions such as celiac disease intersect with food allergy but may also reflect an IgE-mediated allergy to digestive flora. And inflammation of the gut may be accompanied by ongoing eosinophilic airways inflammation—this has been identified in peanut-allergic children.32 At this juncture, two important questions arise: did bowel inflammation and gut permeability lead to allergy? Or did the allergy develop first and lead to further inflammation? If it was the former, the search for a cause would have to start with a sudden onset of a gut condition in children starting around 1990, corresponding to the documented abrupt increase in peanut allergy in children at that time. If the latter, then, again, there must be a specific mechanism of sensitization with the power to create allergy and atopy immediately and simultaneously just in children in specific Westernized countries. This thought points directly at injection and vaccination, the designed purpose of which is to provoke the immune system. But again, this is an area no one is willing to investigate. In fact, anecdotally, it has been banned in universities as an unwritten rule.

But as for simple consumption: in a 2007 report, the UK House of Lords Science and Technology Committee confessed that levels and timing of consumption by mothers and children appeared to have an unclear relevance to the increasing prevalence of the allergy.33 The committee offered that consumption of peanuts was up to individual discretion in consultation with a doctor. By 2010, the UK government advice was revised again—early-life exposure to peanut was no longer considered a risk factor for peanut allergy.34 And finally, by 2015 the LEAP study raised questions about the “usefulness of deliberate avoidance of peanuts as a strategy to prevent allergy.” The study suggests that the timing and amount of peanut consumption contribute to a lower risk of developing the allergy, but that peanut should be consumed under the care of a doctor.

The act of eating peanuts as a mechanism for mass sensitization was at the least mired in conflicting information. At worst, it was like seeing a tree but missing the forest. Mere consumption would not cause an epidemic of anaphylaxis in hundreds of thousands of healthy children, just in certain countries and with such abrupt prevalence. If atopy, including eczema and other food allergies, makes one more vulnerable to developing peanut allergy, what was causing the atopy? The focus on a single mechanism of sensitization (eating, or even peanut ‘dust’ through oozing skin) missed the probability that there was an additional, more powerful, and more specific determinant of creating allergy. The specific and sudden rise of peanut allergy is itself a massive clue to understanding the big picture.

ATOPY: ECZEMA & ASTHMA

Atopy, from the Greek atopos, which means “out of place,” is a medical term used to describe a tendency of an individual toward allergic conditions like eczema and asthma. Atopy that indicates increased levels of IgE antibodies is a generally accepted risk factor in the development of additional allergies.35 Doctors were divided, however, on its relevance in the peanut allergy epidemic.

In a cohort study of American children referred for the evaluation of atopic dermatitis between 1990 and 1994, the prevalence of allergic reactivity to peanuts was nearly twice as high as that in a similar group evaluated between 1980 and 1984.36 A 1996 study concluded that peanut allergy was rarely an isolated manifestation of atopy.37 A Melbourne study of 620 atopic Australian infants in 1997 indicated that 1.9% were peanut allergic although egg (2%) and milk (3.2%) were more common at age two.38

In contrast, according to a 2006 study of Jewish children in Israel and the United Kingdom, the propensity to atopy did not explain the increasing prevalence of peanut allergy. Atopy was as prevalent in the UK children as in those in Israel, while peanut allergy was very low in Israel and high in the United Kingdom. Despite the study’s unsteady conclusion that prevalence was related to a lack of early exposure (see “Peanut Consumption” above), it suggested that peanut allergy was independent of atopy. The allergy was seen in both supposedly low-risk and high-risk children. The differences in peanut allergy could not, in this study, be explained by generalized differences in atopy.39 The LEAP website indicates that just 20% of children with atopy (especially eczema or egg allergy) also developed peanut allergy. A 2011 study offered that food allergy has emerged as an unanticipated ‘second wave’ of the allergy epidemic in children, with asthma, rhinitis, and eczema preceding and predicting food allergy. However, it is crucially important to point out that ISAAC (the International Study of Asthma and Allergies in Childhood), which compiled the data for the study, did not include food allergy at its inception. Food allergy was considered more difficult to determine accurately and was therefore ignored. This naturally seems to upend the idea that atopy preceded food allergy.40

But even if asthma and atopy were predictors for food allergy, what was causing the atopy? Was atopy genetic and independent of peanut allergy, or was there perhaps another underlying element that coincidentally linked both allergic conditions in children?

In 1997, one group of researchers thought they had found the connection. A Christchurch, New Zealand, longitudinal study followed 1,265 children born in 1977, twenty-three of whom did not receive DPT and polio vaccinations. The nonvaccinated children had no recorded asthma episodes or consultations for asthma or other allergic illness before ten years of age. Among the immunized children, 23.1% had asthma episodes, 22.5% had asthma consultations, and 30% had consultations for other allergic illnesses. Similar differences were observed at ages five and sixteen years.

This study pointed to the pertussis toxin as having a direct IgE-inducing effect. Two other factors that promoted atopy in children were the aluminum-based vaccine adjuvants and the reduction in clinical infections in infancy.41 In the process of building healthy immunity, the period from birth to six months of age was considered to be crucial. Some believed that certain vaccines had altered this process leading to atopy.42 In fact, a retrospective study published in 2008 of Canadian children born in 1995 found that a delay in the DPT (whole cell pertussis) vaccination by 2 months was associated with a 50% reduction in asthma.43

At the same time, however, Japanese researchers saw an inverse association between tuberculin responses and atopic disorder: exposure and response to Mycobacterium tuberculosis in a BCG vaccination appeared to inhibit atopic disorder through the lowering of serum IgE.44 Ten years later, researchers found the opposite, stating that there was an absence of relationship between tuberculin responses and adult atopy and that the data were inconclusive.45

Doctors danced uncomfortably around the relationship of allergy and vaccination through the 1990s until finally the discussion was sidelined by a new and all-encompassing concept of allergy and immunity—the Th1/Th2 paradigm. This model of adaptive immune function argued that a balanced immune system could be disrupted by any number of factors, not just vaccination. Poor nutrition, vitamin supplements, parasite infection, lack of parasite infection, naturally acquired disease, or lack of disease could lead to atopy. The role of vaccination in atopy and allergy was thus engulfed by a massive construct that reduced it to just one of many immune-altering exposures.

BIRCH POLLEN ALLERGY

In Sweden, peanut consumption is low but the allergy to peanut is high. This being difficult to explain, researchers suggested that cross-sensitization to inhaled birch pollen was causing peanut allergy—this is known as ‘oral allergy syndrome’ or pollen-food allergy. A 2008 study of children in Sweden explained that the increase in allergy to peanut and tree nuts “probably reflects an increasing prevalence of allergy to birch pollen and pollen-related reactions to foods.”46 Researchers just weren’t sure. And yet, even if it is assumed that the peanut allergy was caused by the birch pollen, how can the initial pollen allergy be explained in so many children and so suddenly, just in the last 20 years?

As early as 1959, researchers found that mice could be made severely allergic to grass pollen by using an adjuvant, an additive to provoke an immune response. In this instance, the mice were inoculated with a pertussis vaccine.47

A similar study took place in the late summer of 1973. Mice vaccinated again with a pertussis vaccine became sensitized to ragweed pollen that happened to be in the air at the time.48 Subsequent intravenous injection of the mice with pollen extract resulted in an anaphylactic reaction. Creative researchers also found that they could create peanut allergy in mice through inhaled peanut if it was mixed with a toxic bacterium.49 In these pollen-food or inhalation-ingestion experiments, again, a toxic adjuvant was needed to create allergy. This has yet to be investigated within the risk factor of pollen allergy.

TH1/TH2 PARADIGM DYSREGULATION

Doctors have religiously cited a general malfunction of the immune system as a risk factor for peanut allergy in children. The Th1/Th2 paradigm neatly organized the immune system by splitting it into two sides with two distinct thymus (T) white blood cell responses to pathogens and allergens. Doctors suggested that an upset in the balance between the two sides would almost certainly result in allergies. While it was a handy concept, the paradigm was later labeled as so much dogma unlikely to explain something as highly complex as the immune system. And as a risk factor in the epidemic, a dysregulation of the Th1/Th2 paradigm was not linked specifically to peanut allergy. It was just too broad.

White blood cells called T cells (matured in the thymus) and B cells (matured in bone marrow) circulate in the lymph, spleen, skin, and gastrointestinal tract, where they react with antigens (e.g., bacteria, viruses). B cells proliferate and produce quantities of different antibodies, including IgE (immunoglobulin E), the antibody most associated with atopy and allergy, to deal with invaders that are outside the cells of the body (e.g., allergens). The B cells rely, in part, on information from T cells.

T cells deal with invaders that cause damage inside cells (e.g., viruses). They secrete cytokines (cell movers). There are two categories of T cells: cytotoxic T cells (killer T cells) that kill infected cells, and helper T cells that enhance the responses of other cells such as macrophages and B cells. There are two types of helper T cells: Th1 and Th2. Th1 cells stimulate cell-mediated immunity. When Th1 cells recognize a viral antigen, for example, they secrete cytokines (interleukin-2 [IL-2] and interferon [IFN]) that signal killer T cells to lyse (penetrate and destroy) infected cells.

Th2 cells stimulate humoral immunity. When they recognize antigens, they produce cytokines (IL-4, IL-5, and IL-10) that stimulate B cells to produce antibodies, including IgE. The antibodies then bind to specialized IgE receptors on the surfaces of mast cells, basophils, and eosinophils in the bloodstream and connective tissue. These cells, which contain allergy-inducing chemicals such as histamine, are present in connective tissue and in the respiratory tract, gastrointestinal tract, urinary tract, nasal passages, and skin. IgE antibodies can circulate in the bloodstream and become distributed on mast cells throughout the body. The allergic response begins when an allergen binds to IgE antibodies that are, in turn, bound to mast cells, thereby activating the mast cells to degranulate and release their chemicals. Mast cell degranulation can result in vomiting, diarrhea, constriction of airways, coughing, sneezing, skin itch, a drop in blood pressure, and, in severe cases, shock and death.50

In the Th1/Th2 paradigm, a balance of the two T cell functions was seen as crucial. Nonatopic people show mainly Th1-immunity characteristics. They produce interferon that inhibits the growth of the Th2 cells. Again, some doctors suggested that vaccination was upsetting this balance by stimulating the Th2 side to produce excessive numbers of antibodies and limiting the Th1 (which would result in symptoms of disease).51

There was evidence that childhood infections (with fever and malaise) were important in the development of a balanced immune system by “teaching” the body how to handle other infections and that their vaccination-induced decline meant this developmental role was lacking.52 And yet too much vitamin D,53 some suggested, or genetic predisposition, could just as easily cause the imbalance.

By 2003, the paradigm that compressed the immune processes into a tidy concept itself came under fire. The “dogma that Th1 and Th2 cells are associated with cell mediated and humoral immunity, respectively, has recently been reevaluated … It appears that the mechanism of protection involves a complex combination of antibody and T-cell responses.”54 Such reevaluations suggested that antibody count—such as that of an IgE RAST test, as a measure of sensitivity to an allergen, or other antibodies, as a measure of vaccine efficacy—was just a small part of the total immune response.55 The rigid model could not accommodate new data.56

AGE OF ONSET

The age of onset is the age at which the allergy is first discovered and not the moment of sensitization. Sensitization occurs before onset. Children born after 1990 carried a surprisingly increased risk of being both sensitized and reactive to peanut according to the Isle of Wight cohort studies, ER records, and eyewitness accounts as outlined in chapter 1. US studies indicated that prevalence of reactivity began to increase in those born between 1991 and 1997, and that by the year 2000, the percentage of children allergic to peanuts had surpassed the adult prevalence. By 2008, 1.4% of US children were peanut allergic.57

While the prevalence of the allergy in children climbed, the adult statistic remained relatively constant at about 0.5% of those over eighteen.58 However, information pertaining to actual age of adult onset—age when those surveyed first discovered the peanut allergy—was not available as of 2010. An adult surveyed at age forty-nine (b. 1959) in the 2008 US phone survey, for example, may have developed peanut allergy after a heart attack and following the use of prescription drugs. This kind of information that would have helped pinpoint causes of peanut allergy was not included in the published US surveys.

As the numbers of allergic children climbed, the age of onset dropped. A review of pediatric peanut-allergic patients at Johns Hopkins University indicated that the median ages of first peanut exposure and first reaction were twenty-two and twenty-four months, respectively, for children born between 1995 and 1997; for those born before 2000, the ages were nineteen and twenty-one months; and for those born after 2000, twelve and fourteen months.59 The ability to identify first exposure was anecdotal, relying on history and phone survey.

BIRTH MONTH

A correlation was discovered between the risk of developing severe allergy and the month of a child’s birth. A study indicated that 55% of children born during January through March had their first reactions to peanuts during those same months.60 Similarly, 57% of children born in October through December experienced their first reactions during that three-month period. The same phenomenon was noted for those born between April and June. The correlation prompted speculation that dietary changes on or near a child’s first birthday could explain the trend.

A Netherlands study detected an increased risk of cow-milk and egg allergies in patients born in November through January, with a decrease in May. The same correlation was identified between period of birth and period of first peanut reaction.61 In a study from Duke University, 31% of the peanut-allergic patients were born in October through December, compared with 18% in April through June; the remaining 51% were born in the other six months of the year. This observation pointed to a possible relationship between environmental or seasonal factors, but lack of data prevented further speculation.

GENDER

The strong gender ratio difference in the peanut allergy went unnoticed by researchers until the later 2000s. Its significance was little understood but echoed the same striking trend in autism. Prevalence of peanut allergy was higher in boys than girls—in a ratio greater than 2:1.

Of 140 patients at a Duke University pediatrics clinic (70 born between 1988 and 1999, and 70 born between 2000 and 2005), 66% of those allergic to peanuts were male.62 In a FAAN online survey, 67% of peanut-allergic child respondents were male. Similarly, of a Johns Hopkins University group, 63% were male.63 Other studies supported this trend.64 In an Australian ten-year survey of clinical consultations for food allergy in children under five, 60% were male.65 A male predominance of peanut allergy was reported in children younger than eighteen years—1.7% in males versus 0.7% in females.66 Only one study, a US CDC report from 2008, indicated that girls and boys were about even in prevalence of overall food allergy. While there was no explanation for the disparity between peanut-allergic boys and girls, a parallel phenomenon appeared in children with autism and Asperger’s syndrome.

Hans Asperger, who identified the syndrome on the autism spectrum, originally believed that no girls were affected by the condition he described in 1944, although he later revised this conclusion. The gender gap was reported as high as 10:1 for Asperger’s and 4:1 for autism. In 1964, Bernard Rimland observed that boys tended to be more vulnerable to “organic damage” than girls, whether through hereditary disease, acquired infection, or other conditions.

The rate of autism and peanut allergy in children increased within the same window of time starting around 1990 with a concomitant gender ratio difference. The rate of autism in the United States was one in one thousand after 1970. Prevalence of autism spectrum disorders in the United States in 2006 was in the range of nine in one thousand children aged eight years67 with an increase in those diagnosed starting in the late 1980s.68 Peanut allergy had no significant profile prior to 1990. By 2009, it appeared in about one in seventy-five children in the United States and many other Western countries.

Increasingly, the health of boys and the birthrate of boys have been impacted by environmental pollutants at a higher rate than girls. The global decline in male births was nowhere more evident than in the Aamjiwnaang First Nations community in Ontario, Canada. In this small population downriver of polluting petrochemical plants, female births outnumbered male births 2:1 in 2003.69

RACE

Race was not seen to be a risk factor for developing peanut allergy.70 However, geography, access to medical care, cultural norms, and socioeconomic and political factors can be associated with race. These factors may be reflected in the 2008 CDC National Health Interview Survey. In this survey, food allergy was reported in 3.1% of Hispanic children under eighteen years of age. This is significantly different from non-Hispanic white and non-Hispanic black children, of whom 4.1% and 4%, respectively, had food allergies.

One study asked whether minority children were being underdiagnosed or undertreated for allergic conditions or whether they truly had a lower incidence of such allergies.71 This 2005 study found significant racial, ethnic, and socioeconomic differences in the prevalence of childhood allergic disorders, especially peanut or tree nut allergy, but only as it related to prescribed injectable epinephrine.

Food-allergy reactions appeared to occur at a higher rate in Asian children living in Westernized countries.72 One study found allergies in general to be higher in Asian than in European children in the United Kingdom.73 In contrast, food allergy in Asian children living in China is traditionally low. Around 2005, however, this freedom from allergy changed when 1% of children living in Hong Kong were found to be peanut allergic, and food allergy in one- to two-year-olds in Chongqing, China, doubled from 3.5% in 1999 to 7.7% in 2009.

MODE OF DELIVERY AND INTESTINAL FLORA

Researchers suggested that a child born by cesarean section had an increased risk of developing allergies. They postulated that this mode of delivery, used for one-third of US children, delayed the growth of important flora in the newborn intestine, thereby impacting the immune system. Approximately 60% to 70% of the body’s immune system is in the gut. While they conceded that cesarean delivery was not linked to any specific allergy, it appeared to intensify atopy in general.

A Norwegian study focused on birth by cesarean section, the use of antibiotics in creating dysbiosis (bacterial imbalance in the digestive system), and low levels of digestive flora as risk factors in reactions to egg, fish, and nuts. Among the 2,803 children whose mothers were atopic, birth by cesarean section was associated with a sevenfold increased risk of reactions to foods. The association between cesarean delivery and food allergy was not significant in children of nonatopic mothers, nor was maternal or infant use of antibiotics.74 These conclusions were echoed by a German study.75

Cesarean sections delayed the colonization of flora in newborn intestine. Balanced intestinal bacteria were seen as important for digestive health and the integrity of the colon. If ill-digested proteins managed to escape through the colon walls and enter the bloodstream, allergies could result. A subsequent article offered, however, that rather than increasing the overall risk of food allergy, cesarean simply made allergies worse.76

Yet another study from Finland found that allergic children had different fecal microflora with less lactobacilli and bifidobacteria. Probiotic and prebiotic supplements were given to 1,223 children over five years. Less IgE-associated atopy occurred in 24% of cesarean-delivered children who took the supplements.77

Cesarean births in the United States peaked at an average 24.7% of births in 1988 and then steadily declined between 1989 and 1996 before increasing yet again.78 While the WHO recommended a maximum of 15% of births by cesarean, the US rate in 2006 was 31.1%.79 In the United Kingdom, the cesarean birth rate was 10% in 1980, 11% in 1990, and 22% in 2002. In Israel, the cesarean birth rate was 10.7% in the 1980s,80 16% in 1999,81 and 17–18% in 2006.82

While UK and Israeli cesarean rates were roughly comparable, prevalence of peanut allergy was significantly higher in the United Kingdom than in Israel (2% versus 0.17%, since increased to 0.6%). These figures were the inverse for sesame allergy in children. In Israel, 1.2% of children were allergic to cow’s milk and sesame, followed by egg (2002, 2008).83 Sesame allergy in UK children in a 2005 study was low at 0.1%.84 These significant differences cannot be explained by mode of delivery.

Cesarean birth, like so many other factors, appeared in general to exacerbate a tendency to allergy. But there was no specific link to the peanut allergy.

MATERNAL AGE AT DELIVERY

Since the average age of first-time mothers had gradually increased, doctors wondered whether it was a risk factor for allergy in children. As with many of the proffered general risk factors, it failed to shed light on the peanut allergy.

An American study of fifty-five severely food-allergic children sought to evaluate whether maternal age at birth was higher for children with IgE-mediated food allergy than for those without.85 The mean maternal age at birth of children with food allergies was 31.2 years, compared to 29.2 years for children without food allergies. Mothers of children with a food allergy had 2.88 times greater odds of being older than 30 years at the time of delivery compared to control patients: 78% compared to 55%.
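As a rough check on how the 2.88 figure relates to the two proportions quoted, here is a minimal odds-ratio sketch in Python; the 78% and 55% come from the study as described above, and the rest is generic arithmetic rather than anything reported in it.

p_allergy = 0.78    # proportion of food-allergy mothers over 30 at delivery
p_control = 0.55    # proportion of control mothers over 30 at delivery

odds_allergy = p_allergy / (1 - p_allergy)   # about 3.55
odds_control = p_control / (1 - p_control)   # about 1.22

odds_ratio = odds_allergy / odds_control
print(f"Odds ratio: {odds_ratio:.2f}")       # about 2.90, close to the reported 2.88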

While an explanation for this disparity was not ventured in the study, environmental factors may have contributed. According to CDC statistics, there was a greater tendency for older mothers (30+) to have cesareans.86 This mode of delivery was shown to exacerbate atopy in children born to atopic mothers. Again, though, there was no connection between maternal age and the puzzling features of the peanut allergy—including its epidemic acceleration after 1990.

SOCIOECONOMIC STATUS

The destruction of the Berlin Wall in 1989 and the fall of communism offered an opportunity for allergy researchers to better understand the divergent allergy trends in the two halves of the city.

People in the less affluent East Berlin had a significantly lower prevalence of atopic conditions than people living in the wealthier West. Within ten years of reunification, however, a study of children indicated that the two halves had become the same in this regard. With reunification, researchers concluded, there came greater access to Westernized health care and lifestyle. Cleaning products, antibiotics, and other amenities had generally altered conditions for children born in the East and contributed to the increase in allergy, it was thought.

Between 2003 and 2006, the German Health Interview and Examination Survey for Children and Adolescents collected information on asthma, atopic dermatitis, hay fever, and eczema for 17,641 children aged one to seventeen. The survey revealed that there was increased sensitization to twenty common allergens.87

A loss of disease burden with improved socioeconomic conditions was presumed to have caused a dysregulation in the immune systems of children.88 However, it seemed unlikely that in just ten years, East Berlin would have become sufficiently germ reduced and affluent to have prompted such a significant increase in atopy. This abrupt development suggested a more immediate and invasive cause.

Nevertheless, the Westernization of East Berlin did not explain the specificity of the peanut. In fact, prevalence of peanut allergy appeared to be low for children in Berlin, East and West.89 In a randomly selected population-based survey of children in Berlin, the food allergens most commonly identified by oral challenge were apple, hazelnut, soy, kiwi, carrot, and wheat. Food allergy symptoms, although no anaphylaxis, were shown in 4.2% of the children. All reactions were mild and mainly due to pollen cross-associated food allergy (oral allergy syndrome). In this study, peanut was not an issue. However, in a 2005 analysis of physician-reported cases of 103 anaphylactic children in Germany, foods were the most frequent cause of the reactions (57%, of which 20% [eleven children] were to peanut), followed by insect stings (13%) and immunotherapy injections (12%). Peanuts and tree nuts were the foods most frequently causing the reactions.90

The description of this risk factor, however, stopped short of delineating the specific features of socioeconomic status that were altered in East Berlin starting in 1989. The specific conditions that gave rise to peanut allergy elsewhere were presumably largely absent from East Berlin at this time. Given that the prevalence of the peanut allergy was still low ten years after reunification, those conditions may have continued to exist to some degree.

Other research noted a connection between socioeconomic status and prevalence of anaphylaxis in the United Kingdom.91 Researchers used a map to highlight the more affluent areas of the country, in which there was a slightly higher prevalence of anaphylaxis as measured by ER admissions. Of these, reactions to drugs constituted over 50% of recorded triggers, and food made up almost 20%; children under five made up the majority of these patients. And yet another UK study, published in 2003 and based on a longitudinal study in Avon, saw no statistically significant associations between peanut allergy and any socioeconomic factor.92

LARGE HEAD CIRCUMFERENCE

Researchers in 1999 correlated the rise in allergy in children with the size of their heads. A study of newborn cord blood samples and measurements revealed an increase in IgE related to large head circumference.93 IgE is traditionally thought to be produced only by the mother, and it cannot pass to the child. While this idea is being reviewed in the literature, the level of IgE in the mother is thought to be a means of predicting a risk of future allergic tendency in the child.

The explanation for the relationship between a large head at birth and future allergy was curious. In affluent societies where nutrition is generally good, researchers explained, the fetus will grow rapidly during the early stages of pregnancy and will remain programmed to grow at this rate.94 As the pregnancy progresses, the child will have a high nutrient demand, which is difficult to meet. In a poor community with poor nutrition, the fetus is programmed to grow slowly and have a lower nutrient demand. The high-demand fetus is suddenly in a position where nutrient delivery is constant and therefore does not sustain growth. The brain and head continue to grow at the expense of the body, which results in a big head and normal birth weight but poor nutrient delivery to other parts of the body. This, in turn, modifies the immune system. Apparently, the Th1 side is more susceptible to being switched off in adverse circumstances than the Th2. Thus, it was suggested, the relationship of a large head and a suppressed Th1 side explained the increase in allergy in affluent societies.

While having a large head at birth may point to future allergies, according to limited studies, it did not explain the sudden accelerated prevalence of peanut allergy starting around 1990.

HEREDITY

A broadly accepted risk factor in the development of allergy in children is heredity. While statistics revealed common allergy threads between siblings and mothers, again, this risk did not fit with the simple facts of the peanut allergy—its abrupt emergence around 1990 and its rapid spread. Genes do not change that quickly.

In 1996, it was observed that peanut allergy was more common in siblings of people with peanut allergy than in their parents or the general population.95 The rate of peanut allergy in the siblings of people with peanut allergy compared with the general population was about 7% versus 1.3% in one study.96 Peanut allergy was reported by 0.1% (3 of 2,409) of grandparents, 0.6% (7 of 1,213) of aunts and uncles, 1.6% (19 of 1,218) of parents,97 and 6.9% (42 of 610) of siblings according to a 1996 study.98 In 2000, one researcher performed a study of monozygotic and dizygotic twins aged one to fifty-eight in which one of each pair had the peanut allergy. Skin tests suggested that, because more nonreactive monozygotic co-twins had a positive test than dizygotic co-twins, there was a genetic influence on peanut allergy.99

This study defaulted to the idea that there may be an “allergy gene” that condemned some families to allergy.100 It avoided discussion of differences between the twins as individuals with different medical experiences, gender, history, digestive health, kidney function, and abilities to detoxify waste.

The same differences may have been at play in a provocative study using different strains of genetically engineered mice. The study showed that injections of peanut induced anaphylaxis in some but not all strains.101 After vaccination with peanut, the mice were injected again three or five weeks later. Researchers were unable to induce anaphylaxis in two strains of albino mice, AKR/J and BALB/c—neither IgE nor IgG1 was found in these two strains, although IgG2a was increased. Anaphylaxis was easily induced in the gray-brown C3H/HeSn strain. A general genetic explanation was offered for these differences that, researchers proposed, might also exist in humans.

This vision of DNA was formulated by Francis Crick who, together with James Watson, deciphered the structure of the DNA molecule in 1953. Crick came to believe that DNA controls life, a ubiquitous concept that is usually accepted as incontrovertible fact.

While moderate doctors such as Kenneth Bock suggested “genetics may load the gun but environment pulls the trigger,”102 a new generation of scientists has rejected the Crick DNA dogma outright. Under the new concept of epigenetics, genetics did not load the gun; the environment did.

Cell biologist Dr. Bruce Lipton, in his research at Stanford University School of Medicine in the early 1990s, suggested that the environment, operating energetically through the membrane of the cell, actually controls the behavior and physiology of cells. In other words, the human body is not a biochemical machine at the mercy of self-actualizing genes (genes that turn themselves on and off); rather, the environment and the individual’s perception of it control gene activity. Cells possess the ability to reprogram their own DNA as they are affected by diet, chronic thoughts, and even vaccination. Such rewriting accounts for up to 98% of evolutionary transformation. In short, epigenetics suggested that people are masters of their own biology. If this were true, there would be implications for a broader understanding of peanut allergy and perhaps also of how to recover from it.

The traditional understanding of genetics, however, cannot explain the peanut allergy epidemic. Hundreds of thousands of children would have had to experience a simultaneous change in their genetic profile starting around 1990 and occurring regularly and increasingly since then to account for the current peanut-allergy phenomenon. This would have been highly unlikely.

IMMUNE SYSTEM OVERLOAD

The four As—allergy, autism, ADHD, and asthma—have emerged from fundamental dysfunctions in nutritional, immune, and inflammatory factors, suggested Ken Bock in Healing the New Childhood Epidemics (2007).103 Contributing to these unhealthy conditions in many children were the following: fungal overgrowth, especially candida that has spread from the gut; poor diet and eating habits; and deficiencies in probiotics (beneficial digestive flora), essential fatty acids, stomach acid, and digestive enzymes. Further crippling a child’s immune system were antibiotic overuse and childhood vaccinations.

If a robust Th1-side immune response is not established, Bock offered, a child can develop a chronic low-grade infection from an injected vaccine antigen, such as measles. Such an infection can linger in the gut resulting in inflammation, which in turn sends out immune cell messengers, cytokines, that trigger even more inflammation in distant parts of the body. This can lead to inflammation in joints, the lining of the gastrointestinal tract, and even the brain. All inflammation contributes to allergies, stated Bock. And allergies cause even more inflammation.

While these overlapping factors contributed to the prevalence of the four As, the smoking gun that explained the sudden prevalence of peanut allergy had yet to be unearthed.

VACCINATION

Bock (2007) pointed to the mercury preservative thimerosal in vaccines as a significant factor in the epidemics of autism, ADHD, asthma, and allergies in children. He also expressed concern over the increase in the number of vaccines, the multidose single shots, unidentified health conditions of the child at the time of vaccination, and more.104

For scientists, it was difficult to confirm the role of vaccination because there were no studies of nonvaccinated populations in the United States—there were so few children who had not been vaccinated. According to the CDC, US vaccination rates have been at record highs. While 77% of US kids met all vaccination goals in 2007, at least 90% met the goal for each vaccine except for DTaP, but even those kids received three out of the four recommended doses.105

Recognizing that there was a significant gap in medicine’s understanding of vaccination outcomes, two members of the US Congress introduced the Comprehensive Comparative Study of Vaccinated and Unvaccinated Populations Act in 2007. This bill was slow to achieve support and was reintroduced to the 111th Congress in 2009.106

In 2007, a grade 9 Connecticut student, Devi Lockwood,107 conducted an ad hoc study of the few nonvaccinated populations in the United States. Devi looked at the Old Order Amish, who discouraged vaccination. In these communities, peanut allergy was virtually nonexistent. But because the Amish communities were genetically connected, their example in understanding the role of vaccination in allergy was rejected by the CDC.108

And so, Devi turned to Vashon Island, Washington, a haven for alternative medicine where 1,600 school-aged children were unvaccinated. Devi looked at two schools, elementary and middle, with high exemption rates on Vashon Island and two similar schools with low exemption rates in Devi’s hometown of Ridgefield. Where the vaccination rate was high, the prevalence of peanut allergy increased significantly. Where the vaccination rate was low, so, too, was the prevalence of the allergy. At the two Vashon schools, there were three peanut-allergic children. In Ridgefield, there were twenty-two at the two schools. All peanut-allergic children had been vaccinated. Significantly, there were no unvaccinated children with peanut allergy in this small study.

Risks associated with vaccination were infrequently ventured in the medical literature through the 1990s, although discussion increased in the later 2000s. Vaccination, an event shared by the vast majority of Western children, carried clear political, social, and economic implications. Doctors would not or could not dissuade the public from getting their shots. And yet, the connection between allergy and vaccination was not new: the literature was clear about the allergenicity of vaccines and even provided an example of a causal role in the outbreak of gelatin allergy in children starting in 1994. The relationship between injection and allergy had long been established (see chapter 4), and as the potency of adjuvanted vaccines increased through the 1970s, researchers warned “that the regular application of aluminium compound-containing vaccines on the entire population could be one of the factors leading to the observed increase of allergic diseases.”109 Even so, vaccination is largely absent from the plethora of research on peanut allergy. As already noted, as a potential cause of atopy, vaccination was lumped together with every other risk factor within the broad-shouldered hygiene hypothesis (see chapter 3).

EAR INFECTIONS, ANTIBIOTICS & GUT FLORA

Children receive an average of 2.2 antibiotic prescriptions in the first year of life, according to a 2013 study.110 The study looked at two groups of children born between 2007 and 2009: those diagnosed with food allergies and those without. Children with food allergies, as measured by IgE antibodies, had averaged 2.65 antibiotic exposures, compared with 1.84 among children without. The study speculated on the destructive impact antibiotics have on digestive flora and how this might contribute to allergies.

In 2014, a mouse study from the University of Chicago Medical Center indicated that Clostridia, a common class of gut bacteria, reduced allergic sensitization. In the study, mice were treated with antibiotics and then force-fed peanut, inducing a rise in IgE antibodies against the peanut. When Clostridia were reintroduced to their digestive tracts, the mice experienced a reduction in peanut-specific antibodies. According to the study authors, “Clostridia caused innate immune cells to produce high levels of interleukin-22 (IL-22), a signaling molecule known to decrease the permeability of the intestinal lining.”111

It is important to note that none of the mice actually reacted to peanut. Rather, antibodies found in blood work indicated sensitization.

These findings support the ‘Expanded Hygiene Hypothesis’ (see chapter 3), which implicates pharmaceuticals in the general rise in allergies.

But why are children being given such quantities of antibiotics in their first year of life? Ear and throat bacterial infections, often strep, are common reasons for antibiotic use in children.

Between 1988 and 1994, there was a significant increase in ear infections among US children.112 In that time frame, the number of children under one year of age with ear infections went up by 3 percent, and recurrent ear infections (three or more) increased by 6 percent; this latter figure represented 720,000 more children over the six-year period.

But what was causing this soaring number of infections for which antibiotics were prescribed? Many doctors and naturopaths point back to undiagnosed food allergies.

In a 1994 Annals of Allergy study, 86% of children had a significant reduction in ear infections simply by eliminating allergenic foods from their diet (wheat, eggs, corn, etc.).113 Another study showed that children allergic to cow’s milk were twice as likely to have recurrent ear infections. This allergy often goes undiagnosed, although the relationship between dairy allergy and ear infections has been known for decades; Heiner syndrome, a respiratory disorder caused by dairy allergy, produces recurring ear infections. Dr. David Hurst, an ENT specialist, has dedicated a website to the role of allergy in ear infections, stating that “allergy is the cause of most chronic middle ear fluid, and that aggressive use of standardized allergy management can solve the problem”:

It is my contention that the middle ear behaves like the rest of the respiratory tract and that what has been learned about the allergic response in the sinuses and lungs may be applied to the study of the ear to help in understanding the pathophysiology of chronic otitis media with effusion (OME).114

Again, if the root cause of ear infection is a food or environmental allergy, which in turn causes the inflammation and fluid that invite infection, for which antibiotics are prescribed (causing more allergies), then what caused those initial allergies?

This appears to be a circular discussion. However, one might suspect that while antibiotics make one more vulnerable to allergies, there is yet another sensitizing mechanism at play, such as vaccination, a well-known cause of allergy and anaphylaxis. With vaccination, immunity (the goal) and allergy are inseparable: both defenses are provoked simultaneously.

GENETICALLY MODIFIED FOODS & HERBICIDES

The safety of genetically modified foods has been a significant concern since the 1980s.

Concern regarding the transfer of allergens from one plant or seed to another without informing the consumer emerged in the 1990s. For example, when genes from Brazil nuts were used to enhance soybeans, allergy experts objected. Guidelines rather than regulations seemed to apply, but common sense ruled the day after testing showed that patients allergic to Brazil nuts also reacted to the modified soybean. The risk that GMOs could provoke reactions in allergic consumers was no longer theoretical.115

There was also a risk of creating allergies to previously unknown genetically engineered proteins (whether consumed directly or through gene transfer via animals fed the GM plants). These proteins, new to humans, are cause for concern, but there has been no specific evidence linking them to a rise in allergy, much less to the sudden, documented rise around 1990.

Monsanto’s GM ‘Roundup Ready’ soybean was approved for cultivation in the US in 1995. The ‘Roundup Ready’ gene made the soybeans resistant to Monsanto’s Roundup herbicide, whose active ingredient is glyphosate; this allowed farmers to spray the herbicide on the engineered soybean crops to kill surrounding weeds without harming the soybean plants.

Roundup/glyphosate was first introduced to farmers in 1974. The herbicide inhibits an enzyme in the ‘shikimate’ pathway of normal plants. This pathway synthesizes the amino acids tyrosine, tryptophan, and phenylalanine, which are called ‘essential’ because humans need them to live yet are unable to produce them without plants and bacteria. Again, glyphosate/Roundup disrupts the shikimate pathway and kills plants that are not ‘Roundup Ready,’ i.e., the ‘weeds.’

In this process of ‘improving’ crops, it was assumed that humans, who do not have a shikimate pathway, would not be affected by the herbicide. It turns out, however, that common bacteria in the human digestive tract do have this pathway and that glyphosate interferes with their synthesis of these amino acids. And so, consuming the herbicide via soybeans in processed foods, for example, could disrupt the gut microbiome and reduce the availability of these amino acids. Additional studies conducted by Dr. Stephanie Seneff of MIT indicate that the herbicide also impairs sulfate transport and other enzymes crucial to our ability to detoxify.

While the disruptive role of herbicides is clear, that of GM foods is not. Certainly, as discussed in the section on antibiotics, the loss of certain bacteria has compromised the integrity of the small intestine. Glyphosate likewise contributes to this loss and to a ‘leaky’ gut, which can lead to food sensitization and a rise in IgE.

However, neither the consumption of herbicides, which had been constant, nor that of GM foods correlates with the epidemiological facts of the peanut allergy epidemic. The sudden emergence of the allergy just in children, only in specific countries, and within the same window of 1988 through 1994 is too abrupt and too precise a development to be explained by herbicide exposure that had been constant for decades or by the gradually increasing consumption of GM foods.

THE NOCEBO EFFECT & OWNING YOUR BODY

The word placebo was first used in a medical context by the British physician William Cullen (1710–1790). In his 1772 lectures, Cullen described giving a remedy in an incurable case in order to bring comfort and ‘please’ the patient. Cullen was a follower of ‘sympathy’ or ‘vitalism’ and coined the word ‘neurosis,’ suggesting in First Lines in the Practice of Physick (1790) that “almost all diseases considered on a certain point of view could be considered nervous.” The current meaning of placebo, in drug experiments for example, involves comparing subjects given an active drug with others given a sham or placebo substance (a sugar pill). When those receiving the sugar pills experience positive effects, or are ‘pleased’ by the placebo, it is suggested that the subjects’ minds or positive beliefs caused the improvement.

In contrast, the nocebo reaction, first mentioned in 1961 by physician Walter Kennedy, is one in which a negative belief causes negative physical symptoms. The nocebo effect has been documented many times: in one study of drug-allergic patients, 27% reacted to an inert substance they believed was the drug to which they were allergic.116 While it seems clear that expectations and beliefs are linked to reactivity, it is tantalizing to speculate on how belief might contribute to initial sensitization.

A 2011 study showed that what one believes about one’s own body can contribute to an ‘up-regulated’ immune response.117 In an experiment at the University of South Australia, a team of neuroscientists discovered that if a person loses the sense of ownership over a part of the body, or even the entire body, the body will reject that part to some degree with a correspondingly intense, up-regulated immune response. In the experiment, the team injected histamine into the arms of volunteers while they were under the false impression that one of their arms had been replaced with a rubber arm. The immune response in the ‘replaced’ or ‘disowned’ arm was consistently and significantly larger.

Lead researcher Lorimer Moseley was quoted in Science Alert (Jan. 2012):

“These findings strengthen the argument that the brain exerts some kind of control over specific body parts according to how strongly we own them.”

“OUTGROWING” PEANUT ALLERGY

No one knew how or why a child “outgrew” a peanut allergy. And even when a child did, this resolution of the allergy was not always permanent. Statistics reported in 2001 indicated that as many as 22% of peanut-allergic children developed tolerance to the food later in life.118 Chances of outgrowing the allergy were improved if the child had low levels of peanut-specific serum IgE antibodies in infancy (less than 5 kU per liter).119

A 1998 study of fifteen children compared those who had outgrown their clinical reactivity to peanut with those who had not. All of the children, “resolvers” and “persisters” alike, had first reacted to peanut at about eleven months of age, and they were retested at about age five.

Although resolvers had much smaller wheals in skin prick tests, blood tests showed that total and peanut-specific IgE levels did not differ between the two groups.120 Allergy to other foods was less common in resolvers (2 of 15) than in persisters (9 of 15).

In a phone follow-up with the resolvers two years later, only one had reacted, by vomiting, after eating peanut. A note of caution no doubt accompanied the news of this apparent resolution, however, since some children previously thought to have outgrown the allergy had reverted.121

An unusual case was reported in 2005 of a child whose peanut allergy resolved following a bone marrow transplant.122 In this instance, not only was a food challenge negative, but peanut-specific IgE was also found to be undetectable (<0.35 kUA/L).

SUMMARY OF RISK FACTORS

The frantic attempts to find common ground among the hundreds of thousands of peanut-allergic children as the 1990s unfolded revealed a profound level of confusion. Intense study had been made of peanut consumption, the role of atopy, and other possible risks, leaving researchers with more questions than answers. And despite the preponderance of research, significant risk factors such as gender, bowel condition, and vaccination were given little or no attention.

Deepening the concern of perplexed doctors was the unanticipated appearance of the allergy in China and Africa through the 2000s. Many had bet their reputations on the idea that just eating peanuts was a primary risk for developing the allergy and that boiling peanuts had protected the Chinese from it.

What ultimately emerged from the tangle of research was a partial profile of the person most at risk for developing the allergy: a child (male to female 2:1) born after 1990 in a Westernized country whose ability to detoxify had been challenged by environmental factors. These factors appeared to include vaccination, but such a proposal amounted to a hypothesis very much at odds with official medical explanations for the phenomenal rise in allergy in the twentieth century.