The genetics revolution is well under way, but how does it affect you in your daily life? Simply put, through adaptation. Dinosaurs adjusted so well to their environment that they dominated life on Earth as its major predators. They pushed the climate barrier, moving into colder zones that now lie in the Arctic (thanks to the shift of tectonic plates). In their diets, some dinosaurs were herbivores and some carnivores. But superb as this adaptability was, a cataclysmic event destroyed the dinosaurs. A giant meteor collision with Earth, thought to have struck in the region of present-day Yucatán in Mexico, created an overnight change in climate. Dust from the impact blotted out the sun all over the globe, the temperature dropped precipitously, and dinosaur DNA didn’t have time to change.
Or did it? Some present-day reptiles survive freezing climates by hibernating through the winter, which is how snakes manage to live in New England, for example. But adaptation takes a long time, eons even, only if a species has to wait for random mutations. Through gene expression, adaptation can occur much more quickly in an individual.
In 1942 a Dutch veterinarian and anatomist named E. J. Slijper reported on a goat born in the 1920s with no functional forelegs. The baby goat adapted to its unfortunate condition by learning how to hop, kangaroo-like, on its hind legs. The goat survived a year before dying accidentally. When Slijper performed an autopsy, he found several surprises. The goat’s hind leg bones had elongated. Its spine was S-shaped, like our human spine, and its bones were attached to its muscles in a way that looked more like a human’s than a goat’s. Two other human characteristics had started to form: a broader, thicker plate of bone protecting the knee and a rounded inner cavity in the abdomen.
It’s startling to think that in one year a new behavior, walking upright, could make it appear as if a goat were becoming human, or at least an animal that walks on two legs, because all of these changes are associated with the evolution of bipedal motion. Gene activities had changed to remodel the goat’s anatomy. For a long time Slijper’s goat attracted no serious attention. In the standard Darwinian view, humans learned to walk on two legs through random mutations that changed our carriage from the stooped posture of other primates, and such mutations almost always occur one at a time. Even without Slijper’s observations, it’s quite challenging for evolutionists to explain how all the anatomical adjustments needed for human beings to walk upright could credibly occur one at a time. Yet these adjustments all work together, and the goat proved that they could arise together, not as mutations but as adaptations. Can the epigenome actually pass along a complete and interconnected set of changes?
While the argument rages back and forth, there is no denying the speed of adaptation in human beings. The question of how much your lifestyle will affect your children and grandchildren hasn’t been settled. But the changes occurring in you are indisputable.
This is why identical twins are not actually identical. Beginning at birth, they start to live different lives and thus become different people, despite carrying virtually duplicate genomes. Identical twins can be quite variable in their susceptibility to disease and in their behavior. Genetic studies of identical twins have traditionally been used to determine what is referred to as the heritability of disease. If one twin gets a certain disease, what are the odds that the other will get the same disease within fifteen years or so? It’s a simple calculation, actually. After studying hundreds of pairs of identical twins, researchers determined that the probability of Alzheimer’s disease occurring in both twins is 79 percent if one of them is afflicted. This means that lifestyle and other non-genetic factors account for the remaining 21 percent of the probability of developing Alzheimer’s, even with identical genomes.
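To make the arithmetic explicit (this is the simplified bookkeeping used here; formal twin studies refine heritability estimates by comparing identical with fraternal twins):

$$\text{non-genetic share} \;=\; 100\% - \text{heritability} \;=\; 100\% - 79\% \;=\; 21\%$$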
In contrast, the heritability of Parkinson’s disease is only about 5 percent; there, lifestyle would appear to play a far greater role. For hip fractures before the age of seventy, the heritability is 68 percent, but after age seventy it drops to 47 percent. For coronary artery disease, the heritability is about 50 percent, no better than a coin flip. For various cancers (colon, prostate, breast, and lung), the heritability in identical twins ranges from 25 to 40 percent, which is why the current view holds that the majority of cancers, perhaps a large majority, are preventable. Epigenetic changes associated with cancer can be induced by factors like chronic exposure to asbestos, solvents, and cigarette smoke. Yet these cancer-causing epigenetic changes might be offset by a healthy diet and exercise, a highly promising possibility.
Physical changes don’t always need physical causes. Sometimes the stimulus can simply be a word. If you meet someone new and fall in love, there’s a dramatic shift in brain activity—this has been thoroughly documented—and if the person you’re smitten with says “I love you,” as opposed to “I’m seeing someone else,” the gene expression in the emotional center of your brain will be dramatically altered. At the same time, chemical messages sent via the endocrine system will create an adaptation in your heart and other organs. Being accepted by a beloved can make you lovesick; being rejected makes you heartsick. There’s a unique pattern of gene expression for each.
There’s solid science behind these age-old experiences. In a 1991 study by microbiologists at the University of Alabama, mice were injected with a chemical that boosted their immune system. This chemical, known as poly I:C (polyinosinic:polycytidylic acid), causes greater activity in a part of the immune system called natural killer cells. At the same time as the mice received the poly I:C injection, the odor of camphor was released in the air. The mice were quickly trained to associate the two, and thereafter a tiny amount of poly I:C would be enough to stimulate the mice’s natural killer cells as long as the smell of camphor was in the air.
The mice’s bodies manufactured on their own the chemicals needed to stimulate their immune system. All they needed was a small trigger. This is an impressive finding, because it shows that genes can adapt in a specific direction with very little prompting. The actual molecules of camphor passing from the nose to the brain of a mouse have no effect on the immune system; it was the learned association with camphor that created the effect. We’ve gone one step further than the zapped cattle, whose behavior was changed by remembering the pain of being shocked. The mice did no conscious learning. Their bodies adapted without the mind (such as it is) having to learn or even think.
Human beings can think, of course, but our body is constantly being affected when we aren’t aware of it. As far as smell is concerned, pheromones given off by the skin are connected to sexual attraction in mammals and seem to play a part in human attraction. In an experiment to test aromatherapy, researchers found that people reliably report a positive mood change after smelling lemon oil as compared with no change after smelling lavender or odorless water. This mood elevation happened whether or not the subjects had ever experienced aromatherapy before. In fact, one group wasn’t told anything about the aromas or what to expect, and their mood still improved upon smelling the lemon oil.
Yet the power of expectation is undeniably strong. In the placebo effect, a subject is given an inert sugar pill and told that it’s a drug for relieving symptoms like pain or nausea, and in 30 to 50 percent of people, the body steps in and produces the chemicals needed to bring about the expected result. As familiar as the placebo effect has become, it’s still remarkable that mere words (“This will help your nausea”) can trigger such a specific response in the connection between brain and stomach. You can even give a subject a drug that causes nausea, and just by being told that it’s an anti-nausea pill, some people will find that their nausea goes away. To complete the picture, there’s the nocebo effect, in which a harmless sugar pill given along with negative expectations (being told, say, that it won’t help or may cause side effects) can actually create ill effects.
We seem to have wandered rather far from how adaptation failed the dinosaurs, but all of these findings are highly relevant. If a mere odor or the words “This will make you feel better” can alter gene expression, and if a totally inert substance can create nausea or make it go away, the whole world of adaptation is wide open. Instead of being like Pavlov’s dogs, which salivated every time they heard a bell associated with mealtime, humans insert another step—interpretation.
In a mouse trained to associate camphor with a stronger immune response, there isn’t any interpretation. Stimulus leads to response. But all attempts to train human behavior run at least a fifty-fifty chance of failing. Positive incentives like money, power, and pleasure affect everyone, but there’s always the person who says no and walks away. Negative incentives like physical punishment, bullying, and extortion are very likely to make people do what their tormentors want them to do, but there are always some who resist and don’t comply. Between the stimulus and the response comes the conscious mind and its ability to interpret the situation and respond accordingly.
So what we have is a feedback loop that is at work in every experience. There’s a triggering event A, leading to mental interpretation B, resulting in response C. This response is remembered by the mind, and the next time the same event A arises, the response won’t be exactly the same. This feedback loop is like a never-ending conversation between mind, body, and the outside world. We adapt quickly and constantly.
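As a loose illustration (a toy model with invented names, not a biological simulation), a few lines of Python can capture the loop’s essential feature: because each response is remembered, the next encounter with the same event A is interpreted, and answered, differently:

```python
# Toy model of the feedback loop: event A -> interpretation B -> response C,
# with each response stored so the next occurrence of A plays out differently.

class FeedbackLoop:
    def __init__(self):
        self.memory = {}  # past responses, keyed by event

    def interpret(self, event):
        # Interpretation B is shaped by whatever happened on earlier encounters
        return {"event": event, "history": self.memory.get(event, [])}

    def respond(self, interpretation):
        # Response C differs with every repetition of the same event
        n = len(interpretation["history"]) + 1
        return f"response #{n} to event {interpretation['event']}"

    def experience(self, event):
        response = self.respond(self.interpret(event))
        self.memory.setdefault(event, []).append(response)  # remember it
        return response

loop = FeedbackLoop()
print(loop.experience("A"))  # response #1 to event A
print(loop.experience("A"))  # response #2 to event A -- memory changed it
```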
To return to the camphor experiments: the result became even more fascinating when the researchers took the same odor of camphor and introduced it while the mice were being injected with a drug that lowered immune response. Once more, after a period of conditioning, the camphor smell alone was enough to impair the mice’s immune response. In other words, the same stimulus (camphor) could induce a specific response and its exact opposite.
In spite of the growing base of evidence supporting epigenetics, some evolutionary biologists are certain to insist that the evolution of our species is entirely random and based solely on natural selection. To even imply that there may be some highly interactive epigenetic program driving the evolution of our species will prompt many a staunch, card-carrying evolutionary biologist to froth at the mouth and label you a “creationist” touting notions of “intelligent design.” We are certainly not suggesting “intelligent design.” However, considering the mounting evidence for the effects of epigenetics on overall health, it’s time to seriously consider what the new genetics is teaching us about our own evolution.
Current findings could make a life-or-death difference. For nearly three decades at Ohio State University, Professor Janice Kiecolt-Glaser and her colleagues have been examining the effects of chronic stress on the immune system. The general picture was already well known. If you are subjected to repeated stress, resistance to disease goes down. In addition, you run the risk of developing disorders like heart disease and hypertension. But people are much less familiar with the dangers of everyday stress, the kind we don’t like but feel we should put up with.
Kiecolt-Glaser’s group looked at a stress that has become far more common recently, taking care of someone with Alzheimer’s disease. The baby boom generation is being sidelined more and more by taking on the responsibility for aging parents with Alzheimer’s, and because professional care is limited and too expensive, millions of grown children find themselves being the last resort for caregiving. As much as we love our parents, round-the-clock caregiving imposes serious chronic stress, day in and day out.
A genetic price is being paid. As a research website from Ohio State reported: “Earlier work by other researchers has shown that mothers caring for chronically ill children developed changes in their chromosomes that effectively amounted to several years of additional aging among those caregivers.” When their attention turned specifically to Alzheimer’s caregivers, it wasn’t surprising that Kiecolt-Glaser’s team found higher indications of depression and other psychological effects. But they also wanted to target the specific cells that showed evidence of genetic changes.
They found them in the telomeres of immune cells. Telomeres, you recall, are the repetitive caps at the ends of chromosomes, like the period at the end of a sentence. Telomeres fray as cells divide over and over, which makes them a marker for aging. “We believe that the changes in these immune cells represent the whole cell population in the body, suggesting that all the body’s cells have aged that same amount,” says Kiecolt-Glaser. She estimates that this accelerated aging deprives Alzheimer’s caregivers of four to eight years of life. In other words, the adaptability of our bodies has serious limitations.
Kiecolt-Glaser pointed out that there are ample existing data showing that stressed caregivers die sooner than people not in that role. “Now we have a good biological reason for why this is the case,” she said. As Rudy was quick to appreciate while sequencing the entire genomes of over fifteen hundred Alzheimer’s patients and their healthy siblings, the genome is chock-full of repetitive sequences of A’s, C’s, T’s, and G’s. Some of these repeat sequences can bind certain proteins residing deep inside the nucleus of the cell, in order to control the activities of nearby genes. Other repeats lie at the tips of chromosomes, and their length is controlled by proteins such as telomerase. The longer the chromosome tips stay stable (rebuilt by telomerase), the longer the cell survives.
The fact is, over our lifetime we adapt to our environment every day by modifying our bodies, down to the level of our gene activities. Your next meal, your next mood, your next hour of exercise: each modifies your body in an endless flow of change. Darwin explained how a species adapts to the environment over eons, allowing for tens of millions of years during which dinosaurs arose and then changed into birds. To a strict Darwinist, flight feathers are a physical adaptation to environmental pressure and nothing more. But in fact our genomes are adapting in real time at every moment of our lives in the form of gene activity. Is it possible that these adaptations are a driving force all on their own?
This is a hot-button issue right now. For the vast majority of evolutionary biologists, putting adaptation before mutation is unacceptable. But there are exceptions. In a New Scientist article from January 2015 titled “Adapt First, Mutate Later,” reporter Colin Barras brings up Slijper’s goat in a new context. A primitive fish from Africa called the bichir has the ability to survive on land. As an adaptation, walking on land aids survival in the drought season by allowing the bichir to leave a dried-up pool to find fresh water as well as new sources of food and a wider territory to colonize. Other species have the same adaptation. When the walking catfish of Southeast Asia (Clarias batrachus) escaped into the wild in Florida, it became highly invasive, traveling over land. The walking catfish doesn’t use two legs but wriggles along, propped up on its front or pectoral fins, which keep its head up. As long as they stay moist, these catfish can remain out of the water almost indefinitely.
This adaptation to land travel reminded Emily Standen, an evolutionist at the University of Ottawa, of how ancestral fish emerged from the oceans hundreds of millions of years ago. A newly discovered fossil fish named Tiktaalik roseae, some 375 million years old, caused a sensation by providing physical evidence of this epochal change in life on Earth: its skeleton was fishlike but with new features resembling those of tetrapods, the four-legged land dwellers. Standen specializes in the mechanics of evolving species, and she wondered whether these same adaptations could be sped up. They could, quite dramatically.
Standen and her team raised bichir fish on land. Forced to wriggle on their fins more than they normally would in the wild, the fish changed their behavior, becoming more efficient walkers. They placed their fins closer to their bodies and raised their heads up higher. Their skeletons also showed developmental changes: the bones supporting the fins had changed shape in response to bearing more weight (water buoys up a fish’s body; land does not). As with Slijper’s goat, a whole group of necessary adaptations had formed together. It will take time to see how far this line of research will take us, but it already suggests exactly what the title of the New Scientist article says: “Adapt first, mutate later.”
This revisionist thinking is a lot to take in, but we assure you that it is all leading to something great. Replacing the simple cause-and-effect model of evolution with a cloud of vague influences is unsettling. The same holds true for your body right this minute. On any given day it is bombarded with influences: through food, behavior, mental activity, the five senses, and everything happening in the environment. Which one will be decisive? Genes can predispose you to depression or type 2 diabetes or certain kinds of cancer, yet only a percentage of people with such a predisposition will have the gene activated. Locating the specific factor, or factors, that will activate a specific gene is like throwing a deck of cards in the air and plucking out the ace of spades as the cards scatter.
Scientists don’t like giving up straight-line cause and effect. Many hate the very idea. So we are left with a model that looks like a traditional matryoshka, or Russian nesting doll: inside the biggest doll is a smaller one, inside that a still smaller one, and so on, down to the last, tiniest doll. Nesting dolls are delightful, but what if you claimed that the biggest doll was actually built by the one inside it, and that one by the next smallest, and so on?
That’s essentially where genetics has led us. Sometimes the genetic picture is uncomplicated enough that no ambiguity arises. Imagine you see one white flamingo standing out among thousands of pink ones. What caused it to be white? A linear sequence of reasoning gives the answer. First comes the genus Phoenicopterus, which contains six species of flamingos divided between the Americas and Africa. Each has a dominant gene that produces pink feathers generation after generation. But any gene can mutate or fail to be expressed, leading at random to albinism in a single chick. The number of chicks born with white feathers can be statistically predicted, and there the story ends.
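For illustration only (the numbers here are invented, and real flamingo genetics is messier): if white plumage behaves as a simple recessive trait whose allele occurs at frequency $q$, the Hardy-Weinberg rule gives the expected count of white chicks directly:

$$\text{expected white chicks} \;=\; N q^{2}$$

so a colony of $N = 10{,}000$ hatchlings with $q = 0.01$ would be expected to produce about one white chick.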
We’re using Russian-doll reasoning here, going to smaller and smaller levels of nature in search of causes. This is the reductionist method, which has time-honored value in science. Chasing nature down to its smallest component is the very business of science, whether it’s a physicist chasing down subatomic particles or a geneticist chasing down methyl marks on a gene. But there’s a problem here, and it’s quite crucial.
Consider someone who has become obese, joining the epidemic of obesity that has swept through developed countries. There are many theories about why an individual becomes obese. Stress, hormonal imbalance, bad eating habits formed in childhood, and the excess of refined sugar and starches in the modern diet have all been suggested. Using Russian-doll reasoning, the eventual explanation would be traced to the genetic level. There was once a committed search for “the obesity gene,” bolstered by statistical evidence showing that being overweight runs in families, but that project met with only limited success, identifying some genes (e.g., the FTO gene) whose DNA variants mildly predispose carriers to obesity. As with disorders like schizophrenia that have a genetic component, the genetic influence at best provides a predisposition.
Today, a smaller doll has been found in the form of epigenetics and the switches it controls. Almost every factor that might contribute to obesity, whether it’s too much stress, excess sugar, bad eating habits, or hormonal imbalance, would theoretically be regulated by the epigenome, the switching station that turns experience into genetic alterations. But here the reductionist line of reasoning hits a wall. It is extremely difficult to tell which particular experience creates which mark on which gene, thereby shifting gene activity. Some people grow obese with or without stress, with or without sugar, and so on. As a result, it is impossible to predict with any accuracy how past or future experiences will alter your gene activity. The same cloud of causes that surrounds why Dutch men suddenly grew so tall hangs over a great deal of epigenetics. Something is creating methyl marks, but the mark is material in nature while the something that caused it often is not. An environmental toxin can cause epigenetic changes, but so can a strong emotion, like fear, at least in mice so far.
If you look deeper, the basic assumption that a material cause must lie behind every epigenetic mark turns out to be wobbly. It is the entire range of life experience, from physical interactions to emotional reactions, that governs the chemical modification of certain genes with methyl marks. A methyl mark, you recall, is the most studied means by which the epigenome modifies a gene, and it is extremely small: chemically, a methyl group is no more than a carbon atom linked to three hydrogen atoms. Methylation marks only the C (cytosine) base, sticking to it like a remora, or sucker fish, to the belly of a shark; the cytosine molecule is many times bigger. It’s been shown that when DNA is modified with more methyl marks, some portion of it is switched off. So we seem to be at the smallest doll, the one that switches all the bigger ones. Ninety percent of the modifications in DNA associated with disease are located in the switching regions of genes. Moreover, epigenetics has a remarkable effect on prenatal development, personality and behavioral tics, and susceptibility to disease, above and beyond the genes and mutations inherited from our parents.
How your mother lived her life while carrying you in the womb may affect your own gene activities, and your risk for disease, decades later. Canadian researchers at the University of Lethbridge subjected adult rats to stressful conditions and then studied their offspring. The daughter rats of stressed mothers had shorter pregnancies. Even the granddaughter rats, whose own mothers were not stressed, had shorter pregnancies. The researchers proposed that the effect was epigenetic. More specifically, they stated that the epigenetic changes brought on by stress involve what are called micro-RNAs,* tiny segments of RNA made from the genome that then regulate gene activity.
Leaving aside potential abnormalities that medical research can focus on, switching is how all of us got here. It’s basic to the journey by which a single fertilized cell in a mother’s womb grows into a fully formed, healthy baby. As this first cell divides, every future cell contains the same DNA. But to develop a baby, there have to be liver cells, heart cells, brain cells, and so on, all different from one another. The epigenome and its marks regulate these differences. Researchers realized that a map of the epigenome was urgently needed in order to locate how each type of cell is determined in the development of an embryo in the womb. Four countries—the United States, France, Germany, and the United Kingdom—have funded the Human Epigenome Project, whose mission is to show where all the relevant marks are, or in official language, to “identify, catalog, and interpret genome-wide DNA methylation patterns of all human genes in all major tissues.”
With the participation of over two hundred scientists, a milestone was marked in February 2015 by the publication of twenty-four papers describing, out of the millions of switches involved, those that determine the development of over one hundred types of cells in our bodies. This effort involved thousands of experiments with adult tissue as well as fetal and stem cells. (In theory, counting all the spots on all the leopards in the world would be easier.) The chemicals that regulate different kinds of cells were already known, and sometimes the switches for them aren’t close to the affected gene. In fact, switch A can be located at a considerable distance from gene B. In such cases the researchers sometimes had to infer the switch’s role by looking at the chemical regulator. If it was present in a cell, they inferred that the switch was turned on.
Arriving at this portion of the epigenome map was an exciting development. Switching key genes on and off potentially might be the best route to preventing and curing a host of diseases. As the researchers acknowledge, locating all these switches gives them mountains of new data, but that’s only a beginning. In the activity of DNA, switches interact; they form circuits called networks; they can even act on the genes from a distance. Unraveling all the circuitry doesn’t indicate why the activity arises, any more than mapping the location of every telephone in a city tells you what people are saying to each other when they call. Different regions of the genome can be turned on in parallel via epigenetics owing to a three-dimensional reorganization of the genome (such as folding the DNA strand into a loop) that brings those regions into close proximity.
There is also the effect that epigenetics has on a child’s early life after leaving the womb. This period is like a pivot between the mother’s epigenetic influence and the experiences that belong to the infant. How important is the overlap between the two? This question is central to medical issues surrounding infants, one of which is peanut allergies. As reported in the New York Times in February 2015, about 2 percent of children in the United States are allergic to peanuts, a number that has quadrupled since 1997. No one can explain why; the sharp rise in all allergies over the past few decades, a rise that holds true across all Western countries, remains a mystery.
A child with a strong peanut allergy can die from exposure to even a small trace of peanuts in food. The standard recommendation to withhold peanuts rested on the belief that giving peanut butter and other peanut foods to infants increases their risk of developing the allergy. But a compelling study published in the New England Journal of Medicine in 2015 has turned conventional wisdom on its head. Feeding infants foods like peanut butter early in life “dramatically decreases the risk of development of peanut allergy,” the study’s authors concluded. This was heartening news, since it indicated that a simple step in infant care could reduce or even reverse a rising trend.
The new study was based in London, where 530 infants considered at risk for developing peanut allergy (for example, they might already be allergic to eggs or milk) were divided into two groups. Starting when the infants were between four and eleven months old, one group was fed food containing peanuts, while the other had such foods withheld. By age five, the group exposed to peanuts had a far lower incidence of allergy: 1.9 percent, as compared with 13.7 percent for those whose parents avoided feeding them peanut foods. In fact, the authors speculated that having concerned parents keep peanuts away from their infant children might actually have caused the dramatic rise in peanut allergies.
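Putting the study’s two figures on a single scale (simple arithmetic on the reported percentages, not a statistic quoted from the paper itself):

$$\text{relative risk} \;=\; \frac{1.9\%}{13.7\%} \;\approx\; 0.14$$

that is, the infants fed peanut foods developed the allergy at roughly one-seventh the rate, about an 86 percent relative reduction.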
For quite a while parents have been confused over the issue of allergies and newborns, not just relating to peanuts. Before this new finding, the data weren’t clear. As we’ve discussed, a newborn baby inherits its mother’s immune system, which serves as a bridge while the baby begins to develop its own antibodies. The thymus gland, located in the chest roughly between the lungs and in front of the heart, is where the immune system’s T cells mature. When the body is invaded by outside viruses, bacteria, or everyday substances like pollen, T cells are responsible for recognizing which invaders to repel. An allergy is like a case of mistaken identity, in which an innocent substance is identified as a foe, leading to an allergic reaction created by the body itself, not by the invader.
The thymus is at its most active from right after birth through childhood; once someone has developed a full complement of T cells, the organ atrophies after puberty. The question with allergies centers on how much of our immunity is inherited genetically and how much is shaped by the environment after we’re born. The obvious way to explain the alarming rise of allergies in developed countries would be pollution: the more polluted the environment, the worse the problem should be. But after the fall of the Soviet Union in 1991 and the opening up of its satellite countries, which generally have much higher pollution levels than the United States or Western Europe, investigators were stymied to find that highly polluted areas in Eastern Europe showed lower allergy rates than the West.
Then it was thought that the reverse might be true: Western countries are overly clean and sanitized, depriving the immune system of exposure to substances it needs in order to adapt. This is why the peanut-allergy finding could be so significant. American Academy of Pediatrics guidelines issued in 2000 recommended that infants up to age three shouldn’t eat foods with peanuts in them if they were at risk for developing the allergy. By 2008 the academy acknowledged that there was no conclusive evidence that avoiding peanuts was effective beyond the age of four to six months, but no study had yet shown that abandoning avoidance was the right course. The first real clue came in a 2008 survey published in the Journal of Allergy and Clinical Immunology, which found that the rate of peanut allergy among children in Israel was one-tenth that among Jewish children in the United Kingdom. The significant difference seemed to be that Israeli children consume peanut foods in their first year, especially Bamba, a popular snack that combines puffed corn and peanut butter, while British children don’t if their parents are allergy conscious.
The new study, however, doesn’t apply to other foods that children develop allergies to. And two major questions remain to be answered: First, if the children who were fed peanut foods stop eating them, are they liable to develop the allergy? This question is being studied in a follow-up with the original subjects. Second, are the results applicable to kids at low risk for food allergies? That’s unknown, but researchers tend to feel that eating peanut food will do them no harm. Asking anxious parents to change their habits, however, may be difficult, since standard care has made such an issue of avoiding the “wrong” foods.
We’ve gone into some detail, not because we have the answer to allergies, but to make clear how uncertain environmental influences can be, even though it’s known in a general way that epigenetic marks are sensitive to them. The miraculous development of a human from embryo to infant, toddler, adolescent, and adult involves an intricate dance between genes and the environment. In mammals, interactions between newborns and their parents can have profound effects on health decades later. Although many findings in this area have emerged only from mouse and rat studies, there is increasing evidence that they may pertain to humans as well. For example, mounting evidence shows that early-life abuse, neglect, and mistreatment lead to epigenetic effects on gene activity that adversely affect physical and mental health later in life.
For good or ill, early events shaping the bonds between parent and child have profound effects on the child’s brain development and personality. But how do these bonds get established? Increasingly, studies show that epigenetic modifications of the child’s genes are largely responsible, guided by experiences that begin in the earliest days of life. When a mother acts detached from her child, the result can be a dysfunctional hypothalamic-pituitary-adrenal (HPA) stress response, impaired cognitive development, and the elevation of cortisol, a hormone toxic at chronically high levels, as measured in the child’s saliva.
Some abused children die young, and in these tragic cases their brains can be studied at autopsy. Research of this type has shown clear evidence of epigenetic modification (increased methylation) of the gene NR3C1, which encodes the glucocorticoid receptor; the result is nerve cell death in the hippocampus, the brain region used for short-term memory. In living children, the same gene modification can be found in the saliva of emotionally, physically, and sexually abused kids. Such damage can lead to subsequent psychopathic behavior.
These findings extend the long-held understanding that early abuse and neglect have profound psychological effects. Now we can trace the damage to the cellular level. In the search for the biological changes that underlie these events, epigenetic pathways controlling gene expression in the brain are increasingly being implicated. By the same token, it may be possible in the future to test the effectiveness of psychological or drug therapies by looking to see if the ill effects in the epigenome have been reversed.
Progress has already been made in animal studies. In 2004 a study at McGill University conducted by neuroscientist Dr. Michael Meaney showed that baby rats that were groomed (licked) often by their mothers had increased levels of glucocorticoid receptors in the brain, resulting in a reduction in anxiety and aggressive behavior. How were these behavioral changes achieved? Again, by epigenetics. Rat pups that received affectionate nurturing and grooming from their mothers underwent less methylation of their glucocorticoid receptor genes, resulting in decreased amounts of cortisol and thereby lowering anxiety, aggression, and the stress response.
The most controversial area in epigenetics has to do with later generations being affected by stress and abuse today. When male mice are separated from their mothers after birth, they can suffer from anxiety and features of depression, like listlessness, that are then passed on to subsequent generations. The negative epigenetic changes are actually found in the sperm of the separated males, the sperm then serving as the vehicle for transmission to the offspring. Related studies have shown that a whole host of influences, from poor diet and stress to exposure to toxins (for example, pesticides that lead to epigenetic modifications in the brains and sperm of mice), can produce effects that are then transmitted to the next generation.
A profound example of how we may be able to affect our own gene activity comes from a study straight out of science fiction. A Swiss-French team in Zurich was inspired by an innovative game called Mindflex, which comes with a headset that picks up brain waves from the player’s forehead and earlobes. By focusing on a light foam ball, the player can raise or lower it on a column of air. The game consists of moving the ball through an obstacle course using thought alone.
The researchers wondered if the same approach could alter gene activity. They devised an electroencephalograph (EEG) helmet that analyzed brain waves and could then transmit them wirelessly via Bluetooth. As reported by Engineering & Technology (E&T) magazine in November 2014, the brain waves were turned into an electromagnetic field inside a unit that powered an implant inside a cell culture. The implant was fitted with a light-emitting-diode (LED) lamp that emitted infrared light. The light then triggered the production of a specific desired protein in the cells. One of the lead researchers commented, “Controlling genes in this way is completely new and is unique in its simplicity.”
The researchers used infrared light because it doesn’t harm cells yet penetrates deep into tissue. After remote brain transmissions worked on tissue samples, the team progressed to mice, where the approach was also successful. Human test subjects were asked to wear the EEG helmet and to control the production of proteins in mice simply by using their thoughts. Of three groups, the first was asked to concentrate by playing Minecraft on a computer. As reported in the E&T article: “This group only achieved limited results, as measured by the concentration of the protein in the bloodstream of the mice. The second group, in a state of meditation or complete relaxation, induced a much higher rate of protein expression. The third group, using the method of biofeedback, was able to consciously turn off and on the LED light implanted in the body of a test mouse.”
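To give a feel for the biofeedback step described in the third group (this is a toy simulation with made-up readings and thresholds, not the Zurich team’s actual system), here is a sketch in Python in which a simulated “LED” switches on only when brain-wave power is held above a threshold:

```python
# Toy biofeedback loop (hypothetical values): the "LED" turns on only after the
# subject holds brain-wave power above a threshold for several readings in a row.

def biofeedback_led(power_stream, threshold=0.7, hold_samples=3):
    """Yield the LED state (True = on) for each successive power reading."""
    streak = 0
    for power in power_stream:
        streak = streak + 1 if power >= threshold else 0  # count sustained focus
        yield streak >= hold_samples

readings = [0.2, 0.8, 0.9, 0.75, 0.6, 0.85, 0.9, 0.95]  # simulated EEG power
for t, led_on in enumerate(biofeedback_led(readings)):
    print(f"t={t}: power={readings[t]:.2f} -> LED {'ON' if led_on else 'off'}")
```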
Beyond the amazing implication that thought can directly influence gene activity, this approach could someday help patients with epilepsy. Just before a seizure, the epileptic brain generates a particular type of electrical activity; that signal could be used to activate a light-controlled genetic implant that instantaneously produces an antiseizure drug or switches certain genes on or off at the very onset of the seizure. A similar strategy could be employed to treat chronic pain by producing painkilling drugs in the brain as soon as the first signs of pain occur.
All in all, our genome is a fantastically nimble assembly of DNA and proteins that is constantly being remodeled in structure and gene activity, and much of this remodeling appears to be in response to how we live our lives. But the Russian-doll problem cannot be swept aside. It’s apparent now that chemically induced switches are at the root of shifting gene activity. That much is indisputable. A lifestyle-driven switch in gene activity can be brought on by a small methyl group stuck to a gene, leaving a telltale mark. Without such chemical modification, a stem cell might never commit to becoming a particular brain cell rather than a liver or heart cell. Indeed, it might not develop into anything at all but just keep dividing over and over, the way a cancerous tumor forms.
Methyl marks are not only chemical modifications turning off gene activity but are also like musical notes representing the symphony of more complex gene interactions. By reading the marks as a group, we can get a sense of networks of activity that correspond to how we (and perhaps our parents and grandparents) lived. It might be possible to read directly from the epigenome the specific experiences involved, like living through a famine. Looking at the marks as the score of a symphony makes sense because it takes a multitude of notes before music can really be grasped. Looking at one bar of a symphony provides only a snapshot. Likewise, trying to find the smallest Russian doll doesn’t tell you the whole genetic story.
In genetics, the marks are being deciphered chemically, but connecting them to what they mean in terms of experience faces major challenges. First, we can’t actually observe these changes in real time. Second, we can’t connect experience A to genetic change B with any specificity, except in a few cases. It should be possible to find epigenetic alterations from cigarette smoking, for example, yet even then, as we’ve said, not everyone suffers the same damage from smoking. While we know how chemical marks on certain genes come about, we cannot say how a certain type of life experience (e.g., prolonged famine) causes specific marks to appear on specific genes in exquisitely precise areas of the genome.
Presently, the biggest challenge remains the missing connection between marks and meaning. When a violinist sees the marks that begin Beethoven’s Fifth Symphony—the familiar ta-ta-ta-DUM—he goes into action, moving his arm up and down across the strings of the violin. You can see his arm move, but behind this action lie many invisible elements. The violinist knows what the notes stand for, having learned to read music. They aren’t just random black-and-white marks on a page. His mind turns the notes into highly coordinated actions between brain, eye, arm, and fingers. Finally, and hardly ever mentioned because it’s so obvious, a human being, Ludwig van Beethoven, was inspired to write the symphony and invented the four-note motif known to the whole world. Hundreds of bars of music are based on this simple group of notes.
Even with this knowledge, how does the chemical choreography of thousands of genes and their chemically controlled on/off switches deliver the amazing ability of a brain to think? No one knows. How did the brain evolve over eons in response to programming by newly arising mutations? Darwinian genetics would say that all these mutations occurred randomly. But how could this be the whole story, considering that epigenetic modifications in response to how we live our lives may well determine where in the genome new mutations arise? In that case even Darwin would have to admit that not all mutations occur randomly.
Of course, Darwin could have had no idea about epigenetics in his lifetime. But what if he had? Darwin might then tell us that our evolution involves the interplay of both epigenetic marks and new gene mutations. Darwin shocked his contemporaries by excluding God, or any mindful Creator, from his explanation of how modern humans came to be. Certainly, in the study of genetics, assuming some kind of higher intelligence behind the scenes doesn’t help us to understand how we evolved. But we can now consider an inherent organizing principle in the evolutionary process that transcends the single-minded concept of random mutations and survival of the fittest. In this new model of evolution, methyl marks on thousands of genes, together with their histone partners, working hand in hand with the genome, would help determine where new mutations arise (in part by influencing the three-dimensional structure of DNA). Then Darwin’s natural selection can take over to decide which new mutations persist. In this intriguing albeit speculative scenario, we aren’t just blowing in the wind waiting for random mutations to arise. We are directly influencing the future evolution of our genome through the choices we make.
* Note: The DNA between genes used to be called “junk” DNA. We now know that this intergenic DNA can be used to produce tiny molecules called micro-RNAs, which control gene activities throughout the genome.