No one could have guessed that an anonymous gray metal like rhodium could produce anything as wondrous as L-dopa. But even after hundreds of years of chemistry, elements continually surprise us, in ways both benign and not. Elements can muddle up our unconscious, automatic breathing; confound our conscious senses; even, as with iodine, betray our highest human faculties. True, chemists have a good grasp of many features of elements, such as their melting points or abundance in the earth’s crust, and the eight-pound, 2,804-page Handbook of Chemistry and Physics—the chemists’ Koran—lists every physical property of every element to far more decimal places than you’d ever need. On an atomic level, elements behave predictably. Yet when they encounter all the chaos of biology, they continue to baffle us. Even blasé, everyday elements, if encountered in unnatural circumstances, can spring a few mean surprises.
On March 19, 1981, five technicians undid a panel on a simulation spacecraft at NASA’s Cape Canaveral facility and entered a cramped rear chamber above the engine. A thirty-three-hour “day” had just ended with a perfectly simulated liftoff, and with the space shuttle Columbia—the most advanced spacecraft ever designed—set to launch on its first mission in April, the agency had understandable confidence. The hard part of their day over, the technicians, satisfied and tired, crawled into the compartment for a routine systems check. Seconds later, eerily peacefully, they slumped over.
Until that moment, NASA had lost no lives on the ground or in space since 1967, when three astronauts had burned to death during training for Apollo 1. At the time, NASA, always concerned about cutting payload, allowed only pure oxygen to circulate in spacecrafts, not air, which contains 80 percent nitrogen (i.e., 80 percent deadweight). Unfortunately, as NASA recognized in a 1966 technical report, “in pure oxygen [flames] will burn faster and hotter without the dilution of atmospheric nitrogen to absorb some of the heat or otherwise interfere.” As soon as the atoms in oxygen molecules (O2) absorb heat, they dissociate and raise hell by stealing electrons from nearby atoms, a spree that makes fires burn hotter. Oxygen doesn’t need much provocation either. Some engineers worried that even static electricity from the Velcro on astronauts’ suits might ignite pure, vigorous oxygen. Nevertheless, the report concluded that although “inert gas has been considered as a means of suppressing flammability… inert additives are not only unnecessary but also increasingly complicated.”
Now, that conclusion might be true in space, where outside pressure is nil and even a little interior gas keeps a spacecraft fully pressurized. But when training on the ground, in earth’s heavy air, NASA technicians had to pump the simulators with far more oxygen to keep the walls from crumpling—which meant far more danger, since even small flames burn wildly in pure oxygen. When an unexplained spark went off one day during training in 1967, fire engulfed the module and cremated the three astronauts inside.
A disaster has a way of clarifying things, and NASA decided inert gases were necessary, complicated or not, in all shuttles and simulators thereafter. By the 1981 Columbia mission, they filled any compartment prone to produce sparks with inert nitrogen (N2). Electronics and motors work just as well in nitrogen, and if sparks do shoot up, nitrogen—which is locked into molecular form more tightly than oxygen—will smother them. Workers who enter an inert compartment simply have to wear gas masks or wait until the nitrogen is pumped out and breathable air seeps back in—a precaution not taken on March 19. Someone gave the all clear too soon, the technicians crawled into the chamber unaware, and they collapsed as if choreographed. The nitrogen not only prevented their neurons and heart cells from absorbing new oxygen; it pickpocketed the little oxygen cells store up for hard times, accelerating the technicians’ demise. Rescue workers dragged all five men out but could revive only three. John Bjornstad was dead, and Forrest Cole died in a coma on April Fools’ Day.
In fairness to NASA, in the past few decades nitrogen has asphyxiated miners in caves and people working in underground particle accelerators,* too, and always under the same horror-movie circumstances. The first person to walk in collapses within seconds for no apparent reason. A second and sometimes third person dash in and succumb as well. The scariest part is that no one struggles before dying. Panic never kicks in, despite the lack of oxygen. That might seem incredible if you’ve ever been trapped underwater, when the instinct not to suffocate will buck you toward the surface. But our hearts, lungs, and brains actually have no gauge for detecting oxygen. Those organs judge only two things: whether we’re inhaling some gas, any gas, and whether we’re exhaling carbon dioxide. Carbon dioxide dissolves in blood to form carbonic acid, so as long as we purge CO2 with each breath and tamp down the acid, our brains will relax. It’s an evolutionary kludge, really. It would make more sense to monitor oxygen levels, since that’s what we crave, but it’s easier—and usually good enough—for cells to check that carbonic acid is close to zero, so they do the minimum.
Nitrogen thwarts that system. It’s odorless and colorless and causes no acid buildup in our veins. We breathe it in and out easily, so our lungs feel relaxed, and it snags no mental trip wires. It “kills with kindness,” strolling through the body’s security system with a familiar nod. (It’s ironic that the traditional group name for the elements in nitrogen’s column, the “pnictogens,” comes from a Greek word for “choking” or “strangling.”) The NASA workers—the first casualties of the doomed space shuttle Columbia, which would disintegrate over Texas twenty-two years later—likely felt light-headed and sluggish in their nitrogen haze. But anyone might feel that way after thirty-three hours of work, and because they could exhale carbon dioxide just fine, little more happened mentally before they blacked out and nitrogen shut down their brains.
Because it has to combat microbes and other living creatures, the body’s immune system is more biologically sophisticated than its respiratory system. That doesn’t mean it’s savvier about avoiding deception. At least, though, with some of the chemical ruses against the immune system, the periodic table deceives the body for its own good.
In 1952, Swedish doctor Per-Ingvar Brånemark was studying how bone marrow produces new blood cells. Having a strong stomach, Brånemark wanted to watch this directly, so he chiseled out holes in the femurs of rabbits and covered the holes with an eggshell-thin titanium “window,” which was transparent to strong light. The observation went satisfactorily, and Brånemark decided to snap off the expensive titanium screens for more experiments. To his annoyance, they wouldn’t budge. He gave up on those windows (and the poor rabbits), but when the same thing happened in later experiments—the titanium always locked like a vise onto the femur—he examined the situation a little closer. What he saw made watching juvenile blood cells suddenly seem vastly less interesting and revolutionized the sleepy field of prosthetics.
Since ancient times, doctors had replaced missing limbs with clumsy wooden appendages and peg legs. During and after the industrial revolution, metal prostheses became common, and disfigured soldiers after World War I sometimes even got detachable tin faces—masks that allowed the soldiers to pass through crowds without drawing stares. But no one was able to integrate metal or wood into the body, the ideal solution. The immune system rejected all such attempts, whether made of gold, zinc, magnesium, or chromium-coated pig bladders. As a blood guy, Brånemark knew why. Normally, posses of blood cells surround foreign matter and wrap it in a straitjacket of slick, fibrous collagen. This mechanism—sealing the hunk off and preventing it from leaking—works great with, say, buckshot from a hunting accident. But cells aren’t smart enough to distinguish between invasive foreign matter and useful foreign matter, and a few months after implantation, any new appendages would be covered in collagen and start to slip or snap free.
Since this happened even with metals the body metabolizes, such as iron, and since the body doesn’t need titanium even in trace amounts, titanium seemed an unlikely candidate for acceptance by the immune system. Yet Brånemark found that for some reason, titanium hypnotizes blood cells: it triggers zero immune response and even cons the body’s osteoblasts, its bone-forming cells, into attaching themselves to it as if there were no difference between element twenty-two and actual bone. Titanium can fully integrate itself into the body, deceiving it for its own good. Since 1952, it’s been the standard for implanted teeth, screw-on fingers, and replaceable sockets, like the hip socket my mother received in the early 1990s.
Due to cosmically bad luck, arthritis had cleaned out the cartilage in my mother’s hip at a young age, leaving bone grinding on bone like a jagged mortar and pestle. She got a full hip replacement at age thirty-five, which meant having a titanium spike with a ball on the end hammered like a railroad tie into her sawed-off femur and the socket screwed into her pelvis. A few months later, she was walking pain-free for the first time in years, and I happily told people she’d had the same surgery as Bo Jackson.
Unfortunately, partly because of her unwillingness to take it easy around her kindergartners, my mother’s first hip failed within nine years. The pain and inflammation returned, and another team of surgeons had to cut her open again. It turned out that the plastic component inside the fake hip socket had begun to flake, and her body had dutifully attacked the plastic shards and the tissue around them, covering them with collagen. But the titanium socket anchored to her pelvis hadn’t failed and in fact had to be snapped off to fit the new titanium piece. As a memento of her being their youngest two-time hip replacement patient ever, the surgeons at the Mayo Clinic presented my mother with the original socket. She still has it at home, in a manila envelope. It’s the size of a tennis ball cut in half, and even today, a decade later, bits of white bone coral are unshakably cemented to the dark gray titanium surface.
Still more advanced than our unconscious immune system is our sensory equipment—our touch and taste and smell—the bridges between our physical bodies and our incorporeal minds. But it should be obvious by now that new levels of sophistication introduce new and unexpected vulnerabilities into any living system. And it turns out that the heroic deception of titanium is an exception. We trust our senses for true information about the world and for protection from danger, and learning how gullible our senses really are is humbling and a little frightening.
Alarm receptors inside your mouth will tell you to drop a spoonful of soup before it burns your tongue, but, oddly, chili peppers in salsa contain a chemical, capsaicin, that sets off those same heat receptors. Peppermint cools your mouth because minty menthol trips the cold receptors, leaving you shivering as if an arctic blast just blew through. Elements pull similar tricks with smell and taste. If someone spills the tiniest bit of tellurium on himself, he will reek like pungent garlic for weeks, and people will know he’s been in a room for hours afterward. Even more baffling, beryllium, element four, tastes like sugar. More than any other nutrient, humans need quick energy from sugar to live, and after millennia of hunting for sustenance in the wild, you’d think we’d have pretty sophisticated equipment to detect sugar. Yet beryllium—a pale, hard-to-melt, insoluble metal with small atoms that look nothing like ringed sugar molecules—lights up taste buds just the same.
This disguise might be merely amusing, except that beryllium, though sweet in minute doses, scales up very quickly to toxic.* By some estimates, up to one-tenth of the human population is hypersusceptible to something called acute beryllium disease, the periodic table equivalent of a peanut allergy. Even for the rest of us, exposure to beryllium powder can scar the lungs with the same chemical pneumonitis that inhaling fine silica causes, as one of the great scientists of all time, Enrico Fermi, found out. When young, the cocksure Fermi used beryllium powder in experiments on radioactive uranium. Beryllium was excellent for those experiments because, when mixed with radioactive matter, it slows emitted particles down. And instead of letting particles escape uselessly into the air, beryllium spikes them back into the uranium lattice to knock more particles loose. In his later years, after moving from Italy to the United States, Fermi grew so bold with these reactions that he started the first-ever nuclear chain reaction, in a University of Chicago squash court. (Thankfully, he was adept enough to stop it, too.) But while Fermi tamed nuclear power, simple beryllium was doing him in. He’d inadvertently inhaled too much of this chemists’ confectioner’s powder as a young man, and he succumbed to pneumonitis at age fifty-three, tethered to an oxygen tank, his lungs shredded.
Beryllium can lull people who should know better in part because humans have such a screwy sense of taste. Now, some of the five types of taste buds are admittedly reliable. The taste buds for bitter scour food, especially plants, for poisonous nitrogen chemicals, like the cyanide in apple seeds. The taste buds for savory, or umami, lock onto glutamate, the G in MSG. As an amino acid, glutamate helps build proteins, so these taste buds alert you to protein-rich foods. But the taste buds for sweet and sour are easy to fleece. Beryllium tricks them, as does a special protein in the berries of some species of plants. Aptly named miraculin, this protein strips out the unpleasant sourness in foods without altering the overtones of their taste, so that apple cider vinegar tastes like apple cider, or Tabasco sauce like marinara. Miraculin does this both by muting the taste buds for sour and by bonding to the taste buds for sweet and putting them on hair-trigger alert for the stray hydrogen ions (H+) that acids produce. Along those same lines, people who accidentally inhale hydrochloric or sulfuric acid fumes often recall their teeth aching as if they’d been force-fed raw, extremely sour lemon slices. But as Gilbert Lewis proved, acids are intimately bound up with electrons and other charges. On a molecular level, then, “sour” is simply what we taste when our taste buds open up and hydrogen ions rush in. Our tongues conflate electricity, the flow of charged particles, with sour acids. Alessandro Volta, an Italian count and the namesake of the volt, demonstrated this back around 1800 with a clever experiment. Volta had a number of volunteers form a chain, each pinching the tongue of a neighbor, and the two people on the ends then put their fingers on battery leads. Instantly, up and down the line, people tasted each other’s fingers as sour.
The taste buds for salty also are affected by the flow of charges, but only the charges on certain elements. Sodium triggers the salty reflex on our tongues most strongly, but potassium, sodium’s chemical cousin, gets a free ride and tastes salty, too. Both elements exist as charged ions in nature, and it’s mostly that charge, not the sodium or potassium per se, that the tongue detects. We evolved this taste because potassium and sodium ions help nerve cells send signals and muscles contract, so we’d literally be brain-dead and our hearts would stop without the charge they supply. Our tongues taste other physiologically important ions such as magnesium and calcium* as vaguely salty, too.
Of course, taste being so complicated, saltiness isn’t as tidy as that last paragraph implies. We also taste physiologically useless ions that mimic sodium and potassium as salty (e.g., lithium and ammonium). And depending on what sodium and potassium are paired with, even they can taste sweet or sour. Sometimes, as with potassium chloride, the same molecules taste bitter at low concentrations but metamorphose, Wonka-like, into salt licks at high concentrations. Potassium can also shut the tongue down. Chewing raw potassium gymnemate, a chemical in the leaves of the plant Gymnema sylvestre, will neuter miraculin, the miracle protein that turns sour into sweet. In fact, after chewing potassium gymnemate, the cocaine-like rush the tongue and heart usually get from glucose or sucrose or fructose reportedly fizzles out: piles of raw sugar heaped on the tongue taste like so much sand.*
All of this suggests that taste is a frighteningly bad guide to surveying the elements. Why common potassium deceives us is strange, though perhaps an overeager tongue that over-rewards the brain’s pleasure centers is a good strategy for securing vital nutrients. As for beryllium, it deceives us probably because no human being ever encountered pure beryllium until a chemist isolated it in Paris after the French Revolution, so we didn’t have time to evolve a healthy distaste for it. The point is that, at least partially, we’re products of our environment, and however good our brains are at parsing chemical information in a lab or designing chemistry experiments, our senses will draw their own conclusions and find garlic in tellurium and powdered sugar in beryllium.
Taste remains one of our primal pleasures, and we should marvel at its complexity. The primary component of taste, smell, is the only sense that bypasses our logical neural processing and connects directly to the brain’s emotional centers. And as a combination of senses, touch and smell, taste digs deeper into our emotional reservoirs than our other senses do alone. We kiss with our tongues for a reason. It’s just that when it comes to the periodic table, it’s best to keep our mouths shut.
A live body is so complicated, so butterfly-flaps-its-wings-in-Brazil chaotic, that if you inject a random element into your bloodstream or liver or pancreas, there’s almost no telling what will happen. Not even the mind or brain is immune. The highest faculties of human beings—our logic, wisdom, and judgment—are just as vulnerable to deception with elements such as iodine.
Perhaps this shouldn’t be a surprise, since iodine has deception built into its chemical structure. Elements tend to get increasingly heavy across each row from left to right, and Dmitri Mendeleev decreed in the 1860s that increasing atomic weight drives the table’s periodicity, making it a universal law of matter. The problem is that universal laws of nature cannot have exceptions, and one particularly intractable exception, in the bottom right-hand corner of the table, stuck in Mendeleev’s craw. For tellurium and iodine to line up beneath similar elements, tellurium, element fifty-two, must fall to the left of iodine, element fifty-three. But tellurium outweighs iodine, and it kept stubbornly outweighing it no matter how many times Mendeleev fumed at chemists that their weighing equipment must be deceiving them. Facts is facts.
Nowadays this reversal seems a harmless chemical ruse, a humbling joke on Mendeleev. Scientists know of four pair reversals among the ninety-two natural elements today—argon-potassium, cobalt-nickel, iodine-tellurium, and thorium-protactinium—as well as a few among the ultraheavy, man-made elements. But a century after Mendeleev, iodine got caught up in a larger, more insidious deception, like a three-card monte hustler mixed up in a Mafia hit. You see, a rumor persists to this day among the billion people in India that Mahatma Gandhi, that sage of peace, absolutely hated iodine. Gandhi probably detested uranium and plutonium, too, for the bombs they enabled, but according to modern disciples of Gandhi who want to appropriate his powerful legend, he reserved a special locus of hatred in his heart for element fifty-three.
In 1930, Gandhi led the Indian people in the famous Salt March to Dandi, to protest the oppressive British salt tax. Salt was one of the few commodities an endemically poor country such as India could produce on its own. People just gathered seawater, let it evaporate, and sold the dry salt on the street from burlap sacks. The British government’s taxing of salt production at 8.2 percent was tantamount in greed and ridiculousness to charging bedouins for scooping sand or Eskimos for making ice. To protest this, Gandhi and seventy-eight followers left for a 240-mile march on March 12. They picked up more and more people at each village, and by the time the swelling ranks arrived in the coastal town of Dandi on April 6, they formed a train two miles long. Gandhi gathered the throng around him for a rally, and at its climax he scooped up a handful of saline-rich mud and cried, “With this salt I am shaking the foundation of the [British] Empire!” It was the subcontinent’s Boston Tea Party. Gandhi encouraged everyone to make illegal, untaxed salt, and by the time India gained independence seventeen years later, so-called common salt was indeed common in India.
The only problem was that common salt contains little iodine, an ingredient crucial to health. By the early 1900s, Western countries had figured out that adding iodine to the diet is the cheapest and most effective health measure a government can take to prevent birth defects and mental retardation. Starting with Switzerland in 1922, many countries made iodized salt mandatory, since salt is a cheap, easy way to deliver the element. Indian doctors realized that, with India’s iodine-depleted soil and catastrophically high birthrate, they could save millions of children from crippling deformities by iodizing their salt, too.
But even decades after Gandhi’s march to Dandi, salt production was an industry by the people, for the people, and iodized salt, which the West pushed on India, retained a whiff of colonialism. As the health benefits became clearer and India modernized, bans on non-iodized salt did spread among Indian state governments between the 1950s and 1990s, but not without dissent. In 1998, when the Indian federal government forced three holdout states to ban common salt, there was a backlash. Mom-and-pop salt makers protested the added processing costs. Hindu nationalists and Gandhians fulminated against encroaching Western science. Some hypochondriacs even worried, without any foundation, that iodized salt would spread cancer, diabetes, tuberculosis, and, weirdly, “peevishness.” These opponents worked frantically, and just two years later—with the United Nations and every doctor in India gaping in horror—the prime minister repealed the federal ban on common salt. This technically made common salt legal in only three states, but the move was interpreted as de facto approval. Iodized salt consumption plummeted 13 percent nationwide. Birth defects climbed in tandem.
Luckily, the repeal lasted only until 2005, when a new prime minister again banned common salt. But this hardly solves India’s iodine problem. Resentment in Gandhi’s name still makes people seethe. The United Nations, hoping to inculcate a love of iodine in a generation with less of a tie to Gandhi, has encouraged children to smuggle salt from their home kitchens to school. There, they and their teachers play chemistry lab by testing for iodine deficiencies. But it’s been a losing battle. Although it would cost India just a penny per person per year to produce enough iodized salt for its citizens, the costs of transporting salt are high, and half the country—half a billion people—cannot currently get iodized salt regularly. The consequences are grim, even beyond birth defects. A lack of trace iodine causes goiter, an ugly swelling of the thyroid gland in the neck. If the deficiency persists, the thyroid gland shrivels up. Since the thyroid regulates the production and release of hormones, including brain hormones, the body cannot run smoothly without it. People can quickly lose mental faculties and even regress to mental retardation.
English philosopher Bertrand Russell, another prominent twentieth-century pacifist, once used those medicinal facts about iodine to build a case against the existence of immortal souls. “The energy used in thinking seems to have a chemical origin…,” he wrote. “For instance, a deficiency of iodine will turn a clever man into an idiot. Mental phenomena seem to be bound up with material structure.” In other words, iodine made Russell realize that reason and emotions and memories depend on material conditions in the brain. He saw no way to separate the “soul” from the body, and concluded that the rich mental life of human beings, the source of all their glory and much of their woe, is chemistry through and through. We’re the periodic table all the way down.