BIOHACKERS

INFORMATION AGE

1946 to present

“We have found the secret of life.”

It was February 28, 1953, and the man uttering those words had just burst into the Eagle, a pub near Cambridge University in England. The patrons must have thought they were hearing the ravings of a madman, but they weren’t: “we” was Francis Crick and his collaborator, James Watson—and “the secret of life” was the double-helix structure of DNA.

DNA (deoxyribonucleic acid) is the structural medium of genetic code, the programming language of life. Its existence had been discovered nearly a century prior, and its constituent parts had subsequently been identified. What hadn’t been clear was the precise mechanism by which hereditary information is encoded in the genome and replicated across successive generations. It was this discovery—the double helix—that earned Watson, Crick, and Maurice Wilkins the Nobel Prize in 1962.

DNA has an elegant simplicity. Two complementary strands spiral around each other like dance partners in perfect synchronization. The strands are bonded in such a way that they can be easily separated and replicated. Just four repeating molecules—adenine (A), guanine (G), thymine (T), cytosine (C)—form the basis of the code itself, each bonding with only one of the others (A with T, G with C). These base pairs form various functional units (codons, genes), which govern everything from protein synthesis to hormone signaling. Its beauty is undeniable: at the heart of all complex life, from mosquitoes to mankind, sits the same simple code.
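The pairing rule is simple enough to state in a few lines of code. Here is a minimal illustrative sketch in Python (a toy example, not a bioinformatics tool), showing how one strand fully determines its partner:

```python
# Illustrative sketch: Watson-Crick pairing as a lookup table.
PAIRING = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Return the complementary partner strand (the reverse complement)."""
    return "".join(PAIRING[base] for base in reversed(strand))

print(complementary_strand("ATGCGT"))  # -> "ACGCAT"
```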

But before the genome could be decoded it had to be sequenced. In 1977 researchers sequenced the entire DNA-based genome of a living organism for the first time: phi X 174, a tiny bacteriophage whose genome contains only 5,386 base pairs. Over the decades that followed, laboratory techniques advanced alongside the computing power and software required to store and analyze genomic data. At roughly 3.3 billion base pairs, the human genome is more than 600,000 times larger than that of phi X 174. When two teams of scientists announced maps of the human genome in 2000, they were years ahead of schedule. The effort would have been impossible without the technology that came to define the Information Age: the digital computer and the exponential growth of its power.
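The scale difference quoted above is easy to check with back-of-the-envelope arithmetic; here is a quick sketch using the same round figures:

```python
# Rough comparison of the two genome sizes mentioned above.
phi_x_174_bases = 5_386            # phi X 174 genome
human_bases     = 3_300_000_000    # human genome, roughly 3.3 billion base pairs

print(human_bases / phi_x_174_bases)   # ~612,700, i.e. more than 600,000 times larger
```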

That advances in genomics depended on advances in computing was not lost on people in either field—nor were other similarities between biology and computers. Organisms are built on a base of genetic code, while computer programs are ultimately based on binary code. Both types of code are digital, encoding information in discrete bits (ATGC; 1s and 0s). Biological viruses infect living organisms, while computer viruses “infect” electronic devices. As Bill Gates observed, “Human DNA is like a computer program but far, far more advanced than any software ever created.” The rise of computers led to a profound idea: biology is an information technology. In fact, biology was the original information technology, humming along for eons before computers ever came around.
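The analogy can be made literal: with only four letters, each base carries two bits of information, so any DNA sequence maps directly onto binary. A small illustrative sketch (the particular bit assignments are arbitrary, chosen here just to make the point):

```python
# Illustrative only: four bases fit naturally into two bits apiece.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def to_binary(sequence: str) -> str:
    """Encode a DNA sequence as a string of 0s and 1s, two bits per base."""
    return "".join(BASE_TO_BITS[base] for base in sequence)

print(to_binary("GATTACA"))  # -> "10001111000100"
```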

But what did that actually mean? To most people, genetic code was as foreign as software code. However, there was one group of people who thrived in this new cyber-habitat, a new breed of net denizens who finally felt at home in the Information Age: hackers.

HACKERS HAVE an image problem, particularly with nontechnical people. A “hacker” may summon the image of a rogue programmer; a “hack” may refer to an unwieldy fix (a kluge); and “hacking” sure sounds illegal. However, the true and original meanings were quite different. The term “hacker” originated at the Massachusetts Institute of Technology in the mid-twentieth century. Initially the word “hack” referred to the infamous pranks pulled by students, such as putting a fire truck on top of the MIT dome. As tech journalist Steven Levy writes in Hackers, “To qualify as a hack, the feat must be imbued with innovation, style, and technical virtuosity,” and this ethos carried over to computer hackers in the 1950s and ’60s. At MIT, hackers were respected as hands-on virtuosos—even if they pulled all-nighters, slept through class, and received poor grades. The students who always went to class, never left the library, and got straight As? They were called “tools.”

Early hacker culture took root at a few large universities and corporations (MIT, Xerox PARC), but it blossomed with the release of the personal computer and the rise of the Internet. In 1986 an online magazine for hackers called Phrack published The Hacker Manifesto by The Mentor. The piece famously describes the feeling when a “damn underachiever” discovers computers: “a door opened to a world … rushing through the phone line like heroin through an addict’s veins, an electronic pulse is sent out, a refuge from the day-to-day incompetencies is sought … a board is found. ‘This is it … this is where I belong.’ ”

From the beautiful minds of these misfits emerged the new ethos of the Information Age. This hacker philosophy favored hands-on learning over book smarts, trial-and-error over theorizing, and speedy solutions that were “good enough” over the endless pursuit of perfection; it prized resourcefulness, simplicity, decentralization, and openness.

The starting point for a hacker is acknowledging his own ignorance: How does it work? “It” might be anything from a ham radio to word-processing software. Rather than looking for the answer in an instruction manual or a classroom (like the tools at MIT), hackers get their hands dirty through what they describe as the Hands-On Imperative. This approach is also called learning by doing, do-it-yourself (DIY), self-experimentation (n=1), and trial-and-error. Hackers don’t try to avoid failure; they embrace it. Facebook founder Mark Zuckerberg adopted a company slogan rooted in hacker culture: “Move fast and break things.” The faster you fail, the faster you learn from your failures—and the sooner you succeed.

One benefit of speed is that it preempts the pursuit of perfection. Hackers live by the proverb, “Perfect is the enemy of the good.” During all-night “hackathons”—whether alone in a dorm room or at Facebook HQ—perfection is impossible, so it’s pointless to try. The same idea is captured by the 80/20 rule, or Pareto principle: 20% of the input produces 80% of the outcome, and chasing 100% perfection is a waste of time and resources.

Time pressure also forces hackers to repurpose existing things to completely new uses. One of the most famous examples of hacking took place on Apollo 13, when endangered astronauts had to jerry-rig a square carbon-dioxide scrubber to fit into a round container. In fact, repurposing existing inventions to new uses is a recurring theme throughout history. Coca-Cola was originally concocted as a medicine, Play-Doh was devised as wallpaper cleaner, and Viagra was developed to treat hypertension. Many major scientific breakthroughs were either discovered accidentally by hands-on experimenters (penicillin) or were first demonstrated through self-experimentation (the bacterial cause of ulcers).

Hackers also place aesthetic value on simplicity and elegance. The early hackers at MIT had to share computer time, so there was an incentive to write code as succinctly as possible. Today “Keep It Simple, Stupid” remains a mantra among programmers. Simple isn’t just efficient; it’s also beautiful.

Hackers aren’t fond of authority figures, particularly tools who “earned” their authority while safely within the confines of academia, big business, or government. Two hacker principles speak to their fondness for open systems: “Mistrust authority—promote decentralization” and “Information wants to be free.” Open-source systems (Linux) stand in contrast to closed, proprietary systems governed by “authorities” (such as Microsoft Windows or Mac OS). Programmer Eric Raymond described these two approaches as “the Cathedral” (top-down, centralized authority enforcing a closed system) versus “the Bazaar” (bottom-up, decentralized equals collaborating in an open system).

These hacker principles aren’t just useful for understanding computers; they’re also applicable to understanding just about any complex system. Noted programmer, essayist, and venture capitalist Paul Graham likes to invest in “world hackers … who not only understand how to mess with computers, but mess with everything.” Kevin Kelly, founding editor of Wired, is also co-creator of Quantified Self, a movement of (mostly) techies who are applying the principles of hacking to improve their own personal health. Appropriately, they are referred to as biohackers.

If there are two things hackers love, they’re gadgets and data—and biohackers are no different. An increasing number of devices allow people to collect data about themselves: blood sugar levels, the number of steps taken each day, and sleep cycles. It won’t be long before checking blood work will only require a relatively inexpensive device that plugs into a smartphone, not a visit to the doctor’s office. The cost of sequencing the genome continues to drop, and soon it will be as unremarkable as taking a fingerprint. The marriage of big data and human health will give birth to personalized medicine, gene therapy, and countless treatments yet unknown.

But one doesn’t have to love gadgets or data in order to be a biohacker. The only requirement is taking responsibility for one’s own health. There’s no one else who can make the daily decisions—eating well, exercising regularly—that deliver lifelong health. In other words, being healthy is a “do-it-yourself” project. Seen in that light, we are all biohackers. Yet too many people entrust their day-to-day decisions to authority figures—the tools, as it were—on the assumption that the experts actually know what they’re talking about. In contrast, biohackers begin by acknowledging their own ignorance: How does the body work? The simple truth is that no one has a very precise idea. Not doctors, not molecular biologists, and not the average Joe.

Rather than looking for the answer in a scientific journal, molecular biology textbook, or classroom (like tools would), biohackers get their hands dirty. They, too, follow the Hands-On Imperative. Biohackers experiment on themselves: trying new foods, removing others, and tracking how their body responds. This trial-and-error approach has a number of virtues: it’s fast and cheap; the results are customized to the individual and his or her circumstances; and it doesn’t require a PhD in molecular biology.

Biohackers also understand that “Perfect is the enemy of the good.” Nobel laureate Max Planck pointed out the slow and halting progression of science, giving rise to the adage, “Science advances one funeral at a time.” Waiting for scientists to reach a consensus is waiting for your own funeral. There’s no such thing as perfect—no perfect diet, no perfect exercise, no perfect lifestyle. Unconcerned with perfection, biohackers adopt smart rules of thumb that stand a decent chance of being more or less right (the 80/20 rule).

Here’s how a smart biohacker would quickly get a handle on an aspect of health—say, diet. She would begin by looking at how diets vary across species (Animal). Then, she would learn about the human diet over the course of our formative years as hunter-gatherers (Paleolithic). Next, she would take into account cultural rules and possible recent adaptations among herder-farmers (Agricultural). Then, she would learn from the mistakes of explorers and producer-consumers eating industrial diets (Industrial). Finally, she would use self-experimentation to devise customized solutions that work for her (Information). She would be unconcerned with temporary failures—there’s no such thing as perfection—and eventually stumble on long-term success: a diet that allows her to thrive.

Nothing could be more appropriate than applying the hacker philosophy to biology. After all, that’s how evolution by natural selection works: “amateurish” trial-and-error, repurposing existing bits to new uses, and acceptance of “good enough” solutions. Nobel laureate Max Delbrück observed, “Any living cell carries with it the experiences of a billion years of experimentation by its ancestors.” The most brilliantly “engineered” organisms are actually the result of trial-and-error by countless generations of organisms. Another Nobel laureate, geneticist François Jacob, made the point that “Nature is a tinkerer, not an engineer.” In other words, nature is a hacker, the best there ever was. Trial-and-error, self-experimentation, tinkering, and hacking do not just contribute to the progress of science; they lie at the very core of evolution itself.

IF THINKING like a hacker is useful, then it may also be useful to think of the human body as an information technology—and it may be possible to borrow wisdom from the world of computer programming to make smarter health decisions.

There’s a saying among software developers: “It’s not a bug, it’s a feature.” It’s said when a user thinks the software is malfunctioning but it’s actually working as designed. A nontechnical example is when airlines overbook flights, selling more tickets than there are seats on the plane. The first time a person gets bumped, he usually thinks there must be some mistake—that is, a bug. But airlines overbook on purpose because they know that, on average, a few people miss each flight, which would otherwise mean empty seats and forfeited revenue. Overbooking isn’t a bug; it’s a feature.
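The airline’s reasoning can be sketched as a toy expected-value calculation; the numbers below are invented for illustration (real carriers use far more detailed models):

```python
# Toy overbooking model with made-up numbers.
# If each ticketed passenger independently shows up with probability p,
# the expected number of show-ups is tickets_sold * p.
seats        = 180
tickets_sold = 190     # more tickets than seats, on purpose
p_show       = 0.92    # assumed show-up rate

expected_showups = round(tickets_sold * p_show, 1)
print(f"{expected_showups} expected show-ups for {seats} seats")  # 174.8 expected show-ups for 180 seats
```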

Similarly, many people view the human body as “buggy.” But the human body is stunningly sophisticated, and more often than not we simply don’t understand it or misuse its features. For example, morning sickness has often been dismissed as an unfortunate side effect of pregnancy. But morning sickness has a biological function: it makes a mother averse to unfamiliar and strong-tasting foods that are (or were) more likely to contain pathogens or toxins. Morning sickness is not a bug, but a feature of the human body.

Software developers have another phrase: “garbage in, garbage out”—or GIGO, for short. GIGO refers to the way computers unquestioningly accept bad inputs (“garbage in”), process them, and spit out an incorrect “answer” (“garbage out”). If a fat-fingered accountant enters the wrong numbers into accounting software, the software isn’t going to magically spit out an accurate set of books. Yet many people place too much faith in the ability of computers to correct for user error.
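GIGO is easy to demonstrate. In the toy example below (with hypothetical ledger figures), the arithmetic is flawless, yet one mistyped entry makes the total worthless:

```python
# Garbage in, garbage out: the computer sums faithfully either way.
correct_entries = [1200, 350, 88]      # hypothetical ledger amounts, in dollars
typo_entries    = [1200, 3500, 88]     # the same books, with one fat-fingered extra zero

print(sum(correct_entries))   # 1638, the true total
print(sum(typo_entries))      # 4788, accurately computed garbage
```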

If the lesson of “It’s not a bug, it’s a feature” is to trust our evolved biology, the lesson of “garbage in, garbage out” is not to trust it too much. There are limits to computer technology, and there are also limits to our own biological technology. Send the wrong inputs to the body, and it will start to malfunction. Send the right inputs to the body, and everything works smoothly again. For example, many people struggle to get to sleep at night, relying on sleeping pills, alcohol, or other sedatives; they then struggle to wake up in the morning, relying on coffee, sugar, and other stimulants. This is often due to circadian rhythm dysregulation caused, in part, by exposure to bright indoor lights at night (a signal that it’s daytime) and waking up in a dark room (a signal that it’s still nighttime). It’s no wonder that the body becomes confused. Garbage in, garbage out.

Conceiving of the human body as an information technology is a recent phenomenon. Throughout history most people used other metaphors to try to understand how the body worked, many of them based on the technology of the age: fluid dynamics, hydraulic pumps, engines, and energy. The metaphors are worth understanding since many of them still creep into modern debates over health.

A fluid-based philosophy of health, humorism, was popular among Greek and Roman physicians and persisted all the way into the nineteenth century. Humorists (no pun intended) believed that the body contained four fluids, or humors: blood, yellow bile, black bile, and phlegm. When the humors were out of balance, the result was disease. Humorism led to an unhealthy obsession with bodily fluids. Its prescriptions—bloodletting, purges, vomiting, and cupping—often had disastrous consequences. As George Washington fought the infection that would take his life, his doctors bled him of several pints of blood—most likely contributing to his demise.

Fluid-based metaphors for life underwent a renewal with the advent of industrial hydraulic technology: pumps and pressure. In psychology, no less a figure than Sigmund Freud popularized a fluid-based, hydraulic metaphor for the mind. As cognitive psychologist Steven Pinker writes in How the Mind Works, “The hydraulic model, with its psychic pressure building up, bursting out, or being diverted through alternative channels, lay at the center of Freud’s theory and can be found in dozens of everyday metaphors: anger welling up, letting off steam, exploding under the pressure, blowing one’s stack, venting one’s feelings, bottling up rage.” As it turns out, thoughts and emotions aren’t actually determined by hydraulic pressures in the brain. The brain can’t be understood without reference to the information content of cognition, and cognition can’t be fully understood without reference to the challenges humans regularly faced over the course of our evolution.

Combustion and energy are two additional metaphors borrowed from physics. Human metabolism has been variously described as a fire, furnace, or factory that produces energy. These days everyone wants more energy, as demonstrated by the popularity of energy drinks and countless energy bars. It’s also common to hear people cite the first law of thermodynamics—“Energy cannot be created or destroyed, simply transferred or transformed”—in support of the notion that “calories in” must equal “calories out.” While the laws of physics do apply to biological processes, biology is not physics, and metabolism isn’t actually a simple mechanical system. Bacon is not grass, and grass is not gasoline; humans are not cows, and cows are not cars. Energy is real, of course, and vitally important to life. But the average person already carries around enormous stores of energy in the form of body fat. People don’t actually want more energy; what they want is to feel energetic. Two of the most well-known factors that influence feeling energetic have nothing to do with energy intake: the perception of a serious threat causes the release of adrenaline, and morning sunlight causes us to wake up.

Of course, the human organism does contain fluids (blood), circulates them with a hydraulic pump (the heart), uses energy (food), and obeys the laws of physics (calories), but all of these systems are influenced by information. In a sense, information must be essential to life. Any system as improbably complex and self-sustaining as a human body could persist only if it were governed by intricate internal feedback loops that preserve functionality in the face of the relentless forces of decay: parasites, pathogens, predators, human enemies, and heat loss.

So when Watson and Crick exclaimed that they had found the secret of life, they weren’t just talking about some obscure aspect of molecular biology. The citation accompanying their Nobel Prize captured the true significance of their work: “for their discoveries concerning the molecular structure of nucleic acids and its significance for information transfer in living material [emphasis added].” Their discovery unlocked the source code that lies at the heart of heredity, how so much information about the world gets transferred from one generation to the next.

That’s why it’s so important to understand the path our species has trodden: primates, hunter-gatherers, herder-farmers, and industrial producer-consumers. And now, biohackers.