Appendix A
INFORMATION PROCESSING IN NEURONS

It is often said that the brain is an “information-processing device,” that it takes information in from the world, transforms it into an internal representation, and takes actions that change the world around it. Even if the brain does process information, the brain is no more a computer like the computer on your desk than the swan we saw in Chapter 2 is an airplane. In this appendix, I will provide a starting discussion of cells, neurons, and the information processing that occurs within them.

First of all, the brain is made of cells. Each neuron has all of the biological material needed for normal cellular function. Each individual neuron in the brain has a nucleus and shares the same DNA blueprint with every other cell in your body. Each neuron has all of the machinery to transcribe that DNA blueprint into RNA and then translate that RNA into proteins. Second, a neuron is an information-processing device. It has a tremendous amount of intracellular (and extracellular) machinery that reacts to changes in such a way as to compute information.

Information

What is information? In the modern world, we encounter information processing every day.A The mathematics of information that enabled the computer revolution (sometimes called the information revolution) was worked out in the 1940s by Claude Shannon,1 an engineer working for Bell Labs,B building on work he did in cryptography,C both on his own and in collaboration with Alan Turing during World War II.2

Mathematically, Shannon realized that “information” was about separating possible codes into groups.D Shannon characterized this as answering yes/no questions. Each answer to a yes/no question equals one bit of information. In the modern world, we encounter this definition of information every day. Computer memory and hard drive capacity are generally measured in eight-bit units called “bytes” (as in megabytes [one million bytes] and gigabytes [one billion bytes]). A byte consists of eight small physical things that can each be in one of two states (on/off, yes/no). On a hard drive, that small physical thing is a tiny patch of magnetic material that can be flipped back and forth; in the memory of your computer or in a USB flash drive, it is a transistor that either holds or does not hold a trapped electric charge. On a CD, that small physical thing is a minuscule pit pressed into the plastic that changes how the laser light is reflected.
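The arithmetic of bits can be made concrete with a short sketch (nothing here comes from the text beyond the definitions): k yes/no answers can distinguish at most 2^k possibilities, so picking out one item from among N takes log2(N) bits.

```python
import math

def questions_needed(n_items):
    """Smallest number of yes/no questions guaranteed to identify
    one item out of n_items possibilities."""
    return math.ceil(math.log2(n_items))

print(questions_needed(256))        # one byte separates 256 states -> 8
print(questions_needed(1_000_000))  # twenty questions cover a million items -> 20
```

This is why the game is called Twenty Questions: twenty well-chosen yes/no answers are enough, in principle, to single out one possibility from more than a million.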

Shannon realized how to use probability theory to mathematically define less than one bit of information. The game Twenty Questions is based on this concept. If birds fly and mammals walk, then answering the question Does it fly? provides information about whether the person is thinking of a bird or a mammal. Of course, some birds don’t fly (think penguins) and some mammals do (think bats). Since Does it fly? mostly, but not completely, separates birds and mammals, the answer to Does it fly? provides less than one bit of information about whether the person is thinking of a bird or a mammal. Decision-making is about changing our actions based on the information available to us in the world.
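Shannon’s “less than one bit” can be computed directly. The sketch below uses my own illustrative probabilities (not measurements): assuming 95% of birds fly and 5% of mammals do, it measures how much the answer to Does it fly? reduces our uncertainty about bird versus mammal.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative assumptions: bird or mammal with equal probability;
# most birds fly (not penguins), most mammals don't (except bats).
p_bird = 0.5
p_fly_given_bird = 0.95
p_fly_given_mammal = 0.05

# Overall probability of hearing "yes, it flies"
p_fly = p_bird * p_fly_given_bird + (1 - p_bird) * p_fly_given_mammal

# Prior uncertainty about bird vs. mammal: exactly one bit.
h_prior = entropy([p_bird, 1 - p_bird])

# Average uncertainty remaining after hearing the answer (Bayes' rule).
p_bird_given_fly = p_bird * p_fly_given_bird / p_fly
p_bird_given_no = p_bird * (1 - p_fly_given_bird) / (1 - p_fly)
h_post = (p_fly * entropy([p_bird_given_fly, 1 - p_bird_given_fly])
          + (1 - p_fly) * entropy([p_bird_given_no, 1 - p_bird_given_no]))

info = h_prior - h_post  # information carried by the answer, in bits
print(round(info, 2))    # -> 0.71
```

Because the question mostly, but not completely, separates the two groups, its answer carries about 0.71 bits rather than a full bit; if birds always flew and mammals never did, the same calculation would yield exactly one bit.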

Neurons

Neurons (like all cells) maintain a voltage difference between the intracellular and extracellular space. There are more negatively charged ions inside the cell than outside, which creates a voltage difference between the inside and the outside of the cell. In effect, a neuron is like a battery, maintaining a voltage across the cellular membrane. This battery is maintained through small holes in the membrane called ion channels.E These ion channels are made up of intricately shaped proteins that form mechanical devices that allow certain ions to flow through, while restricting the flow of other ions. Proteins are made up of long chains of amino acids built following the blueprint in the DNA, which is transcribed into RNA and then translated into protein.

These ion channels are tiny mechanical machines. While some of them are constitutively open, always allowing their specific ions to flow through, others can open and close. For example, one channel type, the voltage-gated potassium channel, literally has a ball of amino acids on a chain of amino acids attached to its base. When the voltage in the cell changes, the chain changes shape and the ball pops into the channel, blocking it. As another example, the voltage-gated sodium channel twists like an iris so as to open or close depending on the membrane voltage. There are actually many subtypes of voltage-gated potassium and voltage-gated sodium channels, each with slightly different properties. These properties change the dynamics of the neuron to make it do the information processing it does. The step from knowing which ion channels are expressed where on a neuron to understanding how that neuron processes information is a very active area of research in neuroscience today. Some of these steps are known; some are still being investigated.

Information processing in neurons

The key to information processing in the brain is a phenomenon called the action potential or (more colloquially) the “spike.” A spike is a short event in which specific channels open, allowing positive ions to rush in and change the cross-membrane voltage; then the first set of channels begins to close and a second set opens, bringing the voltage back down. This entire process takes a few thousandths of a second (a few milliseconds). A cell integrates changes in its cross-membrane voltage and fires a spike when that voltage gets large enough (crosses a threshold).3
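The integrate-to-threshold idea can be sketched with a “leaky integrate-and-fire” model, a standard textbook abstraction rather than anything specific to this book; the constants below are illustrative, not measured values.

```python
# Leaky integrate-and-fire sketch: the membrane voltage integrates its
# input, leaks back toward rest, and emits a spike (then resets) when
# it crosses threshold. All constants are illustrative textbook values.
V_REST, V_THRESH, V_RESET = -70.0, -55.0, -70.0  # millivolts
TAU_MS = 10.0  # membrane time constant: how fast the leak pulls back
DT_MS = 1.0    # simulation step: one millisecond

def simulate(input_mv, v=V_REST):
    """Return the step indices (in ms) at which the cell spikes,
    given a list of per-step voltage nudges from its inputs."""
    spikes = []
    for t, drive in enumerate(input_mv):
        # Leak toward rest, plus the drive from incoming input.
        v += DT_MS * ((V_REST - v) / TAU_MS) + drive
        if v >= V_THRESH:      # threshold crossed: fire a spike...
            spikes.append(t)
            v = V_RESET        # ...and reset the membrane
    return spikes

# A steady 2 mV-per-step drive makes the cell fire rhythmically;
# no drive leaves it silently at rest.
print(simulate([2.0] * 50))
print(simulate([0.0] * 50))
```

The qualitative behavior matches the description above: sufficient input pushes the voltage past threshold and produces a spike a few milliseconds wide, after which the cell must charge back up before it can fire again.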

Following Shannon’s definitions, information can be measured by comparing the group of situations in which a cell (or a set of cells) spikes to the group of situations in which the cell (or set of cells) is silent.4 Remember that Shannon defined information as separations between groups: spiking or not spiking is a yes/no question about something and carries information about that something, whatever it is. To answer how neuroscientists learn what that something is, we will have to wait until Appendix B, where we will turn to issues of measuring what cells are tuned to.

How do cells get information to the next cell? This brings us to the concept of a synapse, which connects one cell to another. A spike in one cell releases a chemical (called a neurotransmitter) across the very small (20 nanometers or 20 billionths of a meter) gap between the cells and changes the voltage of the other cell. Some synapses are excitatory and make the second cell more likely to fire spikes, while other synapses are inhibitory and make the second cell less likely to fire spikes.
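The push and pull of excitatory and inhibitory synapses can be caricatured in a few lines (the weights and threshold are my own illustrative numbers): each presynaptic spike nudges the postsynaptic cell by its synaptic weight, positive for excitatory synapses and negative for inhibitory ones, and the cell fires if the summed nudges cross threshold.

```python
THRESHOLD = 1.0  # illustrative firing threshold

def fires(weights, presynaptic_spikes):
    """weights[i] is the strength of synapse i (positive = excitatory,
    negative = inhibitory); presynaptic_spikes[i] is True if cell i
    fired in this time window."""
    drive = sum(w for w, spiked in zip(weights, presynaptic_spikes) if spiked)
    return drive >= THRESHOLD

weights = [0.6, 0.6, -0.5]  # two excitatory synapses, one inhibitory

print(fires(weights, [True, True, False]))  # 1.2 >= 1.0 -> True
print(fires(weights, [True, True, True]))   # 0.7 <  1.0 -> False
```

Note how the inhibitory synapse vetoes a spike that the two excitatory inputs alone would have produced; this is the simplest version of the excitation/inhibition balance discussed below.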

Of course, whenever we’re talking about biological systems, things are never so simple. There are actually lots of different neurotransmitters and lots of different synaptic receptors to detect them. There are even synaptic receptors on the presynaptic neuron, which allow a cell to sense its own neurotransmitter release. There are also synaptic receptors, called G-protein-coupled receptors, that, instead of allowing ions to flow in, release an attached protein complex, which then interacts with other proteins inside the cell. This engenders an extremely complex flow of intracellular information processing. In general, we won’t need to go into detail about protein cascades in this book, but they are an important part of cellular processing, particularly in how cells change during learning.

To a first approximation, neurons take information in through their dendrites, perform computation on it in the dendrites and the soma, and then send information out through the axon, primarily by deciding when and whether to fire an action potential (spike), which travels down the axon and releases a chemical across a synapse connecting to other neurons in the brain. As with all first approximations, this description is often wrong in detail, even if correct in general. Some neurons perform elaborate calculations within their dendrites; some neurons connect through dendrite-to-dendrite synapses; some neurons do not fire action potentials; some neurons connect to other neurons with electrical connections rather than chemical synapses; etc. But the starting description of a neuron with a dendrite, a soma, and an axon, connected to other neurons via synapses, is good enough for now.

You can imagine each neuron poised between excitatory inputs encouraging it to fire and inhibitory inputs suppressing it. The brain spends its time poised in a dynamic balance between choices, like a baseball player up on the balls of his feet, ready to jump forward to steal second base or ready to dive back to first. It does this for exactly the same reason that the base runner does: so it can react quickly.

Current theories suggest that many brain dysfunctions are conditions in which something has fallen out of balance. Epilepsy, for example, is almost certainly a consequence of an instability in the balance between excitation (encouraging neurons to fire spikes) and inhibition (holding them back).5 What is interesting about the nature of dynamic balance (particularly because of its dependence on feedback) is that you can get the wildfire spiking of epilepsy from either an instability in excitation or an instability in inhibition. Other disorders, such as schizophrenia, are also now being investigated as dynamic balances gone unstable.6

This balance exists at all levels, from the systems level, where multiple decision-making systems compete for a decision, down to the neural level, where both excitatory and inhibitory inputs balance to keep neurons firing at just the right rate, to the subcellular level, where learning and memory reside.

Memory

Memory can be defined as changes that reflect a history, so that future responses differ because of it. Just as anything that allows us to differentiate two signals carries information, anything that reliably changes in response to a past can be thought of as a memory of that past. The dent in the car fender “remembers” that it was hit. The fender will be weaker and may respond differently the next time. One of the tenets of this book is that all of the psychological and mental phenomena we observe have physical instantiations.

The physical instantiation of memory resides in changes in the synapses that connect one neuron to another and in the internal protein cascades within each neuron. The primary changes in the nervous system during learning occur in the synapses—how much of an effect the presynaptic cell has on the postsynaptic cell when the presynaptic cell fires its spike. These changes are now known to exist throughout the brain, in the hippocampus, cortex, cerebellum, basal ganglia, and even the spinal cord.7 We now know that these connection strengths depend on the internal milieu of the protein cascades within the cell and on the timing of the presynaptic and postsynaptic spikes.8
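The dependence of synaptic change on spike timing is often summarized as spike-timing-dependent plasticity (STDP), a standard model in the field; a minimal sketch of its core rule follows, with illustrative constants of my own choosing.

```python
import math

# Spike-timing-dependent plasticity (STDP) sketch: a synapse
# strengthens when the presynaptic spike precedes the postsynaptic
# spike ("pre predicts post") and weakens when the order is reversed,
# with an effect that fades as the spikes grow farther apart in time.
A_PLUS, A_MINUS = 0.05, 0.05  # maximum fractional weight change
TAU_MS = 20.0                 # width of the timing window

def weight_change(pre_ms, post_ms):
    """Fractional change in synaptic weight for one pre/post spike pair."""
    dt = post_ms - pre_ms
    if dt > 0:   # pre fired before post: strengthen the synapse
        return A_PLUS * math.exp(-dt / TAU_MS)
    else:        # post fired before (or with) pre: weaken it
        return -A_MINUS * math.exp(dt / TAU_MS)

print(weight_change(10.0, 15.0) > 0)  # pre then post -> potentiation: True
print(weight_change(15.0, 10.0) < 0)  # post then pre -> depression: True
```

The exponential factor captures the experimental observation that spike pairs only a few milliseconds apart change the synapse far more than pairs separated by tens of milliseconds.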

A journey from sensory to motor

It might be useful to take a journey from a sensory receptor into the brain. As we saw in our thermostat example at the beginning of the book, a decision process requires measuring the world, processing the information hidden within those measurements, and then acting on that information. How does the neural system perceive the world? The first step is that there must be sensory neurons—neurons that fire action potentials in response to sensory stimuli.F Some cells, like the rods and cones in the retina, react through protein cascades. Rods and cones contain a light-sensitive protein called an opsin; when photons hit these molecules, they change their shape, which sets off a cascade of other protein changes, the last of which opens an ion channel and changes the voltage in the cell. In other cells, the transducing protein is itself an ion channel. For example, hair cells in the ear contain ion channels with little springs made of protein in them. As the sound wave moves the hairs on the cell, the hairs pull on the protein springs, stretching the ion channel open and allowing ions to flow in, changing the voltage of the cell. So, literally, these cells have sensors that translate, or transduce, the physical reality of the world into the inner information-world of our brains.

On the other end, motor neurons send axons to muscles. Muscles are excitable cells just like neurons and can also fire action potentials (spikes). Because the muscle fibers all line up in ordered structures, the firing of many action potentials in muscle tissue can be detected electrically quite easily. These muscle-based action potentials are what is being detected in an electrocardiogram (EKG [the K comes from the German Kardio, meaning “heart”]). In other muscle groups, these same signals are generally detected with electromyography (EMG [myo comes from the Greek word for muscle]). As you move any muscle in your body, the muscles are firing action potentials. The action potentials open calcium channels in the muscle cell, and the calcium is then translated back into physical force by contractile proteins (actin and myosin) that are activated in the presence of calcium.

The neural network that is the brain can take information in from the world through its sensory systems and can act on the world through its motor systems. This book is about the steps in the middle, how a network of cells, in constant dynamic flux, processes the information therein to take the right actions in the right situations.

Books and papers for further reading

• Eric Kandel (2006). In Search of Memory: The Emergence of a New Science of Mind. New York: Norton.

• Joseph E. LeDoux (2002). The Synaptic Self. London, UK: Penguin.

• Peter Dayan and Larry F. Abbott (2001). Theoretical Neuroscience. Cambridge, MA: MIT Press.

The details of how neurons themselves work are given in the first few chapters of many neuroscience textbooks, many of which are actually easy to read for those who are not scared to dive in.

• Dale Purves et al. (2008). Neuroscience. Sunderland, MA: Sinauer Associates.

• Neil R. Carlson (2010). Physiology of Behavior. Boston: Pearson.

• Eric Kandel, James H. Schwartz, and Thomas M. Jessell (2000). Principles of Neural Science. New York: Elsevier.