
HACKING THE BRAIN

As a concrete example of the challenges involved in establishing a mental link between Jake and his avatar, let’s consider the scene in which avatar-Jake captures his great leonopteryx by falling from the sky onto its back. As this is happening, human-Jake is motionless in his tank. And yet Jake senses everything the avatar senses, and commands every aspect of its conscious movements. He feels the impact as the avatar lands on the creature’s back, feels the surge of acceleration as the indignant leonopteryx flies off.

How could you make this work?

To some extent Jake is like a player of a virtual reality (VR) system, with the “game” being Pandora as a whole. A virtual reality system feeds what is not real into our senses, well enough to enable us to believe that it is real—or at least well enough to suspend our disbelief.

And in some respects existing systems do this pretty effectively. A music system is a VR system for the ears, fooling you into imagining there’s a rock band or a symphony orchestra in the room with you. The best modern high-fidelity systems simulate sound in such detail that the ear can’t tell the reproduction from the real thing. For sight, too, watching the movie Avatar itself in 3-D gives you a flavour of what’s possible in delivering a convincing simulation.

So suppose you constructed an “avatar” like a high-tech robot, laden with cameras, microphones and other sensors. Jake meanwhile is in a wraparound suit with earphones, goggles and maybe with sense-stimulating plugs in his nose and mouth. He is in a motion-capture system of the type Quaritch uses to control his AMP suit, with the machine’s motions aping his own body’s gestures—or like the modern Wii game system. As the leonopteryx looms below the falling robot, you could imagine an all but perfect sensory simulation of the experience being relayed to Jake by all the little cameras and microphones and other sensors: he smells the leonopteryx’s leathery stink, an aroma simulated in some miniature chemical factory, and feels the rushing air of Pandora in his face, blown by tiny fans.

But this is a simulation which would end in dismal disappointment as soon as the robot hit the back of the animal with a shuddering crash—and Jake felt nothing of the impact.

Oh, you could provide human-Jake in his tank with some token jolt, like the little bumps you get in a fairground-ride flight simulator. But here we’ve reached the limit of modern VR technology. We don’t know any way to build systems external to the body to simulate the inner sense of the sharp deceleration that ends a fall, or indeed the acceleration that comes with a rocket launch, say. That’s why astronauts train for zero gravity by floating around in tanks of water, or in planes which make powered falls to provide the illusion of zero gravity for a few seconds: “Vomit Comets.”

You can list plenty of other “inner” sensations Jake needs if he is to experience the avatar’s reality fully. He could be made to feel the Pandoran fruit in his hand, he could taste the juice in the avatar’s mouth—but how could he be made to feel hungry, when the avatar is hungry?

External VR systems of the kind we have today won’t be sufficient. Just as we see in the movie, it is necessary to hack into Jake’s brain to make this work.

In the link room we see Jake, preparing to drive his avatar, lie down in a “psionic link unit.” This has an architecture that looks similar to a modern medical scanner, like a magnetic resonance imager. With this, Max Patel and Grace Augustine are able to extract a three-dimensional image of Jake’s brain, complete with ongoing neural activity.

Then a data link is established between Jake’s brain and the avatar’s, as evidenced by similar-looking images in the scans. The techs speak of achieving “congruency,” as the brains are mapped one to the other. In mathematics, congruent triangles are the same shape and size; you could cut them out and overlay them exactly, though you might have to turn one over to do it. The word is also used in psychology to mean internal and external consistency of the mind. Ultimately “phase lock” is established between the two nervous systems.

What is happening is that the technology is hacking into the input-output systems of Jake’s brain. When he’s outside the link unit, Jake’s brain is connected to his body by a set of neural connections. Sensory information comes flowing into the brain through these connections, and Jake’s commands for his body—lift that arm, jump from that banshee—flow out of his brain. What the link technology has to do is hack into this flow of data, and into the similar flow of data in and out of the avatar’s brain. Sensory input coming in from Jake’s own body must be ignored, and replaced with the data flowing from the avatar’s body. Similarly Jake’s motor-control commands must be diverted from his own body, and transmitted to the avatar body. And all this is done “non-invasively,” in the jargon; the scanning machine manages all this without the need to stick wires into Jake’s skull.
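The rerouting described above can be sketched in a few lines of code. This is a pure toy, with every name invented for illustration: a “link” object that feeds the avatar’s senses to the pilot’s brain, and diverts the pilot’s motor commands away from his own body to the avatar’s.

```python
# Toy sketch of the link's rerouting logic. Each body offers a sensory
# output and a motor input; the link cross-wires them so the pilot's brain
# receives the avatar's senses and the avatar's body receives the pilot's
# motor commands. All names here are invented for illustration.

class Body:
    def __init__(self, name):
        self.name = name
        self.motor_log = []      # commands actually delivered to this body

    def sense(self):
        """Produce a packet of sensory data from this body."""
        return f"sensation from {self.name}"

    def actuate(self, command):
        """Deliver a motor command to this body's muscles."""
        self.motor_log.append(command)

class Brain:
    def __init__(self):
        self.last_sensation = None

    def receive(self, sensation):
        self.last_sensation = sensation

    def command(self):
        return "jump"

class Link:
    """Non-invasive 'hack': divert I/O between a brain and a surrogate body."""
    def __init__(self, pilot_brain, pilot_body, avatar_body):
        self.brain = pilot_brain
        self.pilot_body = pilot_body
        self.avatar_body = avatar_body

    def tick(self):
        # Inbound: ignore the pilot body's senses, feed the avatar's instead.
        self.brain.receive(self.avatar_body.sense())
        # Outbound: divert motor commands away from the pilot's own body.
        self.avatar_body.actuate(self.brain.command())

jake = Brain()
tank_body = Body("Jake's body")
avatar = Body("avatar")
Link(jake, tank_body, avatar).tick()
```

After one tick of the link, the brain’s latest sensation came from the avatar, the avatar’s body has received the “jump” command, and the body in the tank has received nothing at all—which is exactly the diversion the movie’s link technology would have to achieve.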

This resolves the problem of inner sensation. It’s as if Jake’s brain has been physically implanted in the avatar’s body. Signals arising from the avatar’s inner proprioceptive senses of falling and then slamming to a halt aboard the leonopteryx are now sent direct to Jake’s brain, so that he “feels” the impact in a way he never could using an external suit.

So that’s the principle. What about the practice? Is this feasible?

Something like the avatar-link process has been studied in the context of “neuroinformatics.” “Mind uploading” is the process of scanning and mapping a biological brain in detail and transferring that data to a computer, or another machine. Clearly this is like half of an avatar link, with a link to a computer store rather than directly to another brain. And it is like the fate of Grace Augustine, when as her human body dies she passes through the “Eye of Eywa,” to become one with the Great Mother—that is, her consciousness is stored in Pandora’s great biological computer. (In this case Eywa was meant to be used as a temporary buffer; Grace’s mind was supposed to return through the Eye of Eywa and then enter her avatar body.)

We have taken some baby steps towards this kind of technology today. In “neuroprosthetics” the nervous system is connected directly to some device. And through a “brain–computer interface” (BCI—a variant is BMI, for brain–machine interface) the brain itself is connected to a computer. Research in the field began in earnest in the 1970s at the University of California, where the term BCI was first coined.

The first neuroprosthetic applications have been medical, with the aim being the repair of damaged human sensory or motor functions. There have been some attempts to use this technology as an alternative way to treat spinal injuries, like Jake Sully’s. A non-profit consortium called the Walk Again Project has a five-year goal to help a quadriplegic paralysed by a spinal injury to walk again; the patient would use neuroprosthetic devices to control an exoskeleton, an interface reading control signals from the brain to pass to the hardware. The current leading BCI technology is called BrainGate, in which an array of microelectrodes is implanted in the brain’s primary motor cortex. In 2008 researchers at the Pittsburgh Medical Center were able to show a monkey operating a robotic arm, with the relevant data being read from the animal’s brain with an invasive implant.

As for writing information to the brain, the most common neuroprosthetic device to date is the cochlear implant, which alleviates deafness by bypassing the damaged ear altogether: an electrode array threaded into the cochlea of the inner ear directly stimulates the auditory nerve, “writing” a signal derived from sound captured by an external microphone into the nervous system. There are also neuroprosthetic devices to restore vision, including retinal implants.
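The core idea of a cochlear implant’s sound processor is to split incoming sound into frequency bands, one per electrode, and derive a stimulation level for each. Here is a much-simplified sketch of that idea, assuming nothing about any real device: a naive filterbank built on a brute-force Fourier transform, with all band edges invented.

```python
import math

# Toy sketch of a cochlear-implant-style filterbank: split a sound into
# frequency bands and derive one stimulation level per electrode channel.
# Real processors are far more sophisticated; this is purely illustrative.

def dft_power(signal, k, n):
    """Power in the k-th DFT bin of an n-sample signal (naive, O(n) per bin)."""
    re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
    im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
    return re * re + im * im

def channel_levels(signal, sample_rate, band_edges_hz):
    """Sum spectral power in each band to give one level per electrode."""
    n = len(signal)
    levels = []
    for lo, hi in zip(band_edges_hz, band_edges_hz[1:]):
        k_lo = int(lo * n / sample_rate)
        k_hi = int(hi * n / sample_rate)
        levels.append(sum(dft_power(signal, k, n) for k in range(k_lo, k_hi)))
    return levels

rate = 8000                                  # samples per second (invented)
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(800)]  # 440 Hz
edges = [100, 300, 600, 1200, 2400]          # four channels (invented edges)
levels = channel_levels(tone, rate, edges)
loudest = levels.index(max(levels))          # 440 Hz lies in the 300-600 Hz band
```

A pure 440 Hz tone lights up only the second channel, just as a single pitch excites one region of a healthy cochlea; the implant’s job is to recreate that pattern electrically.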

To be able to achieve such feats, you have to be able to understand the brain’s coding of the data it uses: how the firing of a particular set of neurons in a particular way is related to a particular movement of the arm, say. But experiments are proceeding worldwide on reading and understanding motor-control signals, and much more subtle signals, involving mental states associated with language, for example. These are still-tentative steps to something like true mind-reading.
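One classic finding of this kind, from recordings in primate motor cortex, is “cosine tuning”: each neuron fires fastest for movements in its own preferred direction, and a firing-rate-weighted sum of preferred directions (a “population vector”) recovers the intended movement. A toy sketch of that decoding, with all rates and tuning numbers invented:

```python
import math

# Toy "population vector" decoder. Each simulated motor neuron fires at a
# rate that peaks when the movement matches its preferred direction
# (cosine tuning). Summing preferred-direction vectors weighted by firing
# rate recovers the intended direction of arm movement.

def firing_rate(preferred, actual, baseline=10.0, gain=8.0):
    """Cosine tuning: rate is highest when actual matches preferred."""
    return baseline + gain * math.cos(actual - preferred)

def decode(preferred_dirs, rates, baseline=10.0):
    """Weighted vector sum of preferred directions gives the decoded angle."""
    x = sum((r - baseline) * math.cos(p) for p, r in zip(preferred_dirs, rates))
    y = sum((r - baseline) * math.sin(p) for p, r in zip(preferred_dirs, rates))
    return math.atan2(y, x)

# Sixteen neurons with preferred directions spread evenly around the circle.
prefs = [2 * math.pi * i / 16 for i in range(16)]
intended = math.radians(75)                  # the arm movement to decode
rates = [firing_rate(p, intended) for p in prefs]
decoded = decode(prefs, rates)               # recovers roughly 75 degrees
```

Real decoders face noisy, drifting signals from a few hundred neurons at best, but this is the essential trick by which BrainGate-style systems turn firing patterns back into movement.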

The U.S. military is interested in this kind of technology; the defence research agency DARPA announced a research programme in March 2010. There are, however, ethical concerns about using such technologies not merely to meet clinical needs but to enhance human abilities beyond their natural limits.

Most of these experiments involve invasive procedures, in which the patient’s head is literally invaded by bits of wire. Jake’s scanning is non-invasive—no wires. Is this possible? We do have non-invasive neuroimaging technologies. Techniques include electroencephalography (EEG), the reading of brain waves (which dates back to the 1920s), along with magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), which are capable of producing three-dimensional maps of the brain’s activity. MEG picks up the minute magnetic fields generated by the electrical currents passing between neurons; fMRI, performed inside a strong magnetic field, tracks the changes in blood oxygenation that accompany neural activity. Resolution is a problem; the skull itself dampens and blurs the signals before they reach external sensors. Progress is being made, though. A company called G.Tec, based in Austria, already has a non-invasive system that allows users to control avatars in Second Life. But non-invasiveness only adds to the technical hurdles involved in hacking into the brain.
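The blurring problem can be illustrated with a toy model, every number in it invented: a single, sharply localised source of activity in the brain, smeared by skull and tissue, shows up at the scalp as a broad bump spread across neighbouring sensors.

```python
# Toy illustration of the resolution problem: one active "source" in the
# brain reads out at the scalp as a blurred bump, because skull and tissue
# smear the signal across neighbouring sensors. All numbers are invented.

def scalp_reading(sources, blur):
    """Each sensor sees a weighted sum of nearby sources (simple 1-D blur)."""
    n = len(sources)
    readings = []
    for i in range(n):
        readings.append(sum(sources[j] * blur ** abs(i - j) for j in range(n)))
    return readings

brain = [0.0] * 9
brain[4] = 1.0                        # one sharply localised active region
scalp = scalp_reading(brain, blur=0.5)
# The peak is still at sensor 4, but activity leaks to its neighbours,
# so two nearby sources would be hard to tell apart from outside the skull.
```

Undoing that smearing—solving the “inverse problem” of locating sources from blurred surface readings—is a central difficulty of all non-invasive brain imaging.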

But even if Jake’s brain is read and written to non-invasively by scanners in the link unit, how is the avatar’s brain accessed? This is the other end of the link, after all, and data must be uploaded and downloaded to it at the same rate as to and from Jake’s brain. In this case the interfacing technology is contained inside the avatar’s brain. As the avatar body is being grown in its tank, the brain is grown with a reception node embedded in its cortex. We haven’t got this far in reality, but there have been experiments with “partially invasive BCIs,” where you lay a thin plastic pad full of sensors within the skull, but outside the brain.

Brain hacking is clearly a tremendous challenge, on which we’ve made barely a start. In the movie, the use of the word “psionic” in the description of the link technology is telling. “Psionics” is generally taken to mean the study of paranormal powers of the mind, such as telepathy, telekinesis, precognition and so forth. It seems to have been coined by science fiction editor John W. Campbell as a fusion of “psi” from psyche, and “onics” from words like electronics, to imply a more scientific framing of the subject. Perhaps we can infer from the use of that word that the science of the twenty-second century has advanced far beyond what is known now; perhaps there are principles at work in the link units of which we have no knowledge.

We can however assume that the link process will be mediated by a computer system vastly more powerful than either Jake’s brain or the avatar’s. The enormous artificial intelligences of the future, as predicted by Moore’s Law, will not be baffled by the computational size of the brain, nor, I would guess, by the challenge of decoding the brain’s many signals. It will be like managing the problem of interfacing an Apple Mac to a Microsoft PC by connecting them both up to that monster Chinese “Milky Way” supercomputer.
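The arithmetic behind that confidence is easy to sketch, though every number in it is a rough, contested ballpark rather than an established fact: estimates of the brain’s processing rate span several orders of magnitude.

```python
import math

# Back-of-envelope Moore's Law arithmetic. All figures are assumptions:
# a commonly quoted ballpark for the brain's processing rate, a rough
# supercomputer-class figure for today, and a two-year doubling period.

brain_ops = 1e16          # assumed brain processing rate, operations/second
machine_ops = 1e15        # assumed present-day machine rate
doubling_years = 2.0      # Moore's-Law-style doubling period

def years_until(target, current, doubling):
    """Years of exponential doubling needed for current to reach target."""
    return doubling * math.log2(target / current)

catch_up = years_until(brain_ops, machine_ops, doubling_years)
thousandfold = years_until(1000 * brain_ops, machine_ops, doubling_years)
```

On these assumptions machines match the brain’s raw rate in under a decade, and outstrip it a thousandfold within about a quarter of a century—comfortably inside the movie’s twenty-second-century timeline, if the doubling can be sustained.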

And if brain hacking does become possible many remarkable applications open up, beyond the driving of avatars. Fully immersive virtual reality, where we started this discussion, would become trivially easy. Roaming around inside the tremendous computer memories of the future, you could have any experience you wanted, real or fantastic, as richly detailed as the real world, and you could run it at whatever speed, relative to real life, you liked: a twelve-year trip to Pandora and back crammed into a morning coffee-break. If you suffer from “Avatar withdrawal” after watching a mere movie, you might never want to come out of a simulation like that at all.

And VR might become so good that you couldn’t tell what is real and what is virtual, like the characters in the movie The Matrix. I’ve suggested myself that one resolution of the Fermi Paradox (see Chapter 26) is that we’re stuck inside a virtual reality suite run by the aliens, to hide the real universe. Oxford-based philosopher Nick Bostrom says that not only is it possible that we’re living in a virtual reality generated by some advanced culture, it is probable that we are—there are always going to be more copies than the original reality, so it’s more likely you’ll find yourself inside a copy than the original…

We’ve come a long way with this speculation, but we haven’t yet got to the bottom of the mystery of Jake’s mind-linking. For he is interfacing with a body quite unlike his own. And that presents yet more fascinating challenges.