32 Brain–machine interfaces

Forget keyboards, computer mice, gesture-based computing or synthesized speech—we will soon have brain-to-machine interfaces that link the human brain directly to various external devices. So, in the future, we’ll still ask computer technicians for an upgrade, but it will be for us, not our computers.

Brain–machine interfaces—or brain–computer interfaces—have existed in research laboratories for a while, especially in labs working to create computer interfaces for people unable to use their hands or other parts of their bodies. In 2009 direct brain-to-machine control surfaced in, of all places, toy shops, with toys such as the Star Wars Force Trainer and Mindflex. So how much longer are you going to have to wait until the problem of finding the lost TV remote is solved by direct brain-to-television channel-changing headgear? The answer is possibly not very long. Apple is already rumored to be working on a smart TV that doesn’t need a remote because you just talk to the television when you need it to do something.

We don’t know what the limits are yet.
Melody Moore Jackson, Georgia Tech’s BrainLab

Assistive technologies have been around for a long time. Wheelchairs are an early example, but in the future they will probably be controlled by thought as well as by joysticks. Or what if people with disabilities could have electrodes implanted, or wear a skullcap, connecting their brain directly to an artificial limb? Both more or less exist already, as do various brain–machine interfaces to treat neurological disorders of aging, including serious diseases such as Alzheimer’s and Parkinson’s. Sometimes this technology uses noninvasive electrodes placed on the scalp of a patient; in other instances electrodes are implanted directly inside the patient’s brain. The advantage of implants is the greater accuracy of the commands they pick up.
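To make the noninvasive approach a little more concrete, here is a minimal sketch, assuming a scalp-electrode headset and a synthetic signal, of how a system might turn a drop in the brain’s 8–12 Hz “mu” rhythm (a well-known signature of imagined movement) into a single yes/no command. The sampling rate, threshold and test signal are all illustrative, not taken from any particular device.

```python
# Illustrative sketch only: turning raw scalp EEG into a yes/no command.
# The signal below is synthetic and the threshold is invented for demonstration;
# real systems use calibrated, user-specific classifiers.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz

def bandpower(eeg, low, high, fs=FS):
    """Average power of the signal within a frequency band (e.g. the 8-12 Hz mu band)."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return np.mean(filtered ** 2)

def detect_command(eeg, baseline_power, drop_ratio=0.6):
    """Imagined movement suppresses mu-band power over the motor cortex.
    If power falls well below the resting baseline, treat it as a deliberate command."""
    return bandpower(eeg, 8, 12) < drop_ratio * baseline_power

# Demo with two seconds of synthetic data: resting EEG vs. a suppressed mu rhythm.
t = np.arange(0, 2, 1 / FS)
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)           # strong 10 Hz rhythm
imagery = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # suppressed rhythm

baseline = bandpower(rest, 8, 12)
print("command detected:", detect_command(imagery, baseline))  # expected: True
```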

So what’s next? First up are many more medical applications, especially for people suffering from verbal communication problems or movement disorders. Severely disabled individuals, or those suffering from “locked-in syndrome,” would be prime candidates for such neural interfaces.

Then we may see the human brain being linked to various appliances and consumer devices ranging from cars and refrigerators to household lighting. Wouldn’t it be great, for instance, if you could just think “lights on” and your lights switched themselves on, or “oven on” and on it goes? Or maybe not. After all, our bodies need some physical activity to prevent them from wasting away. But perhaps even this problem can be solved using implants that stimulate our muscles.
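For the appliance side of the equation, no new physics is needed: once an interface has decoded an intent such as “lights on,” acting on it is ordinary home automation. The sketch below, with hypothetical intent labels, device names and confidence scores, shows one way the mapping from decoded thought to household command might look.

```python
# A minimal sketch of the "think 'lights on'" idea. The decoder output, device
# names and confidence values are all hypothetical placeholders.
from typing import Callable, Dict

class SmartHome:
    def __init__(self) -> None:
        # Map decoded intent labels to actions on (imaginary) networked appliances.
        self.actions: Dict[str, Callable[[], None]] = {
            "lights_on":  lambda: print("lights: ON"),
            "lights_off": lambda: print("lights: OFF"),
            "oven_on":    lambda: print("oven: preheating"),
        }

    def handle_intent(self, intent: str, confidence: float) -> None:
        # Only act on confident decodes; a misread thought shouldn't switch on the oven.
        if confidence < 0.9 or intent not in self.actions:
            return
        self.actions[intent]()

home = SmartHome()
home.handle_intent("lights_on", confidence=0.95)  # lights: ON
home.handle_intent("oven_on", confidence=0.55)    # ignored: decoder not sure enough
```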


All done for you

It’s 7 a.m. Sun is poking through the curtains in your bedroom, so you think about opening them, along with a small window to let in some fresh air. Both the curtains and the window immediately oblige. You get out of bed and walk into the bathroom, and the shower starts running the moment you think about it. It looks a little hot, so you think about adding some cold water and the steam starts to disappear. Downstairs you pass the coffee machine, which has already dispensed some coffee because it’s synchronized to switch itself on once the curtains open. At this point your fridge reminds you to do five minutes’ exercise to offset all the physical tasks you’ve skipped since waking. Is this your idea of heaven or hell?


We’re moving ahead so rapidly, it’s not going to be that long before we will be able to tell whether someone’s making up a story, or whether someone intended to do a crime with a certain degree of certainty.
Barbara Sahakian, Professor of Neuropsychology, Cambridge University

Data download

The dream, to some extent, is to allow people to download data directly into their brains. This could be useful educationally, allowing knowledge to be implanted directly into students’ heads, or it might one day allow people to download or share their dreams. It would certainly take computer gaming to a new level, and it introduces some intriguing philosophical questions about reality.

It also raises the question of whether we’ll one day be able to read other people’s brains remotely to find out what they’re thinking or what they’re planning to do in the future, which takes us directly to that classic sci-fi movie Minority Report and its Precrime police department. Again, it sounds somewhat fanciful, but I can almost guarantee that one day society will be debating the ethics of electronic eavesdropping on our innermost thoughts or the probing of future intentions.

And what of animals? This is most definitely fringe thinking, but it’s possible that consciousness exists as a continuum, with animals and even plants having varying degrees of consciousness. If that were true, and we could find a way of tapping into it and communicating with them, it would be quite literally mind-blowing, not least because a major ethical justification for killing animals is that they are not self-aware, or do not have knowledge of their own existence, in quite the same fundamental way that we human beings do. Can you imagine, for example, if we suddenly found a way to communicate with dolphins and they told us what they thought of us?


Force-powered telekinesis

A few years ago I bought a Star Wars Force Trainer toy in Sydney, Australia, for about AUD$150. It was for me. I wanted to see whether it was really possible to use a brain–machine interface (headset) to send brain waves to the toy, which would then start up a motorized fan that would send a small ping-pong ball up a tube. Did it work? Up to a point, although I get more use from my ballpoint pen that translates handwritten text into digital files and can make audio and video recordings of important meetings.
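For what it’s worth, here is a rough guess, with an assumed 0–100 concentration score and an invented speed mapping, at the kind of logic such a toy might use to turn how hard you focus into how fast the fan spins.

```python
# A rough sketch of what a concentration-driven toy appears to do: the headset
# reports a single attention score, and the toy maps it to fan speed, lifting the
# ball higher the harder you focus. The score range and mapping are assumptions,
# not taken from the toy's actual firmware.
def fan_speed(attention: int, threshold: int = 30, max_speed: int = 255) -> int:
    """Map a 0-100 attention score to a 0-255 fan power value, with a dead zone below the threshold."""
    if attention < threshold:
        return 0
    return round((attention - threshold) / (100 - threshold) * max_speed)

for score in (10, 40, 70, 100):
    print(score, "->", fan_speed(score))
# 10 -> 0, 40 -> 36, 70 -> 146, 100 -> 255
```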


What do we really know?

On another level, future possibilities include living in a world where almost any man-made device can be accessed, questioned or controlled by thought alone and where communication between individuals, even those geographically distant, is facilitated by a form of mental telepresence or psychic sixth-sense technology.

Once again, this probably sounds fanciful, but it’s not impossible. And if and when we do enter this realm, a number of questions emerge. The first is: how will we know that we really exist in the way we think we do? Perhaps we’ve always been living inside a computer simulation. If the simulation were sophisticated enough and determined the exact inputs fed to our brains, how could we tell the difference? This would be unpleasant if we couldn’t control it, but imagine the possibilities if we could. It’s not time travel or teleportation, but it’s getting very close.

the condensed idea

Thought control

timeline
2000 Electrode arrays implanted into owl monkeys
2001 Technology allows a monkey to operate a robotic arm via thought control
2006 Teenager plays Space Invaders using brain signals
2008 Scientists manage to extract images from a person’s mind
2009 Brain–Twitter interface
2017 Voice control replaces 70 percent of keyboards
2026 Google patents neural interface