Cyber Heads
Is technology good or bad for our brains? With around 2.5 billion internet users, fifteen million texts sent every minute and many people now spending more time gaming online than they do sleeping, are we all rapidly becoming mindless zombies, unable to interact with others on a face-to-face basis? Or are people getting all steamed up about the potential consequences of digital immersion when in reality it poses no major threat whatsoever to the future of the human race?
Internet search engines make a whole world of information instantly available to us, information that is literally at our fingertips. So why would anyone bother committing anything to memory when it can be pulled up on a screen within seconds? Labour-saving devices have unquestionably changed our lives beyond all recognition, but will the continuing tsunami of innovations leave us all with redundant brains, unable to do anything unaided?
As yet there are no conclusive answers to these questions, but whether all this technology proves to be good or bad for brains, you can rest assured that your brain will have been doing what brains do so brilliantly well – changing and adapting to meet the demands of the new technological environment it now operates in. No matter how old your brain happens to be, it will already have been busy reconfiguring and shaping up to embrace whatever new challenges this ever-expanding techno era throws at it.
Having said that, while our brains might have changed over the years, human nature hasn't. The main reason why we, out of the approximately 8.7 million species currently sharing this planet, have been so successful comes down largely to a deep-rooted desire to progress. Thanks to lightning-speed technology, we are all now advancing at an ever-increasing rate, with everybody expecting everything to be done in an instant – we want it now! And if we don't get what we want quickly enough, human nature dictates that we will look for a shortcut, and as soon as one becomes available, we'll take it!
With shortcuts at the very heart of human nature, we are all, unsurprisingly, more than happy to take the fastest option that modern technology can provide. And with the prospect of life improvement at the forefront of our minds, most of us will, at the earliest opportunity, interface with whatever the latest technology happens to be, in the belief that it will make life easier or more interesting.
We may always be looking for the fastest, easiest route forward, and with a whole array of “external brain” devices at our disposal, ranging from PCs to phones, we have found numerous new ways of shifting a lot of the workload onto them. But that doesn't necessarily mean our brains are going to have less to do, nor does it mean that they are going to become lazier and, having slipped into semi-dormancy, inevitably find themselves out of a job.
On the contrary, a lot of the groundwork might be done for us, but by offloading it we simply free up cognitive resources for a new pile of work, along with a whole new set of pressures and obstacles created by it, all of which will need to be dealt with at an ever-quickening rate. When calculators first came onto the scene, there were serious concerns that they would make brains lazy, and more recently, when internet search engines first appeared, there was much talk of them making minds intellectually stagnant. In both instances these worries have proved unfounded; the reality is that new doors have opened and brains, being brains, have moved on to bigger, more exciting challenges.
However much technology comes onto the scene, if we utilize it fully, our brains should be moving in exactly the opposite direction to idle. Having once been stretched to their limits to perform a particular function, they will find themselves having to rise to completely different tests in fresh areas.
For example, where a brain was once challenged by the prospect of map reading, advances in technology mean that, despite no longer having to be quite so proficient in that skill, it now has to rise to the challenge of operating a satnav and efficiently following its instructions. Those of us who have experienced setting off from A to go to B – only to end up at C – will know that this is no mean feat. Despite thousands of years' worth of inherited instinct and our gut feelings screaming at us that we are heading in completely the wrong direction, we still keep faith with the technology!
The bottom line is that whether or not a brain does get made redundant is of course up to its owner. Neither technology nor the brain itself can be blamed if through lack of activity it does get left behind. Provided you give it the opportunity to be stretched, it is more than capable of keeping up.
As far as brain health is concerned, making use of technology is not in itself a problem. What is currently causing alarm in some circles is the increasing number of people who are becoming permanently hooked into, and addictively dependent on, technology.
Even the less observant will have noticed the blind reliance that more and more people are placing in these devices. Taking a few moments to notice and reflect upon the behaviour of the people in your immediate vicinity will surely convince you that an obsession with gadgetry is taking over people's lives. Walk down any busy street in any town and it won't be long before you see someone scurrying along the pavement, head down, squinting at some device or other – only to step out into oncoming traffic without looking.
With minds elsewhere, these digital lemmings seem completely oblivious at times to just how close they are to eradicating themselves from the human gene pool. Perhaps their need to be permanently technologically engaged drowns out their awareness of everything else around them, or maybe an overestimation of their own ability to multitask is leading to cognitive drawbacks more subtle than being bounced off a bus.
Whatever the reason, many people are becoming too dependent on new technology for their own good. One of the classic measures of overdependence is automated, unthinking behaviour. How often have you seen people in your midst failing to resist the temptation to pull out their phone the moment they hear it beep, buzz or ring, or feel it vibrate? Have you noticed that this happens irrespective of whether the circumstances make it appropriate to do so? The most popular times seem to be during meetings, in restaurants or whilst attempting tricky driving manoeuvres.
Should such unsociable, at times rude, and potentially dangerous tech habits be tolerated on the basis that “you can't stop progress”? There is no definitive answer to this, as it is down to each and every one of us to decide for ourselves. One thing, however, is becoming apparent: whether it is you, family members, friends, colleagues or random strangers you've witnessed doing this, it's a fairly safe bet that you are now so familiar with these scenarios that you're beginning to accept them as the norm.
To date, most of the available evidence on whether digital technology is good or bad for your brain is purely anecdotal. Studies are being conducted at this very moment to provide hard data that will establish whether our obsessive use of gadgets is having unintended consequences. In the meantime, there are a few studies that have already hit the academic press from which we can begin to forecast future findings.
Technology in itself is neither good nor bad; the problem lies in how we use it. Your malleable brain, as you are now more than aware, will accommodate the demands of any environment, whether that environment is physical or virtual. This ongoing accommodation will happen for better or for worse, whether you like it or not, for as long as you continue to engage regularly, intensively and consistently with any given technology.
In addition to this, as we know, old habits – once formed through repetition – die hard. For instance, eating habits adopted in childhood (when metabolism is relatively high) almost always continue into adulthood (when metabolism inevitably slows down). The consequent excess of calories leads to an ever-expanding waistline, a scenario familiar to all but the most disciplined of eaters. The same principle can be applied to technology. Once a person develops a reliance on technology, not to mention an expectation of regular messages and online updates, they can end up panicked by any interruption to the flow of communication. They may be thrown into a hissy fit when unable to get a connection or, worse still, fall into a spiral of depression when a whole day passes by without hearing the reassuring ping of messages arriving in their inboxes.
For a long time the big concern was the amount of time people spent watching television, with some spending up to two months a year glued to the screen. With the average household now containing more screens than people, the latest worry is the amount of time being devoted to gaming. With so many people spending huge chunks of their lives playing, a major concern in recent decades has been that the violent nature of many titles might be producing a new generation of morally corrupt individuals. It turns out there is little evidence to support this.
In both cases, whether excessive TV watching or excessive video game playing, the real root cause of the problem is displacement: displacement of time that could otherwise be spent socializing face-to-face, for instance, and thereby gaining the valuable experience that allows the all-important “soft skills” of communication to develop. Major problems with technology arise when digital immersion swallows up all the spare time that might otherwise be devoted to real-world pursuits such as group activities or sport. The brain pathways involved in such activities either don't develop properly in the first place or start to fade away.
It's very much about getting the balance right. There are in fact several benefits to be had when gaming enthusiasts clock up many hours playing action video games. Doing so forces their brains to adapt to the perceptual and cognitive demands of such virtual worlds, leading to unexpected, positive enhancements in several areas. Compared with non-action video gaming, intensive action video gaming has been shown to improve visual perception, visual short-term memory, spatial cognition, mental rotation, multitasking and rapid decision-making.
“Our youths love luxury. They have bad manners, contempt for authority – they show disrespect for their elders and love to chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when their elders enter the room. They contradict their parents, chatter before company, gobble up food, and tyrannize teachers.”
In addition, parents' concerns that too much time spent staring at the screen might ruin their kids' eyesight have also proved to be completely unfounded. On the contrary, gaming may actually improve certain aspects of vision, so much so that video gaming may now be prescribed as therapy to help people with visual problems associated with conditions such as amblyopia (“lazy eye”).
Another aspect of displacement worth mentioning is that, unless you're managing to exercise at the same time, spending hours on end in front of a screen, whether it be a TV, PC, laptop, tablet, games console or phone, isn't going to do anyone any favours in the physical health department. Excessive screen watching has been directly linked to obesity, increased cardiovascular disease and Type 2 diabetes.
Not that long ago it was rare for people to watch TV whilst simultaneously surfing the internet; now such dual screening is commonplace. Studies investigating this behaviour have revealed that during a 30-minute viewing period people will switch between the two screens on average around one hundred and twenty times. That's about four times per minute or once every fifteen seconds! Might such behaviour herald a new era of prolific multitasking that enables us to squeeze yet greater efficiency out of our busy, information overloaded lives? Probably not.
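For readers who like to see the figures worked through, here is a minimal sketch in Python; it is purely illustrative, and the variable names and values simply restate the numbers quoted above, showing how roughly 120 switches in a 30-minute session translate into the per-minute and per-second rates.

```python
# Illustrative check of the dual-screening figures quoted above.
switches = 120         # reported average number of screen switches per session
session_minutes = 30   # length of the viewing session in minutes

switches_per_minute = switches / session_minutes              # ~4 switches per minute
seconds_between_switches = (session_minutes * 60) / switches  # ~15 seconds per switch

print(f"{switches_per_minute:.0f} switches per minute, "
      f"one every {seconds_between_switches:.0f} seconds")
```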
The first thing to get to grips with is that there is no such thing as true multitasking. Our brains have yet to evolve the capacity to perform two completely different cognitive processes simultaneously. Mental tasks that feel as if they are being done in parallel actually involve rapid switching between the two. And any time a human brain switches from one task to another, there is an associated cost. You don't quite pick up where you left off when your mind returns to the “other” task. There is always a slight delay in remembering exactly where you were in the thought process and in recommencing it.
Women are famously good at multitasking, but the impression that these tasks are all happening in parallel is purely an illusion. What makes women good at doing multiple things at once is a superior ability, compared with most men, to minimize the cost associated with each switch. Whilst some people, through regular, intense and consistent training, have become extremely adept at making this process as efficient as possible, there is nonetheless always a small, unavoidable cognitive cost associated with each switch between tasks. In men the switch cost tends to be larger than in women, but in both sexes there is a measurable negative consequence of switching between multiple cognitive tasks rather than focusing on one task through to completion.
Smartphones are a constant source of unintended distraction. “Unintended” is the operative word here, because “intended” distraction is ideal for encouraging certain useful brain states, as we'll discover in the following chapter. Creative thinking really benefits from a bit of distraction, especially when a person's brain is cluttered with anxious thoughts and stuck in rigid ruts, struggling with seemingly unsolvable problems. The “intended” distraction shifts the brain into a different gear, de-focusing the cognitive machinery so that unrelated ideas can flow more freely.
The problem with “unintended” distractions from phone alerts notifying you of a text message, email, call, or online social networking update is that your attention is repeatedly pulled away from your thought processes. Each time this happens your brain incurs a switch cost. If you allow your environment to constantly interrupt your thought processes then all those little distractions will add up to a very unproductive day at the office.
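To make the cumulative effect concrete, here is a minimal sketch, again in Python, in which the number of alerts per hour and the cost of each switch are entirely hypothetical, assumed figures rather than measurements from any study; it simply shows how quickly even small switch costs can add up over a working day.

```python
# Hypothetical illustration of how small switch costs accumulate.
# All figures below are assumptions chosen purely for illustration.
alerts_per_hour = 12       # assumed: one alert roughly every five minutes
switch_cost_seconds = 30   # assumed: time lost regaining your train of thought per alert
working_hours = 8          # assumed length of the working day

total_interruptions = alerts_per_hour * working_hours
minutes_lost = total_interruptions * switch_cost_seconds / 60

print(f"{total_interruptions} interruptions, "
      f"costing roughly {minutes_lost:.0f} minutes of focus per day")
```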
But won't our ever-changing brains adapt to help us perform better in any environment? Surely brain pathways will be reconfigured to help us block out these minor distractions? It is possible, but unless you adopt a specific strategy to train up such useful brain adaptations over many weeks and months, all the research carried out so far indicates that this is highly unlikely to happen spontaneously.
Scientific investigation into the effects of multitasking and constant interruption on our cognitive abilities is in its infancy. Early findings are extremely interesting and provide some valuable clues about what we are likely to discover as the experimental data continue to accumulate. In a comparison of heavy and low media multitaskers, it was found that people who regularly use technology to do multiple tasks at the same time are less able to ignore distractions than those who don't. The task in question involved a display of lines tilted at different angles and surrounded by a varying number of additional lines serving as distractors. The performance of the heavy-media multitaskers declined as more visual distractors were added, whereas the performance of the low-media multitaskers remained stable no matter how many were added. In other words, the low-media multitaskers had retained the ability to stop the distractors from interfering with the cognitive task, but the heavy-media multitaskers had lost it.
One likely explanation for these data is that daily, intensive and consistent media multitasking has led to brain changes that make heavy multitaskers more, not less, sensitive to distraction. They may have unwittingly trained their brains to refocus automatically on any external information that arises in their environment. The only way to prove that these behaviours actually cause increased sensitivity to distraction would be to compare measurements taken before and after people started engaging in heavy multitasking. You can be certain that someone, somewhere, will be conducting that research right now. In the meantime you might want to think twice about checking your emails a hundred times a day.
Assuming that further studies investigating these phenomena all point in the same direction, the implications are clear. We need to stop and think about how we use the wonderful tools of digital technology so that we harness their many benefits without falling into the trap of inadvertently training up cognitive processes that, overall, serve us poorly. We can then each establish our own set of rules to help us maximize the benefits and minimize the cognitive drawbacks. For instance, next time you get a new phone, actually read the instruction manual. Not the whole thing, just the part that guides you through changing which alerts make it buzz and bleep, and which silently register the message without causing an “unintended” distraction. Switch off all alerts that are rarely time critical.
Ask yourself – do your text messages really require instant attention? Do you have to be alerted to every email that lands in your inbox the second it arrives? Is the news that a random stranger is now following you on Twitter urgent enough to risk halting a fascinating conversation, stealing precious time spent with family and friends, derailing a productive flow of thought, crashing a car or stepping out into the road? Or can these relatively low-priority interruptions wait till later?