Cyber Heads

Brain for sale – no longer needed

Is technology good or bad for our brains? With around 2.5 billion internet users, fifteen million texts being sent every minute and many people now spending more time online gaming than they do sleeping, are we all rapidly becoming mindless zombies unable to interact with others on a face-to-face basis? Or are people getting all steamed up about potential consequences of digital immersion that, in reality, pose no major threat whatsoever to the future of the human race?


cmp4-fig-5003 The more Facebook friends a person has, the greater the grey matter density in brain areas involved in social interactions.

Internet search engines make a whole world of information instantly available to us, information that is literally at our fingertips. So why would anyone want to bother committing anything to memory when it can be pulled up on a screen within nanoseconds? Labour-saving devices have unquestionably changed our lives beyond all recognition, but will the continuing tsunami of innovations leave us all with redundant brains that are unable to do anything unaided?

As yet there are no conclusive answers to these questions, but whether all this technology proves to be good or bad for our brains, you can rest assured that your brain will have been doing what brains do so brilliantly well – changing and adapting to meet the demands of the new technological environment it now operates in. No matter how old your brain happens to be, it will already have been busy reconfiguring and shaping up to embrace whatever new challenges this ever-expanding techno era happens to throw at it.

I want it now!

Having said that, our brains might have changed over the years but human nature hasn't. The main reason why we, of the approximately 8.7 million species currently sharing this planet, have been so successful is a deep-rooted desire to progress. Thanks to lightning-speed technology, we are now all advancing at an ever-increasing rate, with everybody expecting everything to be done in an instant – we want it now! And, if we don't get what we want quickly enough, human nature dictates that we will look for a shortcut, and as soon as one becomes available, we'll take it!


Fast food expectations
When you read a microwave meal's instructions and they say “Microwave for two and a half minutes, stir well, re-cover and then microwave for a further two and a half minutes before stirring again and serving”, are you disappointed that it's going to be such a laborious process?

Are you dismayed that it's going to take so long, that you're going to have to wait at least five minutes before getting to eat it, and that you're actually going to have to bother taking it out halfway through to stir it?!

With shortcuts at the very heart of human nature, we are all, unsurprisingly, more than happy to take the fastest option that modern technology can provide. And, with the prospect of life improvement at the forefront of our minds, most of us will, at the earliest opportunity, interface with whatever the latest technology happens to be, in the belief that it's going to make life easier or more interesting.

We may always be looking for the fastest, easiest route forward and, with a whole array of “external brain” devices at our disposal, ranging from PCs to phones, we have found numerous new ways of shifting a lot of the workload onto them. But that doesn't necessarily mean our brains are going to have less to do, nor does it mean that they are going to become lazier and, having become semi-dormant, inevitably find themselves out of a job.

cmp4-fig-5001

On the contrary, a lot of the groundwork might be done for us, but by shifting it we simply free up cognitive resources for a new pile of work, along with a whole new set of pressures and obstacles created by it – all of which will need to be overcome and dealt with at an ever-quickening rate. When calculators first came onto the scene, there were serious concerns that they would make brains lazy and, more recently, when internet search engines first appeared, there was much talk about them making minds intellectually stagnant. In both instances, these worries proved unfounded; the reality is that new doors have opened and brains, being brains, have moved on to bigger, more exciting challenges.

However much technology comes onto the scene, if we fully utilize it, our brains should be heading in the exact opposite direction to idleness. Having once been stretched to their limits to perform a particular function, they should find themselves having to rise to completely different tests in fresh areas.

For example, where a brain was once challenged by the prospect of map reading, advances in technology mean that, although it no longer has to be quite so proficient in that skill, it will have to rise to the challenge of operating a satnav and efficiently following its instructions. Those of us who have experienced setting off from A to go to B – only to end up in C – will know that this is no mean feat. Despite thousands of years' worth of inherited instinct and our gut feelings screaming at us that we are heading in completely the wrong direction, we still keep faith with the technology!


Forever lost
Your satnav doesn't have common sense – you do. Map reading and navigation are useful skills to have, particularly in the event of a technical hiccup. If you do want to hold onto them, maintaining your self-navigational skills is simple. Don't rely on satnav all the time, especially when you want to get to places you've been to many times before. Think of all those drivers of London's black cabs whose enlarged hippocampi shrink back down after they retire – keep on refreshing the navigation pathways of your brain. Before switching on the satnav, take a look at your route on a map to give yourself an idea of where you're going and, hopefully, the next time you find yourself driving the wrong way down a one-way street, in the middle of a building site or along a road that doesn't exist, common sense will prevail.

The bottom line is that whether or not a brain does get made redundant is, of course, up to its owner. Neither technology nor the brain itself can be blamed if, through lack of activity, it does get left behind. Provided you give it the opportunity to be stretched, it is more than capable of keeping up.

External brain reliance

As far as brain health is concerned, making use of technology is not in itself a problem. What is currently causing alarm in some circles is the increasing number of people who are becoming permanently hooked into, and addictively dependent on, technology.

Even the less observant will have noticed the blind reliance that more and more people are placing on these devices. Taking a few moments to notice and reflect upon the behaviour of the people in your immediate vicinity will surely convince you that an obsession with gadgetry is taking over people's lives. Walk down any busy street in any town and it won't be long before you see someone scurrying along the pavement, head down, squinting at some device or other – only to step out into oncoming traffic without looking.

With minds elsewhere, these digital lemmings seem completely oblivious at times to just how close they are to eradicating themselves from the human gene pool. Perhaps their need to be permanently technologically engaged drowns out their awareness of everything else around them, or maybe an overestimation of their own ability to multitask is leading to cognitive drawbacks more subtle than being bounced off a bus.

Whatever the reason, many people are becoming too dependent on new technology for their own good. One of the classic measures of overdependence is automated, unthinking behaviour. How often have you seen people in your midst failing to resist the temptation to pull out their phone the moment they hear it beep, buzz or ring, or feel it vibrate? Have you noticed that this happens irrespective of whether the circumstances make it appropriate to do so? The most popular times seem to be during meetings, in restaurants or whilst attempting tricky driving manoeuvres.

Should such unsociable, at times rude, and potentially dangerous tech habits be tolerated on the basis that “you can't stop progress”? There is no definitive answer to this as it is down to each and every one of us to decide for ourselves. There is, however, one thing that is becoming apparent: whether it is you, family members, friends, colleagues or random strangers you've witnessed doing this, it's a fairly safe bet that you are now so familiar with these scenarios that you're beginning to accept them as the norm.


Yesterday's “black art” – today's norm
As an IT salesperson back in the 1980s I earned a lot of commission selling fax machines. After watching my well-rehearsed product demonstration, in which a document was transferred between two “facsimile” devices, people's eyes would light up in disbelief as they struggled to comprehend what they had just seen. As if by magic, the very same document sent from one machine would slowly emerge from the other. The endless possibilities would immediately become apparent to those beholding this wonderment, and I could once again look forward to smashing my target. Alas, the window of opportunity, as with all the other IT products I went on to sell, would suddenly close. What was once regarded as the very latest “black art” technology soon became the norm.

It's hard to comprehend that a fax machine is something that people once got very excited about, that not that long ago people were astonished to find themselves talking to one another whilst “out and about” on phones with no wires, or that it was once amazing to see and hear someone right across the other side of the world on a laptop screen.

IT product development is now moving on so fast that whatever you see on sale in the shops or online has already been superseded – you're looking at yesterday's technology. The most interesting thing about this once fascinating fact is that, with your brain's ability to embrace all things new, it probably comes as no surprise to you whatsoever!

– Adrian

Will technology ruin our brains?

To date, most of the available evidence on whether digital technology is good or bad for your brain is purely anecdotal. Studies are being conducted at this very moment to provide hard data that will establish whether our obsessive use of gadgets is having unintended consequences. In the meantime, there are a few studies that have already hit the academic press from which we can begin to forecast future findings.

Technology in itself is neither good nor bad; the problem lies in how we use it. Your malleable brain, as you are now more than aware, will accommodate the demands of any environment, whether that environment is physical or virtual. This ongoing accommodation will happen for better or for worse, whether you like it or not, as long as you continue to engage regularly, intensively and consistently with any given technology.

In addition to this, as we know, old habits – once formed through repetition – die hard. For instance, eating habits adopted in childhood (when metabolism is relatively high) almost always continue into adulthood (when metabolism inevitably slows down). The consequent excess of calories leads to an ever-expanding waistline, a scenario familiar to all but the most disciplined of eaters. The same principle can be applied to technology. Once a person develops a reliance on technology, not to mention an expectation of regular messages and online updates, they can end up panicked by any interruption to the flow of communication. They may throw a hissy fit when unable to get a connection or, worse still, fall into a spiral of depression when a whole day passes by without hearing the reassuring ping of messages arriving in their inboxes.


The elephant in the room
Access: In 1984, there were one thousand devices hooked up to the internet across the globe. Eight years later in 1992, this number hit the one million mark. The one billion mark was crossed in 2008.

Quantity: There were 4 exabytes of new data created in 2012 – that's four billion billion bytes – more new information created in a single year than in the 5,000 preceding years put together.

Staying in to play

For a long time the big concern was the amount of time people spent watching television, with some spending up to two months a year glued to the screen. Now that the average household has more screens in it than people, the latest worry is over the amount of time being devoted to gaming. With so many people spending huge chunks of their lives playing, a major concern in recent decades has been that the violent nature of many titles might be creating a new generation of morally corrupt individuals. It turns out there is little evidence to support this.

With both excessive TV watching and excessive video game playing, the real root of the problem is displacement – displacement of time that could be spent socializing face-to-face, for instance, and thereby gaining the valuable experience that allows the all-important “soft skills” of communication to develop. Major problems with technology arise when digital immersion displaces all the spare time that might otherwise be devoted to real-world engagement, such as joining group activities or taking part in sports. The brain pathways involved in such activities either don't develop properly in the first place or start to fade away.

It's very much about getting the balance right. There are in fact several benefits to be had when gaming enthusiasts clock up many hours playing action video games. Doing so forces their brains to adapt to the perceptual and cognitive demands of such virtual worlds, leading to unexpected, positive enhancements in several areas. Compared with non-action video gaming, intensive action video gaming has been shown to improve visual perception, visual short-term memory, spatial cognition, mental rotation, multitasking and rapid decision-making.


The trouble with the youth of today
When driving my car I often stop just down the road from where I live to let a group of teenagers coming home from school cross the road. Every time I do this, I get the distinct feeling that both my car and I are invisible. I don't think that in the past two years any of them has ever put their hand up in acknowledgement of me stopping for them, or even given me so much as a nod of recognition that I exist. This worries me.

It could be that they simply lack the confidence to engage with others outside their group. Maybe they just don't have the social skills to do it or perhaps common courtesy isn't particularly high up on their agenda. They appear to be so wrapped up in their own worlds that they are oblivious to everything around them. I've often thought that the possible source of this disconnected, insular behaviour could be today's technology. It would be easy for me to attribute it to them being tech natives and assume their preferred mode of communication must be via email or text.

Yes, I could put it all down to technology, but then I'm always reminded of this:

“Our youths love luxury. They have bad manners, contempt for authority – they show disrespect for their elders and love to chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when their elders enter the room. They contradict their parents, chatter before company, gobble up food, and tyrannize teachers.”

It's a quote attributed to Socrates, the famous Greek philosopher who lived from 469 to 399 BC. It would seem that in two and a half thousand years, technology or no technology, some things just haven't changed!

– Adrian

In addition, parents' concerns that too much time spent staring at a screen might ruin their kids' eyesight have also proved to be completely unfounded. On the contrary, gaming may actually improve certain aspects of vision – so much so that video games may now be prescribed as therapy to help people with visual problems associated with conditions such as amblyopia (“lazy eye”).

Another aspect of displacement worth mentioning is that, unless you're managing to exercise at the same time, spending hours on end in front of a screen – whether a TV, PC, laptop, tablet, games console or phone – isn't going to do anyone any favours in the physical health department. Excessive screen time has been directly linked to obesity, increased risk of cardiovascular disease and Type 2 diabetes.


Fatal attraction
In South Korea, a 41-year-old man and a 27-year-old woman became so obsessed with an online role-play game involving caring for a virtual girl that, in a horrifyingly ironic twist of fate, they accidentally starved their own, real-life, three-month-old baby girl to death.

In 2005, another South Korean, a Mr Lee, dropped dead of heart failure after playing a game called StarCraft for over 50 hours straight.

In 2007, a Chinese man called Zhang died suddenly after playing World of Warcraft continuously for seven days.

These cases, and many others, have led to “Internet Gaming Disorder” being listed as a condition for further study in the psychiatric profession's official diagnostic manual (DSM-5).

You'd be forgiven for thinking that this is a problem specific to East Asia, but that isn't the case. Digital innovations are simply made available and embraced earliest in these countries, before much of the rest of the world gets its hands on them. Observing the negative outcomes of digital immersion amongst early adopters gives the rest of the world a valuable heads-up that might help others take measures to avoid such lethal scenarios.

Multitasking?

Not that long ago it was rare for people to watch TV whilst simultaneously surfing the internet; now such dual screening is commonplace. Studies investigating this behaviour have revealed that during a 30-minute viewing period people will switch between the two screens on average around one hundred and twenty times. That's about four times per minute or once every fifteen seconds! Might such behaviour herald a new era of prolific multitasking that enables us to squeeze yet greater efficiency out of our busy, information overloaded lives? Probably not.

cmp4-fig-5002

The first thing to get to grips with is that there is no such thing as true multitasking. Our brains have yet to evolve the capacity to perform two completely different cognitive processes simultaneously. Mental tasks that feel like they are being done in parallel actually involve rapid switching between them. And any time a human brain switches from one task to another, there is an associated cost. You don't quite pick up where you left off when your mind returns to the “other” task; there is always a slight delay in remembering exactly where you were and in recommencing that thought process.

Women are famously good at multitasking, but the sense that these tasks are all happening in parallel is purely an illusion. The reason women are good at doing multiple things at once revolves around a superior ability, compared with most men, to minimize the cost associated with each switch. Whilst some people, through regular, intense and consistent training, have become extremely adept at making this process as efficient as possible, there is nonetheless always a small, unavoidable cognitive cost associated with each switch between tasks. In men the switch cost tends to be larger than in women, but in both sexes there is a measurable negative consequence of switching between multiple cognitive tasks rather than focusing on one task through to completion.

The buzz and bleep of modern living

Smartphones are a constant source of unintended distraction. “Unintended” is the operative word here because “intended” distraction is ideal for encouraging certain useful brain states, as we'll discover in the following chapter. Creative thinking really benefits from a bit of distraction, especially when a person's brain is cluttered with anxious thoughts and stuck in rigid ruts, struggling with seemingly unsolvable problems. An “intended” distraction shifts the brain into a different gear, de-focusing the cognitive machinery so that unrelated ideas can flow more freely.

The problem with “unintended” distractions from phone alerts notifying you of a text message, email, call, or online social networking update is that your attention is repeatedly pulled away from your thought processes. Each time this happens your brain incurs a switch cost. If you allow your environment to constantly interrupt your thought processes then all those little distractions will add up to a very unproductive day at the office.

But won't our ever-changing brains adapt to help us perform better in any environment? Surely brain pathways will be reconfigured to help us block out these minor distractions? It is possible, but unless you adopt a specific strategy to train up such useful brain adaptations over many weeks and months, all the research carried out so far indicates that this is highly unlikely to happen spontaneously.


Instant response pressure
The virtual world of social and professional networking is cunningly designed to lure us into patterns of use that make us addicted to constant interaction. It is generating a whole new world of stress. Teenagers and adults alike are often expected to reply immediately to any text, email, instant message or online social networking message. Teenagers who do not respond straight away risk being socially excluded, and adults face the possibility of losing out at work or in business. Many people now feel a tremendous pressure to be available constantly, day and night, and throughout the weekend.

The ultimate cause is other people's expectations of an immediate response. It would be wise for all of us to do whatever we can to change those expectations and create some distance between us and the cyber-onslaught of constant connectivity. What creates these expectations in the first place is the all-powerful influence of instant gratification. Activity increases along the brain's pleasure pathways (the Reward Line – please see the brain tube map on page 8) when they are temporarily satisfied by a quick message response. Conversely, not receiving an expected reply leads to decreased activity along the Reward Line, causing feelings of disappointment and anxiety.

If we are to have any hope of changing other people's expectations, we need to make it clear to all: “Sorry, but you probably won't receive a response from me straight away, so please don't expect one. I will, however, get back to you within the next 24 hours. Thank you.”

We can't change a culture of high expectations overnight, but little by little – starting with friends and family, then moving on to professional connections – once everyone realizes just how important this is, we could eventually regain the right to reply at a time that is right for us.

Brain training ourselves to distraction

Scientific investigation into the effects of multitasking and constant interruption on our cognitive abilities is in its infancy. Early findings are extremely interesting and provide some valuable clues about what we are likely to discover as the experimental data continue to accumulate. In a comparison of heavy and light media multitaskers, it was found that people who regularly use technology to do multiple tasks at the same time are less able to ignore distractions than those who don't. The task in question involved a display of lines tilted at different angles and surrounded by a varying number of additional lines serving as distractors. The performance of the heavy media multitaskers declined as more visual distractors were added. The performance of the light media multitaskers, however, remained stable no matter how many additional distractors appeared. In other words, the light media multitaskers had retained the ability to stop the distractors from interfering with the cognitive task, but the heavy media multitaskers had lost it.

One likely explanation for these findings is that daily, intensive and consistent media multitasking has led to brain changes that make heavy multitaskers more, not less, sensitive to distraction. They may have unwittingly trained their brains to refocus automatically on any external information that arises in their environment. The only way to prove that these behaviours actually cause increased sensitivity to distraction would be to compare measurements taken before and after people start engaging in heavy multitasking. You can be certain that someone, somewhere, will be conducting research into it right now. In the meantime, you might want to think twice about checking your emails a hundred times a day.


cmp4-fig-5003 Heavy media multitaskers are generally worse at controlling their impulses and score lower in tests of fluid intelligence than light media multitaskers.

Assuming that further studies investigating these phenomena all point in the same direction, the implications are clear. We need to stop and think about how we use the wonderful tools of digital technology so that we harness their many benefits without falling into the trap of inadvertently training up cognitive processes that, overall, serve us poorly. We can then each establish our own set of rules to help us maximize the benefits and minimize the cognitive drawbacks. For instance, next time you get a new phone, actually read the instruction manual – not the whole thing, just the part that guides you through changing which alerts make it buzz and bleep and which silently register the message without causing an “unintended” distraction. Switch off alerts for anything that is rarely time critical.

Ask yourself – do your text messages really require instant attention? Do you have to be alerted to every email that lands in your inbox the second it arrives? Is the news that a random stranger is now following you on Twitter urgent enough to risk halting a fascinating conversation, stealing precious time spent with family and friends, derailing a productive flow of thought, crashing a car or stepping out into the road? Or can these relatively low-priority interruptions wait till later?


Digital addiction litmus test
How often do you feel you need to pull your phone out and check it for messages?

Having given yourself an honest answer, try this: switch your phone to silent and hide it away somewhere for sixty minutes at a time for just one day. Only allow yourself to have a look once an hour – no peeking!

Mankind has managed to get by for thousands of years without handheld communication devices, and yet research suggests that on average we check our smartphones every six and a half minutes – roughly 150 times across a typical sixteen-hour waking day!

Over the next five days, keep a tally of how often you find yourself picking up your phone and checking it. You might be surprised at just how addicted you've become. Surely once an hour is enough to stay on top of everything. Isn't it?!

Chapter takeaways