3am: he lay awake again. The same noises shuffled in from the lawn; formless noises like footsteps and whispers or wind and softly groaning trees. Every day that week on his way to work, he had noticed the same bearded man standing on the corner near the station café behind dark sunglasses that seemed to track his progress across the platform. He wondered if he was being followed as he passed the newsstand where headlines flashed the trauma of nearly 200 people killed and thousands injured in a terrorist bombing in Madrid – Spain’s own 9/11. At first he thought he was simply being paranoid. Had helping a Taliban supporter in a custody battle over a year ago come back to haunt him? 4am: more muffled voices in the yard. Every day he grew more certain that he was being watched. Doors that were open when his wife left home were locked when she returned. Small things were out of place in the house, in his bathroom. He prayed to Allah that his family would be safe. After all, he had done nothing wrong and had nothing to fear.
This is the position Brandon Mayfield was in on 6 May 2004 when two FBI agents knocked on the door of his law office in Portland, Oregon. At first they were polite, just wanting to ask him a few questions, but as they eventually forced their way through the door under cover of a warrant for his arrest, Brandon knew something more serious was going on – really serious. He soon learnt that his fingerprint had been found on a bag of detonators near the scene of the terrorist bombings in Spain. The print was a near-perfect match.
Mayfield had converted to Islam to marry his Egyptian wife, Mona. He had defended a convicted terrorist in a child custody case, giving him known links to a terrorist organisation. His computer held further “evidence” against him in the form of searches for “flights to Spain” and “flying lessons”. In addition, he’d received combat training during his time in the US military. Surely Mayfield was a terrorist and the FBI had their man.
Brandon Mayfield was a lawyer and a father of four who gave his time and expertise to help those who couldn’t afford high legal fees. His passport had long since expired and he hadn’t left the US in over 10 years. Despite this, the FBI had a solid case against him, with only the opinion of an independent forensic expert outstanding. This expert was brought in to verify the FBI’s findings and his verdict would largely seal Mayfield’s fate. Surely this man would find the error in the FBI’s analysis; surely this was all a huge mistake? Unfortunately, the forensic expert also decreed with certainty that the print on the bag in Spain was indeed that of the accused. “100% Verified” declared the FBI report. The full weight of the justice system now pinned Mayfield to the wall, yet he maintained he was wrongfully accused.
The Spanish government wasn’t so sure. After Mayfield had been in custody for two weeks, it emerged that the Spanish police had repeatedly informed the FBI that they had found another match for the print on the detonator bag – an Algerian national. Their suspect had a credible motive and had actually been in Spain at the time. Given this new information, the FBI was obliged to release Mayfield.
Former CIA employee, contractor to the NSA and whistle-blower, Edward Snowden, has publicly spoken out about the dangers of mass surveillance, for both ethical and practical reasons. An enormous pond of stagnant data allows anyone with the necessary clearance to retroactively mine pieces of information, pick out the ones that confirm the theory of the day and create a plausible story to back it up. In other words, reshaping the puzzle piece to fit the hole in the puzzle. The Mayfield case is particularly striking because of the number of “experts” convinced of his guilt despite the lack of any concrete evidence. It seems that there was enough information available for these officials to pick the facts that fitted the case against him. They suffered from a rather public case of confirmation bias, aided and abetted by the number of “facts” available to them. The results showed that their thinking tools weren’t as sharp and well calibrated as they should have been. Thankfully the Spanish police acted as a counterbalance in this case, but what about the many cases where the decision of a single agency, group or even person is relied upon?
What does any of this have to do with your child? As always, let’s start with what this has to do with you.
No matter what you do for a living, your past, present and future result from a collection of small and large decisions strung together across time and circumstance, incorporating varying measures of random events, or chance. Even not making a decision is a choice, seldom taken lightly. As a professional decision maker, what tools do you use to make decisions? Yes, you use your brain and information, perhaps even computer programs to generate smart information from data. But decision making is neither a linear nor a purely rational process, and an array of diverse tools and talents is called upon. These include:
1. The brain: Understanding how it processes information and biases decisions
2. Information: The ability to recognise and source good quality information
3. Other people: Understanding how they frame and present information
4. Education: Using knowledge as the building blocks of thought
5. History and experiences: Learning from one’s own and the experiences of others
6. Emotions: Understanding the influence of emotions on thinking
7. Decision process: Developing a repeatable process that allows for reflection on and improvement in decision making
We’ve explored the development of your child’s brain in some detail. Now let’s look at how this tool does its job and the enormous role it plays in both helping and hindering our ability to think soundly and make good decisions.
Our bountiful brains are extremely resource intensive and can consume up to 20% of the body’s glucose and oxygen supply. Fully aware of its own appetite, our processor will try to conserve energy whenever possible, using an array of energy-saving devices. Mental biases and heuristics, or shortcuts, are such devices. Not only is our brain a fuel guzzler but it’s also a slower processor than we’d like it to be. When I ask audiences which they think is a faster data processor: their conscious brain or a quad-core computer, I am still amazed that the majority pick the conscious brain as the faster of the two. Would you have picked the brain? We’d like to think that we walk around with something that can’t be replicated by a machine and definitely not improved upon. But if I gave you a spreadsheet with 100 figures to be multiplied together, would you say, “Oh, don’t bother with Excel, I can calculate it faster in my head”? No? I didn’t think so.
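If you’re curious just how lopsided that race is, here’s a rough sketch in Python (the exact timing will vary from machine to machine, and the 100 figures are simply random numbers I’ve made up for the test):

    import math, random, time

    figures = [random.uniform(0.5, 2.0) for _ in range(100)]  # 100 made-up figures to multiply together

    start = time.perf_counter()
    product = math.prod(figures)                              # the entire "spreadsheet" calculation
    elapsed = time.perf_counter() - start

    print(f"Product of 100 figures: {product:.4g}, computed in {elapsed * 1_000_000:.1f} microseconds")

On an ordinary laptop this comes back in a few microseconds, rather less time than it takes us to find a pencil.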
We may be able to perceive and process a variety of more subtle data that computers can’t but when it comes to raw processing power, our brains chug along at a fraction of the speed of an old 56K modem. Remember those? The only thing that I remember about them was having to dial up to some far away server and then wait for a tiny trickle of bandwidth to connect me to the slow motion magic of the world wide web. Our brains process information even slower, way slower.
Yet some people appear to make good judgements really quickly, such as the paramedic who must weigh up a tremendous amount of information and make a life or death decision rather quickly. The same goes for a fighter pilot, soldier or firefighter. Or the CEO who is under pressure at a board meeting to deliver a quick decision. Each of these people has a repository of past knowledge and experiences feeding into their decision making process at speeds much faster than our prefrontal cortex (the conscious processor of the brain) can manage. Such information can be bundled under the label of intuition or, as some researchers have labelled it, “somatic markers”1. In the case of a soldier or fighter pilot, they have also honed and trained their decision making through many hours in a simulator and combat training. In much the same way, executives have a range of painful and successful experiences to call upon and hopefully, some introspection into their thinking as well. Our children are still building up this huge database of decision outcomes and so many of their challenges, like many found in the world of counterterrorism, are new and uncharted.
Somatic markers and training aren’t enough to overcome the limitations of both a slow and a resource intensive brain. For that we add mental shortcuts (or heuristics) into the thinking process as well. A stereotype is such a shortcut. We are able to gather enough information to make a decision about whether we will trust someone, or not, through a single glance. Someone’s clothes, hair, tattoos, state of cleanliness, facial expression, accent and any other markers that we are able to perceive generate a remarkably comprehensive profile of someone in our own minds. A snapshot that may represent some truth or none at all. It doesn’t take very long either. I once spent a week trying not to stereotype. This involved not forming an opinion or a “feeling” about someone new when I met them for the first time. Do you think I could turn this heuristic off and resist judging at first glance? No, I couldn’t. Despite my best efforts, I received internal intel on every new person that I met, before I’d had a chance to make a conscious judgement about them. The best I could do was override my initial flash assessment of someone with much slower rational thought later on.
Behavioural economics explores the mental processes we rely on to make everyday decisions. Supported by advances in neuroscience and neural imaging, it focuses on the mental biases and shortcuts (such as stereotyping) that we generate as we process data. Add to this research in the psychology of motivation and information framing and we are starting to create a really useful operating manual for our own thinking machine. Imagine if we’d had these insights as children. If our parents could have taught us how prejudice is formed, why we use stereotypes without even thinking about it, why we believe what “they” say or make especially risky decisions as teenagers? Imagine if they could have explained to us, at 13 years old, how our emotions affected our thinking and decision making and even our ability to crack that difficult algebra problem? Wouldn’t our world be a better place?
There are dozens of thinking biases and shortcuts to explore but don’t worry, I’ll stick to the most common ones that your children will face as they build up their repository of decision outcomes. Before we start, please complete the phrases below.
Americans are _____________
Politicians are _____________
Chinese are _____________
Immigrants are _____________
Londoners are _____________
Muslims are _____________
Christians are _____________
One rainy Saturday afternoon a great commotion escaped our basement. I could hear guns, helicopters and bombs exploding as someone bellowed confidently, “Attack, attack, now’s your chance!” I tiptoed to the door and peeped in as Lego blocks went crashing. A gorgeous Arabian-looking Lego castle complete with turrets, flags and an “underground” water supply had sprung up in the middle of the room. Its mini-figure guards with turbans and rifles were being blown away by a whirring twin prop helicopter and men in blue jumpsuits with fake American accents.
“Gosh, honey. What are you playing? Cops and robbers?” I asked my son as another bad guy fell.
“No. Can’t you see, FBI and terrorists!”
“And which ones are the terrorists, sweetie?”
Why did I ask if I knew the answer already? Because I didn’t want to believe that, at six, my son was able to profile and discriminate against people (or in this case Lego mini-figures) based on stereotypes. I soon learnt that his terrorists were from Afghanistan, where they go to school to learn to make bombs. This was shortly after the Boston bombing and the two terrorists involved were thickly spread across the media. Sure, we spoke about it but I was very careful to try to avoid the usual stereotypes. In the world we live in, feeding him stereotypes will almost definitely do more harm than good later on in his life. In addition to the Boston bombing, we had discussed 9/11 a few times and I realised that his entire experience of terrorism hinged on two high profile events and his age-appropriate belief that people are either good or bad. He had anchored on the few facts that he knew and extrapolated them to form a mentally satisfying explanation of the bad guys of the 21st century. But I couldn’t for a minute blame this on the fact that he was only six years old. In fact, the anchoring bias is a mental shortcut, or heuristic, that adults are particularly good at. It allows us to focus on only a few salient bits of information and build a narrative around those. Why do we do it? Well, because it speeds up our ability to process a lot of information. And a little extra processing speed goes a long way with our rather pedestrian processor.
Now let’s get back to the words you wrote down above. Did you have a word for every one of them? A one-word answer for an entire cohort of people? Most of us do. Your answers will vary depending on who you are and where you come from. Do you see how many stereotypes you harbour? Every stereotype is an anchor that flashes through your mind when you encounter strangers. It’s our mental database dishing up lightning-quick preconceived information to feed into our decision process in the hope of speeding things up and avoiding imminent danger. It helps us form a quick conclusion (yes, a conclusion) about the stranger we are staring at on the train who fits one of these descriptions. Have you noticed how your friends who are British, Chinese or American don’t necessarily fit your stereotype of their group? If only stereotyping were harmless mental trickery. Unfortunately, religious extremism, bullying, discrimination and racism all spawn from these unexamined anchors.
OK, so you are trying to avoid stereotyping people in front of your children. When the cab driver from foreign country X runs the red light ahead of you, you try not to blame it on all people from X being idiots or bad drivers. That’s a great start. But only a start. Anchoring as a mental heuristic creeps into almost every situation where we have to weigh up information. What we already know about something is presented to us quickly and subconsciously and becomes our starting point, or anchor, for thinking about that information. We tend to adjust new information relative to our existing anchor.
When we first moved to Hong Kong we hired a cheerful property agent to help us find a suitable apartment. She knew our budget and that it wasn’t negotiable. She also knew that our knowledge of the HK property market was anorexic, to say the least. She showed us loads of apartments within our budget. They ranged from mouldy to stinky, tiny to dilapidated, and sometimes all of these at once. This was unexpected as we had thought our housing budget was reasonable. Clearly we were wrong and asked her to show us some nicer apartments as we were running out of time. Of course every nicer apartment was out of our budget. Did we pay up? Sure. Had we been played? Oh yes. After a year we discovered that there were wonderful apartments very much within our budget that she had chosen not to show us. She skilfully created false anchors for us to judge our spending power against.
The sooner our children are able to understand the effects of anchoring, the better. And not just because setting the opening anchor in any negotiations will frame the entire discussion but rather because anchoring leads to a far more insidious, hard-to-spot-when-you’re-older mental shortcut – confirmation bias.
In March 2011, I was teaching a semester of Critical Thinking at a business school in Singapore. On the 11th of that month, a magnitude-9 earthquake off the Pacific coast of Tōhoku in Japan caused a tsunami that devastated the Fukushima Daiichi nuclear power plant. This was the worst nuclear accident since Chernobyl. As more information became available about the genesis of the incident and the subsequent emergency operations to contain the radioactive fallout, my students and I were able to evaluate the mental mistakes that decision makers were making under conditions of enormous pressure and international media scrutiny. Sitting in our lecture theatre after a hot lunch and shooting mental arrows at bad decisions made by others turned out to be quite easy to do. Even deciding how their decision making could have been improved upon turned out to be easy peasy.
But critical thinking in real time is not so easy or even intuitive. So, come exam time, I turned the spotlight on my students. Their final group assignment question pack included articles with emotive photos and information about both the human tragedy and technical and regulatory failures of the nuclear disaster at Fukushima. The final question was this one about nuclear power:
Q. Considering the recent events in Japan, would it be reasonable to suggest that nuclear energy be phased out worldwide? Support your answer.
What do you think the majority of answers from these final-year MBA students were? None of us are nuclear specialists so I wasn’t looking for overly technical justification for their decision. Instead I was most interested in how they supported their conclusion. The vast majority of groups replied that yes, nuclear energy should be phased out, and they cited information related to the recent disaster almost exclusively. They chose to ignore the safety and efficiency of the majority of the world’s nuclear power plants and anchored instead on the outliers that had gone wrong. They also ignored influential differences in geographic location and government oversight that would affect the safety and stability of a nuclear power plant. Recency bias encouraged them to give almost 100% weight to the most recent event, with little or no weight afforded to decades of data on the efficacy of nuclear energy.
I repeated the case study and asked this same question of the following two postgrad and undergrad classes at the end of 2012 – a year and a half later. Do you think I received the same answer with the same justifications, despite using exactly the same material? Of course not. These students firmly believed that nuclear energy was here to stay and quoted far wider sources of information than the recent press.
What happened? In the aftermath of the disaster the pain, suffering and enormous damage to life and property were spread across the news in blow-by-blow, by-the-minute updates. It would be almost superhuman to be unaffected by the emotions and human tragedy unfolding across Japan – a country loved for its food, landscapes and gentle, quirky people. To suggest that nuclear power was the way of the future at this point in time would have been emotionally challenging. Eighteen months later I received very different conclusions for the same case study. This time answers reflected a good understanding of how human error had caused the devastation at Fukushima. It was also clear that nuclear power had swung back into favour. What a difference a few months make to decision making. It wasn’t just my students who were caught up in this trap of vilifying nuclear power in the aftermath of the tsunami. Politicians across the world were called on to justify nuclear power capacity and investment. In Switzerland and Germany plans to extend or expand nuclear power plants were called off, despite neither country sharing the geographical or regulatory conditions that affected Japan, or Fukushima specifically.
What the first group of students had fallen prey to was a bias that they knew very well in theory – confirmation bias. This sneaky mental shortcut grows like a weed from a mental anchor, which, in this case, was the human disaster of the nuclear crisis. Despite already agreeing that the cause of the nuclear failure was manmade and not a result of the type of power plant in question, they still decided that nuclear power was bad for humankind. They found plenty of evidence to confirm their belief and ignored a growing body of evidence that disagreed with them.
As their lecturer I think I failed here. What’s the point of only being able to avoid biases in academic exercises but not when the stakes are high? When it really counts? You’ll see confirmation bias creep in whenever you have a particular opinion on something and attach greater importance to information that agrees with your opinion and discount information or people that disagree with you.
What can be done about it and how can we teach this to our children in a meaningful way? Let’s cover a few more biases and then we’ll look at how to start introducing sound thinking into growing, but impressionable, minds. In the meantime, try to think of some ways you could guard against this in your own thinking.
__________
1 The somatic marker hypothesis (SMH) proposes a mechanism by which emotional processes can guide (or bias) behavior, particularly decision-making. Damasio, Antonio R. (2008) [1994]. Descartes’ Error: Emotion, Reason and the Human Brain. Random House.
Can you now identify how the confirmation bias influenced the FBI’s case against Mayfield?
Their initial suspicion was that Mayfield was guilty. They then proceeded to look for evidence of his guilt and had enough data to pick and choose the bits that corroborated their suspicions. Of course they also found evidence of his innocence but put less weight on these “facts”. What made this error particularly powerful was that it wasn’t only one or two individuals making thinking mistakes; it would seem that the entire team eventually bought into Mayfield’s “guilt”. This is striking but not surprising and illustrates a dynamic that our older children are particularly vulnerable to – groupthink.
Social psychologist Irving Janis first introduced the idea of groupthink. He used it to explain why otherwise rational people can make irrational or poor decisions when grouped together.
Groupthink occurs when a group makes faulty decisions because group pressures lead to a deterioration of mental efficiency, reality testing and moral judgement.
- Irving Janis
Sounds a bit harsh, doesn’t it? Team-based decision making is a bastion of business and a model that modern schooling encourages. But history and evidence bear out that we risk a change in our thinking patterns when we enter into the comfort of a group or the shadow of a cause. Groupthink is particularly virulent during adolescence (from 13 to 19 years old), a time during which friendship groups become particularly important. Its influence is so pervasive in our teenagers’ thinking and decision making that it warrants a thorough discussion.
Statistics tell the story of adolescents as a developmental stage characterised by poor decision making and increased risk taking. Adolescents and young adults are more likely than adults over 25 to binge drink, smoke cigarettes, have multiple casual sex partners, engage in violent and other criminal behaviour, and have fatal or serious automobile accidents, the majority caused by risky driving or driving under the influence of alcohol.1 Much of this behaviour has been blamed on some rather spurious notions that are not supported by research. This includes the widely held belief that our teens are somehow irrational or deficient in how they process information resulting in them not perceiving risks the way that adults do. Then there’s the idea that teenagers think they are invincible or invulnerable to injury or consequences. It comes as a surprise to many parents that a significant body of research shows that the logical reasoning and basic information-processing abilities of 16-year-olds are comparable to those of adults. I’ll repeat that as it may take some time to sink in: adolescents are no worse than adults at perceiving risk or estimating their vulnerability to it.2 In fact, under test conditions,3 researchers have found almost no age-related differences in individuals’ evaluations of the risks inherent in a wide range of dangerous behaviours. The same goes for evaluating the seriousness of the consequences that might result from taking such risks. Most surprisingly, teenagers have also been found to be equally capable of evaluating the relative costs and benefits of these activities.4 The conclusion then is that heightened risk taking in adolescence does not stem from ignorance, irrationality, delusions of invulnerability, or faulty calculations.5
Despite the evidence, the vast majority of educational interventions continue to aim to reduce risky behaviour through information and discussion about the dangers of unprotected sex, alcohol, drugs and risk taking. Unfortunately, adolescent fatalities don’t reflect these educational efforts and the tax dollars spent on them. Perhaps a better understanding of the cause of these two troublemakers (poor decision making and increased risk taking) will go a long way to making adolescence a safer but still fulfilling chapter in life?
So what drives your average level-headed teenager to make questionable decisions or engage in risky behaviour such as experimenting with drugs or other socially undesirable activities, and what do their peers have to do with it?
Brain imaging and other studies are continuously providing us with more sophisticated and granular explanations for behaviour that occurs during this complex and pivotal life stage. This is what we know so far:
Cognitive systems that allow for sound decision making, risk assessment and impulse control are housed in the frontmost part of the brain (the prefrontal cortex). The PFC develops linearly from birth through to early adulthood and is considered a top-down control system. However, if risk assessment and impulse control continue to develop slowly and steadily from childhood through adolescence, then teenagers should be more risk-averse than younger children. If you have a teen and a preteen in your family then you may be shaking your head here. In theory your teen should be way more sensible than their younger siblings, but something interesting shifts in how your sweet little one perceives the consequences of their risk taking as they mature.
There is evidence that children are more likely to anticipate negative consequences from risky behaviour. A teen, on the other hand, is more likely to associate risk taking with reward or other positive consequences.6 This is compounded by the non-linear development of another system that reacts to reward and to emotional and social stimuli, known as the limbic system. This bottom-up reactive system reaches maturity in middle adolescence (a few years before the control system does), allowing for a more exaggerated response to reward and emotion.7 Despite the fact that adolescents can reason as well as adults, in emotionally charged or risky situations the more “mature” reward-seeking limbic system will override that reasoning, creating an imbalance in how teenagers process and respond to information. This “risky” development period is shown below.8
Are you starting to see why interventions that increase awareness of risks and improve decision making at a cognitive level are often not as successful as they should be?
Hormonal changes at this time also conspire to increase sensitivity to both reward9 and addiction, further heightening the pleasure gained from taking risks. You’ll be pleased to know that the teenage top-down reasoning system begins to wrestle back control at around 15 years old (allowing for individual differences).
It’s only mildly comforting to know that these changes are not unique to humans but are seen across a range of non-human primates, rodents and even bird species. In all of these species, novelty seeking, hanging out with same-age peers and quarrelling with one’s parents all appear in different forms,10 leaving scientists to ponder an evolutionary explanation for them. After all, adolescence is a time of transition from the security of the home and mom’s cooking to that of independence, finding a mate and fending for oneself – all risky undertakings.
So until late adolescence the drive to take risks, for the average teenager, is stronger than the ability to control impulse. Interventions to reduce teenage risk taking that focus on enhancing cognitive control systems, and therefore sound decision making, may not be addressing the real cause of the behaviour at all. Helping teenagers understand their lack of impulse control in the heat of the moment and giving them tools to cope with this might be more successful.
What about the influence of peers? If you are the parent of a teen, you probably know that adolescent risk taking is far more likely to occur in groups. In fact, this is a hallmark of the teenage years, and the degree to which your teen’s friends drink alcohol excessively or use illicit drugs is one of the strongest predictors, if not the single strongest, of their own substance use.11 Even a supremely level-headed preteen could go on to make uncharacteristically bad decisions later on under peer pressure.
All risks are taken in the hope of some implicit or explicit reward, granted instantly or at some point in the future. Reward increases the activity of a feel-good chemical called dopamine in our brain, as do addictive drugs. The adolescent brain experiences a far bigger shot of this fun stuff than the rest of us do when experiencing reward, leading researchers to believe that this is a further driver of their increased appetite for risk. But there’s more. Taking risks in the presence of peers (regardless of the possible outcome) lights up the same feel-good circuitry that is activated by exposure to reward, making potentially rewarding – and potentially risky – activities even more rewarding. This phenomenon has not been seen in the brains of adults or younger children – only adolescents. It seems then that in adolescence, more might be merrier – and riskier.12
This imbalance between a well-developed risk-seeking drive and less developed impulse control seems to be biologically driven and unlikely to be pacified through educational interventions designed to change what adolescents know and how they think. Interventions that accept that adolescents are inherently more likely than adults to take risks, and that focus on reducing the harm associated with risk-taking behaviour, are far more likely to succeed.13
We can’t fight biology nor wish away the intensity of the teenage years so let’s rather explore the mental dexterity behind our child’s risk assessments and decision making and see how to further improve the quality of that thinking as we explore workarounds to biases and the essential habits of good decision makers.
I should point out again that the degree to which individuals translate sensation seeking into risky behaviour varies dramatically14 and depends on factors such as maturity level (early maturers are at greater risk), available opportunities to take risks (lack of adult supervision, availability of alcohol, drugs and car keys, etc.) and a child’s general temperament towards risk taking. Shy, anxious children are less likely to take risks but could also find it hard to say no.
From research on automobile accidents we know that the presence of same-age passengers in a car driven by an adolescent driver significantly increases the risk of a serious accident. We’ve also learnt that adolescents are more likely to be sexually active not only when their peers are, but also when they merely believe that their friends are, whether or not that belief is accurate. Also, statistics from the FBI show that adolescents are far more likely than adults to commit crimes in groups rather than by themselves.
However, let’s not forget that the relatively greater prevalence of group risk taking observed among adolescents may stem from the fact that adolescents simply spend more time in peer groups than adults do.15
• Steinberg, L. (2004). Risk Taking in Adolescence: What Changes, and Why? Annals of the New York Academy of Sciences, 1021: 51–58.
• Gardner, M. & Steinberg, L. (2005). Peer Influence on Risk Taking, Risk Preference, and Risky Decision Making in Adolescence and Adulthood: An Experimental Study. Developmental Psychology, 41(4).
__________
1 Steinberg, L, 2008. A Social Neuroscience Perspective on Adolescent Risk-Taking, Department of Psychology, Temple University. NIH Public Access
2 Millstein SG, Halpern-Felsher BL. Perceptions of risk and vulnerability. J Adolesc Health. 2002 Jul;31
3 Test conditions are staged with test subjects usually knowing that the outcome of their performance in experiments will not have lasting implications for their future.
4 Beyth-Marom R, Austin L, Fischhoff B, Palmgren C, Jacobs-Quadrel M. Perceived consequences of risky behaviors: Adults and adolescents. Developmental Psychology, 1993
5 Reyna VF, Farley F. Risk and rationality in adolescent decision-making: Implications for theory, practice, and public policy. In Psychological Science in the Public Interest. 2006;
6 Casey BJ, Galvan A, Hare TA. Changes in cerebral functional organization during cognitive development. Curr Opin Neurobiol. 2005a;15(2):239–244.
7 Casey, B.J., Rebecca M. Jones, and Todd A. Hare. “The Adolescent Brain.” Annals of the New York Academy of Sciences 1124 (2008): 111–126. PMC. Web. 11 Nov. 2015.
8 Casey, 156.
9 Galvan A, Hare T, Voss H, Glover G, Casey BJ. Risk-taking and the adolescent brain: who is at risk? Dev Sci. 2007;10(2):F8–F14.
10 Spear LP. The adolescent brain and age-related behavioral manifestations. Neurosci Biobehav Rev. 2000;24(4):417–463.
11 Chassin L, Hussong A, Barrera M, Jr, Molina B, Trim R, Ritter J. Adolescent substance use. In: Lerner R, Steinberg L, editors. Handbook of adolescent psychology. 2. New York: Wiley; 2004.
12 Steinberg L. Risk taking in adolescence: what changes, and why? Ann N Y Acad Sci. 2004;1021:51–58
13 Steinberg, Laurence. “A Social Neuroscience Perspective on Adolescent Risk-Taking.” Developmental Review 28.1 (2008): 78–106. PMC. Web. 13 Nov. 2015.
14 Kagan J, Snidman N, Kahn V & Towsley S. The preservation of two infant temperaments into adolescence: Monographs of the Society for Research in Child Development Volume 72, Issue 2, page vii, July 2007
15 A Social Neuroscience Perspective on Adolescent Risk-Taking by Laurence Steinberg, Department of Psychology, Temple University. Dev Rev. 03/2008
In 1917 Einstein discovered a glitch in his new general theory of relativity. His equations suggested that the universe could not be in a stable state but was either expanding or contracting. Except that was impossible because the universe was static and stable. He quickly developed a workaround to satisfy this fact and return order to the galaxy. At first he called it the “cosmological constant” – a constant introduced into his theory to “hold back the effects of gravity” and maintain a static universe.
In 1927 a Belgian priest and astronomer, Georges Lemaître, cornered Einstein at a conference and introduced a theory that he had been working on, based partly on Einstein’s own calculations. Einstein dismissed him out of hand, commenting that his calculations might be correct but his physics was atrocious! Lemaître didn’t give up and continued to develop his idea that the universe was indeed expanding and that the cosmological constant was superfluous. It would take decades for Lemaître’s theory to be recognised as the dominant theory underlying our understanding of our universe – the Big Bang theory. Today most scientists accept that the universe originated from an incredibly dense, incredibly hot single point in time and space and has been expanding ever since. Expanding into what is the next puzzle to solve. In case you were wondering, Einstein eventually warmed to the idea and later called the cosmological constant the biggest blunder of his career. Even he didn’t know what he didn’t know.
Imagine a colour you’ve never seen before. Anything? That’s tricky, isn’t it? Our awareness is often limited by what we either have been exposed to or can mentally assemble based on what we already know and, particularly, have language for. If Henry Ford had asked customers what they wanted, they would have said a faster horse. That line is nothing more than an Internet meme, but it certainly rings true today. If someone is not aware of the limitations of their mind, their biases, mental maps and predispositions, would they ever think of improving their thinking? Fortunately for all of mankind, there are enough people like you and me who are curious and open to exploring what we don’t know.
Imagine if the chaps at the FBI could have said to each other, “Hang on, does our verdict suffer from confirmation bias? Or groupthink? Or anchoring?” Or even, “Are we making thinking mistakes? Does our frame/prior/situation/environment prejudice our conclusions?” It would take just one person with the guts not only to question but to prove to the rest of his or her team that their thinking needs rethinking.
With your current understanding of anchoring and confirmation bias, how would you raise your children’s awareness of these thinking mistakes? Offering your child the textbook definition of any of them simply won’t work. This doesn’t even work with adults. Trust me, I’ve tried it. I used to offer thorough definitions and explanations of mental biases in workshops and coaching sessions and expect my clients and students to be able to identify these thinking mistakes in their own thinking. What happened instead is that they became very good at picking out these thinking mistakes in theory and in other people’s thinking. Of course knowing when others are making thinking mistakes is a valuable skill but it’s a bit like sautéing top quality ingredients in a filthy frying pan.
When you hear your children anchoring on one trait or piece of information and using only that to inform their opinion, ask them whether they might be anchoring on it. Then go on to ask them if they can find information that disagrees with those anchors. Here’s an example.
“Dad, Sarah gets more pocket money than I do. Her parents love her more than you love me.”
This one is easy to defuse, isn’t it? What is the anchor? Correct: pocket money is viewed as the only indicator of love. And the question to ask is: are there any other ways in which we show you that we love you?
In older children it gets trickier because we may be able to identify when they are anchoring on information but might not be able to find reasonable alternative anchors on the spur of the moment. That’s OK; before they even reach puberty they already know that we don’t have all the answers. Despite our lack of facts on tap, we can still offer them a framework for thinking about their world through slightly clearer lenses: encourage them to consistently ask what anchor is being used and how the issue would change if alternative anchors, or none at all, were used. If Einstein could introduce a groundbreaking new theory anchored on the current and limited knowledge of the universe, then we can all be forgiven for making the same mistake in our thinking, sometimes.
Here’s an interesting example of an anchoring effect that has been found in teens. We already know that peers are a major source of influence on adolescent substance use1 and other antisocial behaviour, at times even overriding genetic predispositions2 to such behaviour. It’s no surprise then that what a teenager merely thinks their friends are doing (whether accurate or not) significantly affects their own actions. If a young teen thinks that his friends are already sexually active, he may feel inadequate if he isn’t already doing the same or justified if he already is. These thoughts become an anchor for his behaviour.
But the mind is a far more byzantine and interesting thing. What researchers have found instead is that a teen’s own behaviour becomes the anchor for what they think their peers are up to. To quote researchers directly, “adolescents’ reports of their friends’ substance use are biased in the direction of their own use. Substance users consistently exhibit a liberal bias, assuming that their friends also use substances. In parallel fashion, non-substance users consistently assume their friends are non-users, exhibiting a significant conservative bias.”3 We call this the false consensus effect, where we, often incorrectly, assume that others behave the way that we do, perhaps for the same reasons that we do.
Apart from helping your teens to spot personal anchors, you can help them (and yourself) in commercial ways too. Advertising is all about using our thinking against us. Discounting the latest pair of Sic fashion Phenom jeans from $110 to $80 looks way more attractive than simply asking $80 for them, especially when the size of the discount, not the actual price, is our anchor. Receiving a discount triggers the reward centre of the brain and when you’re a teen this is a really strong call to action. 3-for-2 special bargains are seldom real deals, but again that saving is processed as a reward, and losing it by passing up the “bargain” also feeds into our natural tendency to strongly prefer avoiding losses to making gains. (This is known as loss aversion – another bias.)
It’s not only advertisers that use anchoring to weigh us down; we can use it quite successfully too. If you start out giving your teen $50 pocket money a month, chances are they’ll grumble that it’s not enough. Rather, start out offering only $35. This then becomes the anchor around which negotiations happen, not what their peers are getting. Any increase towards $50 is an improvement for them, a reward. The same goes for curfews, screen time, etc. Getting an allowance increased is a psychological hole-in-one for your teen, even if the game was rigged to begin with. If a weekday curfew is 10pm, then a weekend curfew of 11pm doesn’t feel particularly generous. But a 9pm weekday curfew and a midnight weekend curfew sound far more reasonable to teenage ears, even though the hours away from the house would actually be fewer in a week.
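If the curfew sums don’t feel obvious, here’s a quick check, sketched in Python with the curfew times from the example above (five school nights and two weekend nights are my assumption):

    # Scheme A: 10pm on five weekdays, 11pm on two weekend nights
    # Scheme B:  9pm on five weekdays, midnight on two weekend nights
    scheme_a = 5 * 22 + 2 * 23   # 156 "curfew hours" in the week
    scheme_b = 5 * 21 + 2 * 24   # 153 "curfew hours" in the week
    print(scheme_b - scheme_a)   # -3: the scheme that sounds more generous allows three fewer evening hours

The later weekend curfew buys back two hours, but the earlier weekday curfew quietly takes five away.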
Like anchoring, confirmation bias is easy to spot in someone else’s thinking but trickier in your own. Think about this question: if two deeply religious people, from different faiths, were arguing the merits of their own religion, could either win the debate? A deeply religious person would, no doubt, know a great deal about her religion and harbour more facts that support her religious ideals than facts that refute them. So the more you argue with her, the deeper she is forced to dig into her collection of “facts” about her belief, or details that discredit other faiths, in order to support her own argument. The more you argue with her, the more of an “expert” she becomes. We call this attitude polarisation, which is especially tricky to deal with if the person involved is also actively closed-minded, as often happens in religious extremism. Confirmation bias allows her to pick facts that support her argument and feel OK about it. On the other hand, a critical thinker not only listens to information that contradicts their own ideas but actually seeks it out.
This is where the parent truly becomes the teacher. Teachers in school or college work with pretty clear and dry facts on well-understood issues but you, mom or dad, get to deal with all the grey areas that have no real answers. You get to show your children not only how to think between the lines, but also give them the confidence to do so. Most of this will happen through how we think and act on information, not through what we teach them.
Do we allow ourselves to be openly questioned or do we see it as a sign of weakness? For example, if there is another school shooting in America do you immediately take “sides” based on whether you own a gun or not? Or are you able to consider the issue and discuss it with your children as more than a pro-gun/anti-gun issue? Do you support Brexit (Britain’s exit from the European Union) to free your country from foreigners, or have you carefully considered the financial implications? Do you dominate a discussion with your views or do you curate balanced discussions with give and take? As I wrap up this chapter, the world has just witnessed a spate of deadly terror attacks across Paris, Tunisia, Egypt, Beirut and Turkey.
“Why did they do it, Mom? Why are there terrorists?” This is the question that my son and I wrestled with all weekend in the eerie glow of the BBC’s coverage of the Paris tragedy. My initial reaction was to quote the news of the day, and Mr John Kerry in particular, with: “They are psychopathic monsters.” That about covers it, doesn’t it? They’re the bad guys and there’s plenty of evidence to confirm that they are suffering from a chronic mental disorder with violent (psychopathic) tendencies right there on the headline news.
I’m not a scholar of terrorism, politics or religion, but if that truly was the end of my explanation, I would be failing in my efforts to raise a child who is capable of reasoning with a curious and critical mind. A mind open to the possibility that we are not helpless in the fight against terrorism, nor indeed against the bully next door. So how do you explore the inexplicable with impressionable minds while avoiding the minefield of mental mistakes inherent in an emotionally charged topic? No matter how true it is, telling your children that terrorists are “psychopathic monsters” will engender even more fear and more helplessness against the “bad guys”. Showing our children instead that the problem has a cause, and therefore a solution, will do the opposite.
To avoid confirmation bias in any topic, a critical thinker would gather evidence to prove and disprove their theory before deciding. A little bit of online research reveals very clear origins of the ISIS (Islamic State) movement, their beliefs and theories. Their propaganda reveals a very clear path of action that they are duty bound to follow. As with the bully next door, the fact that they have a reason and a plan removes the mystery and fear that clouds this organisation of mass murderers. We can only change what we understand.
Discussing terrorism with your child is an essential part of parenting today, but way trickier than the S.E.X. talk. Unlike the latter, most parents will not go further than sharing their own personal view on terrorists with their children – a view shaped by their own ethnicity, political and religious beliefs and tolerance towards different ethnic groups. It’s true that the facts of the matter are tricky to get to and most of us simply don’t want to spend time trying to understand a criminal’s motives, especially a terrorist’s. In the hope of trying to ease the burden of the “terrorism talk” on those parents who want to bring more than their own views to the discussion, I researched and wrote a piece for The Huffington Post that you’ll find at the end of this chapter.
__________
1 Kobus K. Peers and adolescent smoking. Addiction. 2003;98(Suppl 1):37–55.
2 Guo G, Elder GH, Cai T, Hamilton N. Gene–environment interactions: Peers’ alcohol use moderates genetic contribution to adolescent drinking behavior. Social Science Research. 2009;39(1):213–224
3 Henry DB, Kobus K, Schoeny ME. Accuracy and Bias in Adolescents’ Perceptions of Friends’ Substance Use. Psychology of addictive behaviors : journal of the Society of Psychologists in Addictive Behaviors. 2011
My critical thinking programmes centre around exploring and understanding our individual tendencies towards thinking mistakes. If we don’t take things too personally and are willing to explore and learn about our own failings, then bias bashing can be incredibly insightful and even loads of fun. In the adults I work with, we usually discover a lifetime of bad decisions and mental mistakes that give us plenty of material to work with, but our children generally haven’t had many such experiences yet. If we can’t start with the past then where do we start with our younger ones?
Regardless of age, good decision making will always begin with information. When you’re a toddler this may be inscrutable information of the emotional kind but information nevertheless. A good place to start with your child from about six years old is to slowly guide them through the seven habits of good decision makers because developing good information processing habits from an early age is as important as brushing their teeth twice daily. A little later on, from about 10 years old, begin to explore some basic mental biases as opportunities arise.
When Suzie rushes home to tell you that her new best friend’s uncle is the Queen’s (Elizabeth II) brother – help her evaluate this piece of information with a few simple questions about the source.
• How do you know this?
James, my new best friend, told me.
• How does James know this?
I don’t know.
• Is it possible that James just wants to impress you or feel important?
Maybe.
• Does the Queen have a brother?
Let’s Google that.
Of course it’s much quicker just to tell Suzie that James is lying because the Queen only has a sister but then she would never learn to evaluate information herself. So rather get her thinking about the questions to ask when confronted with such juicy news.
In just a few years Suzie will have access to the Internet: an endless and increasingly primary font of knowledge that’s easy, convenient and omnipresent. Google is the McDonald’s of information – serving up super-sized helpings of data that have been processed and flavoured by those who have gathered and interpreted it for us. We all choose the quality of the information we consume in much the same way we choose between fried chicken and chips or a veg wrap for lunch on a Tuesday. If we base our thinking, and hence decisions, on quick-to-access and widely available information (accepted without verification of its underlying data), then our decisions will disappoint on average. Quality information takes time and effort to gather, just like a healthy, well-balanced meal – there is no quick way around it.
Probing with questions to develop a healthy scepticism from the youngest possible age will expand your child’s thinking and lead to healthier information choices. Encourage your middle school or older child to ask, “How do you know that?” when given information they can’t personally verify, then to decide whether the source is credible and to think about the motivation behind the information. Verifying information using an independent source is a valuable skill. An independent source is not Facebook, Wikipedia or a friend who “just knows that kind of thing”. To find the primary source, one would need to follow the references listed at the bottom of the Wikipedia article back to the original researchers, for example.
If we were to run history lessons through this simple checklist, we might discover several alternative stories and views on the “facts”. Of course that would take a lifetime, so let’s focus on the present and future and help our children test conclusions, verify interpretations and frames and go to the source of data whenever they can. If you want to raise your children as thinkers, “Because I told you so” has to leave your lexicon forever.
It was Socrates who first proposed that all information occurs within points of view and frames of reference and that all reasoning proceeds from some goal or objective. In 399BC the great man was executed for corrupting the youth of Athens through questioning that which was considered above question. Today we know and understand the fundamental truth in this reasoning and how it separates good decision makers from the rest. Without fail, every piece of information that is presented to you or your child is done so through someone else’s frame. What is a frame?
Let me show you. Below is a list of numbers between 1 and 10. There is a sequence hidden in these numbers. Those of you who are good with numbers will have no problem finding this sequence. If you’re not so good with numbers, give it a try anyway.
5 - 4 - 9 - 1 - 7 - 10 - 3 - 2
The numbers 6 and 8 have been omitted. Can you see where they belong and what the sequence is?
Got it yet? Most people find this sequence tricky or just plain impossible. I’ll give you a clue – it is a common sequence that most of us work with every day. Does that help?
If you are ready for the answer, read on. This problem is presented numerically, which makes you think in numbers. In fact, most people only think in numbers here and calculators and smartphones are usually whipped out to do some interesting, but ultimately pointless, calculations. This is reinforced by my mentioning that those good with numbers should find it a breeze. Very sneaky of me. In fact, these numbers are in alphabetical order, which is hard to see if you are in a numeric frame of mind. Here they are presented differently. Of course you see it now!
five - four - nine - one - seven - ten - three - two
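If you’d rather not take my word for it, a couple of lines of Python (a minimal sketch) will confirm the ordering and show where the missing six and eight slot in:

    words = ["five", "four", "nine", "one", "seven", "ten", "three", "two"]
    print(words == sorted(words))            # True: the "hidden sequence" is simply alphabetical order
    print(sorted(words + ["six", "eight"]))  # eight slides in first, six between seven and ten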
How a problem is presented, or framed, can influence how we process the information as much as, or more than, the facts of the matter.
It’s late summer in our new neighbourhood in London and a group of six boys play in the street every afternoon. My son has joined this little collective and discovered the added benefit of an ice-cream van that visits every weekday at around 4:45pm. At two pounds a cone, my son spent his monthly allowance on ice-cream in the first week. Of course as soon as the money ran out I was petitioned for more: “But mom, this man is so kind – he comes all the way to our street just to check if we want ice cream. He says he doesn’t go anywhere else in our neighbourhood, he comes here every day just for us. We really should support him. If we don’t, he’ll stop coming.”
Needless to say we had a little chat about this man’s motives and why he might just possibly be coming to our street on his way home because it was guaranteed business. At nine, my son is beginning to understand that what people say and do is always framed by what they want out of a situation, consciously or unconsciously, and driven by motives that may or may not be honourable. I could leave this lesson up to the school of life, but why risk that? In a few years’ time, as a teenager, he will evaluate others’ frames and motives without my guidance and use this to inform decisions with further-reaching consequences.
Over a few farewell cocktails in Singapore, a friend and well-respected investor mentioned that “everyone is selling China”. Everybody? Selling? If everybody is selling, then who is buying? Stock markets are driven by an asymmetry of beliefs – where someone wants to sell a position and someone else wants to own that same position – so they trade. Short of a stock market crash or correction, the price they settle on will reflect, amongst other things, whether there are more buyers than sellers. My friend’s statement could not possibly have been a fact or a judgement based on fact but an opinion extrapolated from a trend or observation. It’s fine to believe in our own opinion (self-deception is one of the oldest survival techniques and a fascinating topic of decision science) but let’s be very careful when making important decisions using opinion as our raw data and not the facts that those opinions interpret or obscure.
Kids love hyperbolic opinion, too. It’s not uncommon to hear things like “everyone” is picking on me or that “nobody” likes me or that “everyone” thought it was a good idea to climb the dead tree before it collapsed. It only takes a few guiding questions to get a child’s (and an adult’s) thinking back on track, such as: Really? Everyone? Was there not one single person who wasn’t picking on you? Was Tayla picking on you? Was Lohini? She would then have to admit that not everyone was part of the pack, which can usually be narrowed down to one or two kids who weren’t being particularly nice or just one ringleader who thought climbing the crumbling tree was a great idea.
Smiggle is an Australian retail chain that sells toys masquerading as useful pieces of stationery. (Their fashion-forward items have grown a large, almost cult following amongst infant and middle schoolers in Asia.) I was in our local store when a small boy and his mom walked in to clear his Spiderman wallet of pocket money. After about 10 minutes it became clear that, instead of having a fun shopping experience, this little guy of about five was on the verge of tears, shuffling back and forth frantically in front of a wall of bright playthings for his pencil case. His mother, frustrated with his indecision, threatened to pull him out of the store if he didn’t pick something, anything, immediately. At this he burst into tears and thrust his wallet into a shelf of animal-shaped erasers, having lost all interest in that “something special” he had set out to buy. All because he didn’t know how to make up his mind.
I wondered if his mom could have helped him decide how to decide instead of making it harder for him.
Nothing is more difficult, and therefore more precious, than to be able to decide.
~ Napoleon Bonaparte
Do you know what a metadecision is? No? Chances are your child won’t either.
It’s the simple act of deciding how you will decide before you jump in and gather information, make a decision or solve a problem. It begins by checking that you are, in fact, solving the right problem, then asks you to decide how you will solve it – with what tools, time, information and resources, and against what criteria. It sounds like a mini project plan because it is. The metadecision forms the very first step in a good decision process – it helps you to anticipate challenges, use the best tools and get all your team members (if any) on the same page. All this speeds up the decision process.
Einstein is widely quoted as having said, “If I had only one hour to save the world, I would spend fifty-five minutes defining the problem and five minutes finding the solution.” As much as I would have liked the great man himself to have said this, it appears to be a misquote based on a collection of articles published in 1966 that included a comment attributed to the then head of the Engineering Department at Yale1 as follows: “If I had only one hour to solve a problem, I would spend up to two-thirds of that hour in attempting to define what the problem is.” I imagine the original is so widely misquoted because (a) it’s easier to misquote than to research the actual source, (b) it sounds smarter when it comes from Einstein and (c) no one really bothers to check quotes – especially when they’re pasted across an iconic picture of the smiling wild-haired professor. It’s also so widely quoted because every good decision maker or problem solver knows how important it is to spend more time thinking about the problem than the solution.
In one corner of the decision-making ring we have those, like the little chap at Smiggle, who know their options but can’t decide between them, usually because they lack a framework to help them decide or criteria against which to judge their choices. In the other corner we have those who dive in to solve the problem or make a decision before thoroughly exploring their options or understanding the issue itself. These characteristics are deeply rooted in an individual’s tolerance for risk and uncertainty.
You’ve seen adults and children alike jump right in to solve a pressing task without first thinking about how they are going to go about solving it, or even whether they should solve it at all. When I ask groups of well-seasoned, senior executives to build a device that can safely land a raw egg on the conference room floor from a height of two metres, 90% of them assume they have to build some kind of parachute to do so – an egg shuttle.
Surprisingly, most groups simply jump in and experiment with the materials I’ve supplied. Planning or prototyping is usually not considered, nor is looking for ways in which this has been done before without needing a cleaning crew in the conference room. Almost no one asks if they have to use only the materials supplied, and no one has ever attempted to solve a different problem, such as making the landing surface soft enough to support a falling egg. The latter is a far easier problem to solve: turn a standard conference room chair upside down and make a “trampoline” with the balloons supplied (I supply a goody bag of materials), or tie a T-shirt across the four legs of the chair and voilà, your egg will be caught and supported if enough flex is allowed. In six years of conducting this experiment with participants around the world, this particular solution has never been presented, despite being the first YouTube video that pops up if you google “how to drop an egg without breaking it”. But that’s not what surprises me most. It’s rather that executives, who are by and large professional decision makers, don’t have a framework for thinking about how they will make a decision or a strategy to guide them. Like our children, they simply jump in and get busy finding a solution.
In their book, Winning Decisions,2 authors Russo and Schoemaker chide amateur decision makers for spending most of their problem-solving time (75%) on gathering information and coming to conclusions, at the expense of understanding frames, thinking about how they will decide and learning from experience – both their own and others’. They also note that a carefully constructed metadecision can save time and money.
If you have children who bolt after every good idea, or who jump straight in to solving problems without pausing to think of the best way to do it or even whether it should be done at all, it will help them tremendously if you share the idea of deciding how to decide first, in a few child-friendly steps. If your child is the opposite and can’t seem to decide or get started on a project, then this is a useful structure for them, too:
• What problem am I actually trying to solve? Is it whether the egg must land on the ground without cracking or simply survive a two-metre fall? Is it to buy a toy that I can play with alone, that will last more than a few months or that my friends will think is cool?
I’m carpooling at the moment and “what I want to be when I grow up” was a topic of discussion in the car yesterday.
Friend 1 (10 years old) said: I want to be a particle physicist.
Friend 2 (9 years old) said: I want to run an engineering business.
Of course I had to jump in with a few questions.
It turns out that Friend 1 wants to be a particle physicist because science is fun and he wants to have fun when he grows up. He likes science and knows all about atoms. Friend 2 doesn’t know what kind of engineer he wants to be but he knows that engineers design and build things and that’s what he wants to do (possibly because he’s a Lego junkie) as well as being the boss. We’re a long way from wanting to be pilots and postmen but I think that Friend 1 is solving the wrong problem. The question he should be asking is what can I do that is scientific and fun? I suggested being a mythbuster. He wasn’t sure if his dad would be OK with that.
• What resources do I have to solve this problem (tools, other people, information)?
• How much time can I spend working on this problem?
• How will I know when I’ve decided or solved the problem?
• Have I, or anyone I know, had a similar problem before? How did I (or they) solve it, and what did I (or they) learn from it?
So our little chap at Smiggle is still standing there in a puddle of his hard-earned coins fending off his mom’s frustration. How can she help him?
Clearly he wants to spend his money, so she could begin by helping him decide whether he wants something to play with or something to colour with. Then she could help him count his coins and understand which items fall within his budget. What about the colour? If pink and purple are out then he’s left with blue, green and orange, and if green is his favourite colour then he’s down to a handful of objects to choose from. Now comes the crunch. She could ask about the last toy he bought from Smiggle and how long it lasted, or how much he played with it, and steer him towards more robust choices.
Putting a child under pressure to make a quick decision generally promotes poor decision making and misses a fantastic learning opportunity. These moments of frustration are pure gold and it’s up to us to invest them.
Ahmed Mohamed3 was the new kid at school in Irving, Texas. A little geeky, a little shy, he wondered how to fit in and make friends – an important issue when you’re 14. At his old school he was known as the kid who made interesting stuff from old stuff, a cool kid. He tried the same tactic at his new school and built a clock from scratch. Not a pretty clock, but one that he hoped would impress his teacher. Instead of receiving her praise and a little extra credit for science, Ahmed found himself suspended from school later that day, just before being arrested, cuffed and taken downtown to the police station. I’m sure you can guess why. As it turns out, his clock wasn’t a bomb, his school wasn’t under threat and his teacher had it wrong. Like the Mayfield case, this says a lot about modern life and the unprecedented issues and fears that we face as civilians. It also says a lot about the environment within which our children make decisions today.
As our reasons to be fearful multiply, we seem to be reaching a tipping point where fear outweighs our tolerance for risk. Our children are growing up as the most protected generation, with more protective legislation, rules, schools and playgrounds than ever before. Being more sheltered from risk means that they are also less exposed to the idea of risk taking. You may find that you tell your children when something is dangerous and why it’s dangerous rather than letting them even think about the inherent danger or adverse consequences by themselves. Don’t worry, I’m not going to suggest that raising free-range kids is the way to go, but in order to raise thinkers, understanding risk and how to think and talk about it is very much a part of the process.
In the corporate environment we like to leave all things to do with risk up to the risk manager and his or her flock of bespectacled actuaries and number-hugging PhDs. Unfortunately, this behaviour is in itself risky business. If decision makers don’t have a strategy, beyond models and numbers, to grapple with unprecedented risks and imagine the unimaginable, then unimaginable things will continue to blindside us. The subprime crisis that swept global markets in 2007/2008 has largely been labelled a crisis of imagination, in which politicians and governments alike failed to imagine that such an outcome was possible – and later admitted as much. I try not to use the word “imagine” with my corporate clients but no other word seems to fit the bill as snugly. Their risk processes failed to flag the risk of global systemic failure or the possibility of a bank run because the risk systems used had been programmed by minds that couldn’t, or wouldn’t, imagine such an extreme financial event.
Problems with individual financial sectors were identified, but a global failure of imagination meant no one anticipated this crisis. No one stopped to think “what if”.
~ Michael Coogan, Director General of the Council of Mortgage Lenders
All decisions involve risk – the more important the decision, the larger the risk, but also the reward for getting it right. Risk assessment and management tools are only as useful as the skills of those who programme and interpret them. Housing bubbles from ill-thought-out economic policy, stock market crashes, bank runs and corporate failures are part and parcel of our complex and risky political, financial and social environment. The risks that drive extreme events are usually the ones that no one paid attention to or could have foreseen when making the decision or setting policy. Whilst it’s very hard to know what you don’t know, the ability to imagine alternative futures is becoming more important around the boardroom table.
Despite being better educated about the risks our children face, we might not have a better strategy to help them factor risk into their decision making. One strategy that is gaining traction in boardrooms and on management training programmes around the world is also something that already comes naturally to our children: telling stories or creating narratives.
History has shown us that the scenarios which seem most unlikely at the outset are the ones that do the most damage in the end. In trying to understand the risks inherent in their decisions or actions, encourage your children to create narratives of future scenarios, from the most likely to the most “impossible”. They can even create stories with real or imagined characters to populate their scenarios.
Ahmed Mohamed is well aware of the fact that he is a Muslim living in Texas; even at 14, he’s sadly already had his fair share of prejudice because of this reality. There are many different things he could have made to take to school. If he had understood the metadecision process, he could have begun by asking himself what problem he was trying to solve: trying to impress his teacher, trying to impress his schoolmates, trying to get noticed or, more sinisterly, trying to attract press coverage or test the school’s tolerance and tendency to racially profile? Whichever of these he chose would call for a very different invention. A ticking clock in a box that was not invented from scratch could easily be slotted into any one of these narratives. The most extreme ending to such a story would be what actually happened in reality, or even worse. His parents could have helped him choose an invention that would have achieved his aim without him having to live through such a traumatic experience – unless, of course, the risk was worth the payoff, whether intended or not.
Our children will face risky decisions of their own: whether to accept a dare, to befriend strangers on Facebook or other sites, to sneak out and go to parties without telling us or, more simply, to spend all their pocket money every month without ever saving a dime. Far from wanting them to be afraid of life, we want them to understand life better, along with the full range of options and consequences available to them.
Did you know that the hormones that make you feel sad also promote thinking, and that feel-good hormones increase your appetite for risk in much the same way that being angry does? Emotions result from a cocktail of hormones, or biological chemicals, generated in response to information we receive (and interpret) through our five senses. We can’t remove the effect of emotions on our thinking but we can identify them and ensure that, when making important decisions, we control for their effects on our mental state – whether that be tiredness, frustration, disappointment, confidence after a success or irritation at our friends, teacher or boss. Every one of these affects how we process and frame information. This is such an important aspect of raising thinkers that we’ll devote a chapter to looking at how emotions affect thinking and decision making. We’ll also look at how we can help our children become aware of not only their own emotional state but also the influence that it has on their actions.
A fundamental premise of decision science is that good decisions are never random inspirations hastened by a moment of genius or lucidity, but rather the result of a process used deliberately or unconsciously – a personal process informed by a lifetime of decision failures and successes, plus a structure to hang these on.
Do you have a decision-making process that allows you to reflect on and refine your approach to problem solving? I won’t dictate a decision process, as it’s as personal as your belief system, but sound, repeatable processes usually make space for:
• A metadecision
• An understanding of how information is framed
• Checking for motives, mental mistakes and biases in all stakeholders
• Counteracting the effect of strong emotions
• Thorough scenario analysis or reality testing
If your child is still too young to embrace a process for thinking about his or her thinking, then simply encourage them to STOP and THINK – if we can learn simple slogans such as “Reduce, Reuse, Recycle” or “Stranger Danger” or even “Slip, Slop, Slap on Sunscreen”, then STOP AND THINK is just as easy to retain.
__________
1 Markle, W. H. (1966) ‘The Manufacturing Manager’s Skills’ in Finley, R. E. and Ziobro, H. R. (eds) The Manufacturing Man and His Job. New York: American Management Association, pp. 15–18 (quote on p. 18).
2 Russo, J. E. and Schoemaker, P. J. H. (2001) Winning Decisions: Getting It Right the First Time. 1st edn. New York: Bantam Doubleday Dell Publishing Group.
3 The Guardian, 17 September 2015: http://www.theguardian.com/us-news/2015/sep/17/ahmed-mohamed-is-tired-excited-to-meet-obama-and-wants-his-clock-back
A few weeks ago, I spoke at a philosophy club at a school in Surrey, UK, for students aged nine and up. I planned to share a little about Socrates (he’s a real crowd pleaser) and Socratic questioning. Given that this was a “premium” private school with a reputation for being non-mainstream and much more focused on knowledge than testing, I had high hopes that what I was going to share on Socratic questioning would already be old news to them.
Unfortunately, I was again surprised by the students’ lack of a critical thinking framework; in fact they lacked any framework for thinking about information. The idea that they could question a question was as foreign to them as poverty. Every question I posed was quickly wrapped in answers and opinions. These children were quick on the draw – knowing only that answers must follow questions. But what if a question followed a question?
2,500 years after he lived, Socrates’ ideas still push us forward and remain fundamental to our explorations of science, technology, law and pretty much every other field of human advancement. If you don’t know it already, I’d like to introduce you to his method of enquiry, the first of its kind ever recorded (not by him but by his student Plato and others) and one that we still use today as a warm welcome to critical thinking for people of all ages. As a systematic method of enquiry, it aims to separate belief from truth, provoke deep thinking and elicit curiosity and epistemic humility. A question is answered with another question, asked from a state of genuine curiosity, and then another, until we understand the true nature of the problem.
Please put your hands together and welcome Socrates and his quirky system of questioning. Once you’ve done that, fold up the corner of this page and tag it, because you’ll use it again and again – guaranteed. Please feel free to explore these questions with your child on any topic, especially those issues where the facts are often anecdotal and opinion drives belief instead – like discussions on climate change, refugees, human rights and, for the brave among us, religion, the need to fit in or stand out or even gender equality.
As always, I’ll show you how it works with an example that I’ve already used with children – in this case, nine and ten year olds. The question was: Should we allow smart drugs in schools? (You know, those cognitive enhancers that temporarily bestow lucidity, wicked processing speeds and unlimited memory storage, and generally make thinking a more fun and sparkly affair.)
“Nooooooo,” boomed a colourful collection of accents from the horrified faces of the small group of boys in front of me. Clearly they’d been taught that “drugs” are bad. What if the bad guys get hold of them?
I wasn’t there to talk about the drug itself, which is by now legally available in the form of Modafinil. I was interested in helping them work through the question and identify the influence of their beliefs on their conclusions using Socratic questioning. This is how it went – I introduced six sets of questions that would help us debate the idea:
1. Questions for clarification:
What do you mean by...?
Could you put that another way?
What do you think is the main issue here?
So, we can ask:
• What is meant by smart drugs – clarify which drug you are talking about, its effects and side effects.
• What is meant by allow – is it allowed for everybody or only those with learning difficulties? Is it allowed by prescription or can anyone just take it? Must a student tell a teacher if they are taking it? Must they get permission first?
• Is the issue about the safety of the drug, the effects of the drug or the social effect in the classroom?
Can you see how ambiguous and open to interpretation our original question was?
2. Questions about the question:
Are we asking the right question?
Why is this question important?
Is this question easy or difficult to answer?
Why do you think that?
Does this question lead to other important issues and questions?
Once we’ve clarified the question, we may want to change it to better reflect its actual meaning. Perhaps one of the following is more accurate: Are smart drugs safe to use? Do they create an unfair advantage? Should children with learning difficulties be offered smart drugs? Should just Modafinil be banned or all forms of cognitive enhancement?
Naturally this leads to all sorts of derivative issues and questions about doping, bad guys and unfair advantages, but also questions about superhuman enhancements and their implications for all of us. No matter where you stand on the issue, it’s no longer solely the realm of superhero blockbusters and science fiction but rather something our children will have to deal with in their lifetime.
3. Questions that probe assumptions:
What are we assuming (guessing) but not sure of?
What could we assume instead?
How can you prove or disprove that assumption?
Here the children definitely assumed that all drugs were bad and had bad side effects (well done, life skills class), but a little bit of research revealed that the benefits of this particular drug far outweighed its physical side effects. The military (both the good guys and the bad guys) already have it on their approved lists, and so do about a quarter of Oxford University’s students – keep up, will you!
The rest of the questions follow a similar and natural pattern of probing ideas and gathering information to form new ones or confirm existing ones.
4. Questions that probe reasons and evidence:
Do we have enough evidence?
Is our information from reliable sources?
Are we anchoring on any one piece of evidence or idea?
Do we have information from different sources or just ones that agree with us? (confirmation bias check)
5. Questions about viewpoints and perspectives:
What is another way to look at it? (checking for framing)
What are the strengths and weaknesses of our current viewpoint?
How would we disagree with our own conclusion?
6. Questions that probe implications and consequences:
What are the consequences of our conclusion?
What are the implications of that conclusion?
Are there any next steps?
Previously published as a blog post on The Huffington Post.
How can we discuss the frightening yet inscrutable concepts of terrorism with our children? As a parent you know that your beliefs will shape your children’s beliefs and actions, which will ultimately shape tomorrow’s neighbourhoods and societies. Helping children understand an evil that is foreign yet right in our midst is a contemporary parenting imperative.
So how do you explain the inexplicable to impressionable minds? No matter how true it is, telling your children that terrorists are “psychopathic monsters” will engender even more fear and more helplessness against the “bad guys”. Showing our children instead that the problem has a cause, and therefore a solution, will do the opposite. In critical thinking we call this a thought experiment and, with the help of expert knowledge, it goes something like this:
Q. What if we knew what makes an ISIS fighter?
What if we knew that ISIS fighters who have been interviewed in prison are generally ignorant about the religion and politics of their paymaster, and that they know little of its most extreme requirements? What if we knew that the average age of a fighter is 27, with two children? That they came of age in a war-torn Iraq during the American occupation that began in 2003? As teenagers they couldn’t go out to parties or even have girlfriends, and many of them grew up without fathers. They blame the Americans for this, and for leaving them in the middle of a civil war where food, safety and shelter were scarce, fear was plentiful and violence a way of life. For many of them, fighting for ISIS is seen as a way to avenge that – to take action rather than wait for someone else to act on their behalf.
Armed with information like this, we have a way of making the truly incomprehensible, comprehensible. As uncomfortable as it is to think and talk about, we can only change what we understand.
Q. What about the big question? What is ISIS?
What if we knew that ISIS was formed around ancient texts known as the Prophetic Methodology? Texts that followers are not allowed to question, for to question them would mean certain death. These medieval texts were written at a time of war, when brutality and bloodshed were the norm. Experts like Bernard Haykel tell us that the Islamic State is trying to recreate those earliest days and reproduce their norms of war. They want to create a vast empire filled with loyal subjects who abide by their extreme laws. Much like Hitler and his Nazi party, the Islamic State is committed to purifying the world by killing vast numbers of people who disagree with their beliefs. They see the world in black and white. Their propaganda tells us that they categorically reject peace and aim to bring about the apocalypse, in all its headline-grabbing hyperbole, even if it means their own destruction. Perhaps this is why we find them so hard to understand, yet their propaganda is clear and available.
Q. Why do they commit acts of terror?
It would seem that terrorism is their way of making us (their enemy) feel afraid. They hope that fear will create intolerance and hatred, driving a wedge between different religions and people. Perhaps they’ve been watching Star Wars reruns and know that hatred is the path to the dark side. If we are weak, full of hatred and focused on what divides us more than what unites us as people, we will be easier to conquer.
Q. Am I powerless to help?
While the leaders of the free world go to war against ISIS, hunt their leaders and stop young people from travelling to Syria to join them, what can we do? We can fight a movement that is closing young minds to the beauty and potential of a free world by opening our children’s minds to their own power. This means giving them the information and tools to think critically, to probe and dissect confusing information, and the permission to question everything. Everything.
Because physical freedom is nothing without freedom of thought.
1. Lydia Wilson’s report at www.thenation.com/authors/lydiawilson/
2. Graeme Wood’s study ‘What ISIS Really Wants’, as published in The Atlantic, March 2015.
3. Bernard Haykel, the foremost secular authority on the Islamic State’s ideology.
TIPS AND TAKEAWAYS FROM CHAPTER 7
1. Each of us has an array of thinking tools at our disposal to compensate for the speed of our pedestrian processor. Thinking mistakes or mental biases are inbuilt compensators that require careful attention.
2. Talking through these biases with our children raises awareness of their thinking mistakes (and ours).
3. Your children will learn more about thinking and making good decisions from how you think than from how you tell them to think.
4. Discussing the 7 habits of good decision makers is a great way to introduce critical thinking to your children. The habits are:
• Understand the quality of your information.
• Understand how information is packaged and presented.
• Be very clear on what is fact, judgement (assertion) and opinion.
• Develop a habit of deciding how to decide first.
• Tell convincing stories to understand risk.
• Examine the impact of emotions on your thinking before you make a decision.
• Judge decisions, including your own, by their process, not their outcome.
If all else fails, simply STOP and THINK!
5. Socratic questioning is a systematic way of deconstructing information and thinking about our thinking that children will find easy and fun to do.