
You are not controlling the storm, and you are not lost in it. You are the storm.

—Sam Harris, author, philosopher, and neuroscientist1

We're entering an era of unprecedented psychological manipulation.

—Bruce Schneier, computer security expert2

Human stupidity has been one of the most powerful forces in human history.

—Yuval Noah Harari, historian3

Machines don't think, but neither do people.

—César A. Hidalgo, director of the Macro Connections group at The MIT Media Lab4

It's the twenty-first century; do you know where your mind is? That's not as silly a question as it seems, because most people don't know. The human brain today is an organ out of time. It evolved during and is best suited for life in the Pleistocene, a period from nearly two million years ago to about eleven thousand years ago. And yet here it is, having to make do in a modern, high-tech, increasingly wired, urbanized, and fast-changing world. The majority of people on Earth, right now, know little if anything about the brain's evolutionary history, its basic biology, or its psychological processes. This, perhaps, best explains why so few appreciate the value of and urgent need for critical thinking in their lives. Enter the Internet and social media. Into this void of knowledge comes what may be the ultimate playground for cognitive bias, delusion, fraud, and stunning stupidity.

Why haven't critical thinking, skepticism, and a deep appreciation for the scientific method become prevalent at every level of every society by now? Countless examples, from aviation to computers, show that science, evidence-based thinking, and respecting reality work better than anything else for discovery, invention, and simply getting things done. Do elevators go up and down, microwave ovens cook food, and planes fly because we wish them to while crossing our fingers? No, they work because of science and engineering. Do we chart the accurate movements of the heavens above by reading bird entrails and consulting astrologers? No, we look to math and science for that. Computers, smartphones, and the Internet work only because those who built them stuck to a trail of discovery and facts, as opposed to one of magic and mysticism. Yet here we are, a species so often delusional and deranged that at times it can seem miraculous that we are able to feed and dress ourselves. We have the answer to our most difficult challenges. We know the way toward greater safety, efficiency, and productivity. It's through science and reason. This is not to suggest that scientists are always right or that science is always perfect. The scientific process is a tool, one that can be used in many ways for many purposes. Science gave us antibiotics and vaccines as well as napalm and thermonuclear weapons. The point here is that it works and that scientific thinking is available to all of us. It's the means by which anyone can cut through the nonsense and lies, to make one's way closer to truth.

“Science is a way to call the bluff of those who only pretend to knowledge,” wrote Carl Sagan in his classic book about critical thinking, The Demon-Haunted World. “It is a bulwark against mysticism, against superstition, against religion misapplied to where it has no business being. If we're true to [science's] values, it can tell us when we're being lied to. It provides a mid-course correction to our mistakes…. Finding the occasional straw of truth awash in a great ocean of confusion and bamboozle requires vigilance, dedication, and courage. But if we don't practice these tough habits of thought, we cannot hope to solve the truly serious problems that face us.”5 You and I must never forget that we are led around by the subconscious workings of what is essentially a prehistoric brain meant for the prehistoric environment. This is why we so often default to the quick comfort of a guess dressed up as reason. Our natural urges lead us to rely on instinct over intelligence, impulse over reflection. Look around and everywhere you will observe people who are busy fearing, hating, and worrying over the fabrications and exaggerations of their irrational minds. Humankind squanders immense, immeasurable amounts of time, energy, and resources upon various altars of madness. We have such capable, powerful brains but scarcely apply them to full effect. We are the brilliant-idiot rulers of Earth who fritter away so much of the potential within these massive thinking machines evolution gifted to us. We are like Ferraris stubbornly driving in reverse, top-end computers with screens dimmed to gray fog. And all the while we fool ourselves into believing that we are rational, sensible creatures loyal to logic and evidence. The hard truth is there to see for anyone who looks. We cling to made-up stories about ourselves and the world around us, mostly fictional narratives designed to comfort, confirm, and conform to whatever seems to work in the moment. 
But this way fails us again and again.

Good thinking matters. Every advancement that made our world better, and every human act that diminished or wasted human lives, tracks back to an idea, a success or failure of reason. It is only through self-awareness and humility that we might have a chance at overcoming the strange burdens and baggage that come with the ownership of a human brain. Billions eagerly use and rely on the products and benefits of science and critical thinking while simultaneously thinking and behaving in unscientific, uncritical, and irrational ways. A woman picks up her smartphone and accesses the Internet—two glorious examples of scientific thinking. She reads her horoscope and sends a text to her psychic—two sad examples of unscientific thinking. A man sits in his living room, where he reads and believes a fake news article on Facebook. The article, claiming new evidence that NASA faked the Moon landings, had been written only an hour ago, thousands of miles away from his home, and was delivered to his computer screen in part thanks to a satellite in space.

You may have moments in which you sense that you are out of place or disconnected from society because of all of the craziness that swirls around us. You might even feel as if you are thrashing about in an ocean filled with bobbing imbeciles, zealots, and maniacs. But this is where you belong. To be human is to be susceptible to nutty beliefs and behaviors. Don't be condescending. Don't attempt to deny or escape your membership in this club. Strive to make the best of it. Know the cognitive weaknesses, flaws, and foibles that come naturally to all of us. Then rise above them as often as possible.

For all our problems, human culture has made real progress. We are less violent.6 We have more knowledge, more stored information, than ever before.7 We have managed to turn on a light switch, yet most people insist on wearing a blindfold and continue to stumble around in the dark unnecessarily. Why is this the case? Why isn't rational thinking a universal goal, one that is on the lips of every teacher and politician? Why isn't every child on Earth, at least those fortunate enough to see the inside of a classroom, taught how to think? Why is the absurd notion that skepticism is negative so pervasive? Every achievement and all of our miseries alike are rooted in the consequences of thinking. For better or worse, our thoughts make the world we live in. Be aware of this, and also understand that by getting your own cognitive house in order you help nudge your society and all humankind toward a better place.

My travels across six continents and a deep curiosity for all things human have enlightened me to harsh realities. Most randomly selected twenty-first-century people are likely to know more myth than history, understand more astrology than astronomy, and be far more attentive to unproven supernatural forces that allegedly run their lives than they are to scientifically revealed natural forces that actually do run the universe. Given the high stakes of thinking, why do we place so little value on improving the quality and reliability of our perceptions, thoughts, and decisions? Societies train their brightest to be engineers, lawyers, doctors, astronauts, and so on. But not even the most highly educated elites are reliably taught in schools how unexpectedly strange and troublesome normal human minds are. The result is that relatively few of us understand how to improve safety, efficiency, and productivity through better thinking. I argue that the scarcity of critical thinking is humankind's great unrecognized crisis. Therefore, given this planetwide and perpetual blind spot, it should be no surprise that weak critical thinking is a significant problem in such a vast subculture of human activity as social media. Make no mistake, we do import our cognitive shortcomings and subconscious weirdness into this new arena of communication and social interaction. And we suffer for it, just as we always have in other arenas throughout history. But you don't have to go this route. You can avoid most of the mental trip wires and potholes that others will walk straight into. All you have to do is think.

WHAT IS GOOD THINKING?

Good thinking is an umbrella term for understanding the human brain and using it in ways that enable one to make rational decisions, identify deception, and avoid or discard delusions as often as possible. It requires the following:

  • An understanding of the evolutionary history of the human brain and how it has left us with a thinking organ that goes about its business in unexpected ways that mislead us about reality.
  • Knowledge of the basic structures and functions of the human brain (how vision and memory work, for example), and how personal recollections and sensations can seem real and accurate even when they are not.
  • An appreciation for the profound impact nutrition, lifelong learning, and physical activity have on the brain's health, performance, and longevity.
  • Awareness of the prominent role of the subconscious mind in daily life, and the understanding that we inherited our brains from ancestors shaped by extremely competitive and dangerous environments that made fast subconscious reaction a priority over slower conscious reflection and imagination.
  • An alertness to many of the natural and common mental biases and shortcuts that can undermine rational thought.
  • The courage and maturity not only to question everything but also to accept the absence of answers, as well as answers that contradict the hopes and beliefs that appeal to us.
  • Sufficient humility to prevent one from placing absolute trust in sensory perceptions, personal experiences, and even thoughtful conclusions. A willingness to always reconsider, revise, and change one's mind when better evidence demands it.

Source: Adapted from Good Thinking: What You Need to Know to Be Smarter, Safer, Wealthier, and Wiser (Amherst, NY: Prometheus Books, 2015).

We carry, loaded into our skulls, three-pound blobs of electrochemical “magic” that have the potential to light up the dark and penetrate the densest fog with reason—if we care enough to try. Call me irrationally optimistic, but I view critical thinking as a cold meme, packed with universal potential, just waiting to catch fire and rage across all humanity via social media. (Perhaps if we somehow worked it into a cat video?) What is certain beyond doubt is that more of us need to recognize these problems of cognitive biases, deceptive perception of the world around us, emotional thinking, and subconscious shenanigans so that we may live more reasonable lives and build better societies.

“The history of science, medicine, politics, and business is littered with examples of obstinate adherence to old customs, irrational beliefs, ill-conceived policies, and appalling decisions,” points out Dean Buonomano, a professor in UCLA's Departments of Neurobiology and Psychology. “Similar penchants are also observable in the daily decisions of our personal and professional lives. The causes of our poor decisions are complex and multifactorial, but they are in part attributable to the fact that human cognition is plagued with blind spots, preconceived assumptions, emotional influences, and built-in biases.”8

I suspect one of the reasons critical thinking gets so little attention and respect from most people lies with the term itself. After years of giving lectures, hearing feedback from interviews and writings, as well as having many conversations about thinking and irrational beliefs, I have learned that many people are just not sure what “critical thinking” means. Very few admit this up front because it's one of those things everyone feels they should know. For this reason, it is a mistake to casually toss around this phrase under the assumption that everyone is clear about what it is. Some people even become defensive when encouraged to “think critically” because they seem to sense that it may be a threat to them. But critical thinking is a threat only to lies and mistakes in reasoning, nothing else. It is both a valuable skill and an attitude we all need; yet I have had conversations with numerous people who were bright, positive, and clearly self-reflective—but who couldn't recognize a need for consistent critical thinking in their lives.

Many people seem to believe that critical thinking is an elitist catchphrase tossed about by professional philosophers and other intellectual weirdos who spend their days and nights worrying about whatever the hell is going on in Plato's cave or whether it's okay to push one fat man off a bridge in order to stop a train and save two children on the tracks. Maybe critical thinking is a bogus concept used by those who like to appear smarter or remind everyone that they are more educated than others. Maybe it's a sly way to insult people. Or, perhaps, it's a marketing scam to help sell books like this one. No, no, and no. Regardless of who we are, where we are, or what we do, we all need critical thinking because being human and dwelling within human culture means being forever trapped in a fog of misperceptions and confusion. Critical thinking is conscious thinking, and it is the best way to cut through that fog and navigate our lives to better results. There simply is no rational reason to deny this need. Critical thinking skills are people skills, and they are readily available to everyone. Anyone can be an intellectual, a philosopher-scientist, an elite thinker, but only if reason and reality are valued. Ultimately it is our choice whether to think or not, and the quality of our own life hangs in the balance of that decision.

Toward making critical thinking more palatable to the masses, I have taken to referring to it as good thinking. Simple and positive-sounding, good thinking is easy for anyone to recognize and relate to as a positive trait or desirable behavior. Who doesn't want to think well, to be a good thinker? Who is going to say, aloud, “No, thanks. I'm sticking with bad thinking—just a better fit for me”? I feel so strongly about this that I wrote a book titled Good Thinking: What You Need to Know to Be Smarter, Safer, Wealthier, and Wiser.9 To warm up readers for deeper explorations of the key cognitive biases to come in this chapter, I have loosely excerpted the following primer from Good Thinking. This is a brief rundown of common challenges every human must cope with while thinking and making decisions. Failing to keep these in mind guarantees you a consistent stream of mental mistakes with occasional bouts of idiocy on social media and elsewhere.

THE DIRTY DOZEN

Be aware of these standard weak points to make your mind less vulnerable to bad claims and bogus beliefs.10

  1. The Emotion Potion. We are emotional creatures, and this often leads us to make irrational decisions, embrace bad ideas, and act in ways that work against our best interests. Emotions can intoxicate us, make us dumb. Be aware of this vulnerability. If someone tells you the world is going to end in fiery chaos soon, for example, don't let your fear of such an event distract you from rationally analyzing and challenging the claim. Don't think, “I'm scared.” Instead, think, “This person doesn't have any good evidence to back up what he is saying.”
  2. Popularity. We are social animals. The relative safety of the crowd feels good. It can be cold and harsh out there all alone in the wilderness. But reality is not a popularity contest. Recognize how we all can be swayed by popular support of an idea no matter how destructive or ridiculous it may be. Never forget that truth and reality are not decided by vote. The majority of people have been wrong about many things many times throughout history. There were times when phrenology and bloodletting were respectable and popular ideas—but they were still wrong.
  3. Straw Person. A common tactic people use to promote weak or worthless claims is to attack an easy-to-beat, diluted, or counterfeit version of the counterargument. Those who say, for example, that Earth is around 4.5 billion years old should not be swayed one bit on this point if a science denier were to tell them that there was a time when geologists didn't understand continental drift and still can't explain everything today about the structure and function of the Earth's core. Of course geologists don't know everything. But this does not refute the strong evidence for a 4.5-billion-year-old Earth.
  4. Loaded Questions. Sometimes people try to make their point seem more sensible by slipping in an unproven claim or bit of nonsense as filler or padding. Example: “Another reason we know the Lost City of Atlantis is real is because psychics and mediums have communicated with spirits of dead Atlanteans.” Listen well and catch weak arguments or bad ideas within the larger claim. Challenge them all.
  5. Wishful Thinking. This is simple but deadly to good thinking. We desire something, so we believe it to be true. This is a powerful human compulsion. Be aware of it and be tough with yourself. Always ask yourself, “Am I accepting this claim because it makes sense and is supported by strong evidence? Or do I just want to believe it so much that I am willing to pretend to know that it is valid?”
  6. False Dilemma. Watch out for people who frame their case as an “either-or” proposition. Sometimes there is a third option, or perhaps many more options. For example, a politician might say that more prisons must be built or there will be more violent criminals on the streets. But what if we went with a third option? What if nonviolent offenders were released early or given lighter sentences, thereby freeing up space for more dangerous criminals to serve longer sentences?
  7. Explaining by Naming. Giving a name to something is not the same as explaining it. For example, calling an event a “miracle” is not an explanation for what happened and by what process. Calling an expensive conversation with a psychic or medium a “reading” does not explain how mind reading or talking to dead people works. Watch out for this deceitful form of verbal carpet-bombing. When it happens, simply ask the person to explain the name or concept he or she attempted to pass off as an explanation itself. An inability to do so will reveal everything you need to know about the person's credibility and/or competence.
  8. Circular Reasoning. This shows up frequently in online chats and comment threads where people try to make their case in few words. It occurs when someone attempts to prove “A” by pointing to “B,” and then claims that “B” can be trusted because “A” proves it.
  9. Authority Worship. Don't forget that we humans are primates; we are essentially chimps in shoes. And, just like them, we are obsessed with rank and power. This gives us a huge weak point in our brains, because our natural reaction is to snap to and obey when we view someone as our superior. I'm not suggesting that you should rebel against everything an authority figure says, of course. But do try to think clearly about the validity of words from on high. Don't let a uniform, fancy title, or dominant posture hoodwink you into believing nonsense or buying a junk product.
  10. Special Pleading. People who promote or believe in things that are unlikely to be true often scramble to change the game when they feel the walls of reality closing in on them. For example, a person who says acupuncture works because “one billion Chinese people can't be wrong” might not like hearing that only about 18 percent of China's population relies on acupuncture,11 and that person might react by suddenly arguing that numbers don't matter.
  11. Burden of Proof. The person who makes an extraordinary claim has the burden of backing it up. You and I don't have to prove that mediums can't talk to dead people or that Bigfoot is living among the Redwoods. It's not even fair to suggest that we should, because in many cases it is impossible to disprove such things. Instead, the believer must validate her beliefs.
  12. Ad Hominem Attacks. If dealing with facts doesn't get you anywhere, try name-calling. Or don't, because this is a despicable tactic. If you are discussing astrology with a total jerk, remind yourself that being a jerk is irrelevant to whether or not astrology is a real thing. Focus on logic and evidence. It's better for everyone in the long run to kill a bad message rather than the messenger.

CONTACT WITH ONLINE INFORMATION MATTERS

Beyond the “Dirty Dozen” mental mistakes listed above, there are many cognitive biases and other psychological phenomena about which we all should be as informed as possible. Every social media user can benefit from being aware of the mere exposure effect and the illusion-of-truth effect, because typical online activity includes consistent contact with inaccurate stories, lies, and misinformation that serves someone else's hidden agenda. The mere exposure effect is enough to lose sleep over when you think about how easily your judgment can be manipulated. Your subconscious mind remembers and makes use of far more than “you” will ever know. Contrary to our natural assumptions, simply being exposed to an idea can influence our thinking about it later. This often holds true even if the information is deemed to be trivial or meaningless in the moment, and even if we consciously forget about it. The old adages “There's no such thing as bad publicity” and “Repeat a lie often enough and it becomes the truth” pack a frightening punch. Subconscious familiarity can lead us to feel more favorable about something than we otherwise would have—regardless of whether or not there is a good reason to be positive about it. Think of how easily mistakes in reasoning can happen when simply bumping into an exaggeration, absurdity, or lie online can make it seem sensible and believable should you encounter it again later. This is a primary reason why advertising is so effective. We are bombarded daily with ads online and off. We may assume they don't really work on us but, thanks to the mere exposure effect, many ads do succeed. Our subconscious minds are paying attention and being influenced, even as our conscious minds tune them out. It's as if we are all sleepwalking, on our feet but not quite awake, not fully alert to what is going on around us. And, as the brilliant writer Ralph Ellison warned us: “There are few things in the world as dangerous as sleepwalkers.”12

The mere exposure effect raises serious concerns about all “meaningless” comments, photos, and viral memes encountered on social media platforms. We may dismiss an image or string of words that we encounter online as inconsequential in the moment; but it's possible that—beneath our awareness or consciousness—we are falling in love with the idea. This gives you something to think about as you laugh off those insane political rants and weird claims you skim over on social media every day. Without your open consent, the simple fact of exposure to them may be nudging you toward feeling more positive about these claims and ideas, should they cross your path in the future.

The similar illusion-of-truth effect is another potential problem on social media in particular. This disturbing phenomenon entails repeat readings, hearings, or viewings of information that lead you to believe that the information is true. Think of it as the cognitive version of being beaten into submission. Where the mere exposure effect can cause you to feel positive about something, this other effect can leave you convinced that a false claim is factually correct—just because you saw it before. The particularly creepy aspect of it is that it works even when you can't remember seeing the claim before. Worse, the illusion-of-truth effect can occur even if you are initially informed that the information is false! It's almost as if information we know to be bad wears us down, tortures us until we believe. Be careful about wallowing in sketchy information daily in your social media ecosystems. Bad ideas may contaminate your judgment through the most basic and briefest exposures if repeated enough times. The Nazi propaganda machine of the 1930s did not convince sufficient numbers of Germans that Jews were to blame for a litany of their problems the first time they made the claim. To gain real traction with the idea, they repeated it, again and again. In 2002, I interviewed Armin Lehmann, a former member of the Hitler Youth.13 He also was Hitler's last courier, present in the Berlin command bunker during the final days of World War II. Lehmann told me that anti-Semitic propaganda was a near-constant theme in school.

The present danger of this standard psychological vulnerability—one we all have—is clear. Therefore, we should keep it in mind, for ourselves and for our societies. It certainly might explain in part the strange loyalties we see some people show for destructive ideologies such as ISIS and other negative movements. “The illusion-of-truth effect highlights the potential danger for people who are repeatedly exposed to the same religious edicts or political slogans,” warns neuroscientist David Eagleman.14

“The more we hear a lie, the more likely we are to accept it as truth,” adds Bo Bennett, an educator and author of Logically Fallacious.15 When I interviewed him about this topic, he continued, “But this also has to do with how easy the information is to process. When a politician says, ‘I am going to fix the healthcare problems and make our system the best in the world,’ we tend to believe this because of its simplicity. If the same politician were to explain precisely what he or she will fix, using legal and economics jargon, that message is less likely to be believed. Sometimes the truth is simple, but sometimes it's not.”

WHAT HAPPENS WHEN YOU DON'T KNOW WHAT YOU DON'T KNOW?

The Dunning-Kruger Effect

Is there anything worse than being ignorant, dumb, stupid, misinformed, misguided, and clueless? Yes, there most certainly is. Meet the Dunning-Kruger effect, named after psychologists David Dunning and Justin Kruger, who exposed and described the phenomenon of believing we know more than we do in a fascinating study they coauthored. In other words, the Dunning-Kruger effect is the problematic and dangerous state of being unable to recognize the depths of one's own ignorance. It can lead to worse outcomes than simply not knowing, because when we underestimate our lack of knowledge or skill we can be quick to adopt the role of an arrogant expert, eager to comment, decide, believe, and act—often with undesired or negative consequences. One might assume that ignorance would nudge us toward humility and caution in thought rather than arrogance and recklessness. But the human mind doesn't work that way.

Make sure you understand and never forget that we do a poor job of accurately assessing our own knowledge, skills, and reasoning abilities. This is a common human trait. Any typical D student, for example, is likely to perform worse than an A student on a test in the given subject, of course. But there's more to be concerned about here. Thanks to the Dunning-Kruger effect, the D student would also tend to be comparatively worse than the A student at estimating his or her own test performance. The D student struggles at getting an accurate feel for just how little he or she knows. When we find ourselves in over our heads, we tend to overestimate how much we know or how good we are at a skill, simply because we just don't know enough to know our shortcomings. Leonardo da Vinci excepted, this matters to everyone because out in real life most of us are “D students” in most subjects. An engineering or law degree alone doesn't make you an expert in geology or nuclear arms proliferation, but you may feel competent in the moment because of the Dunning-Kruger effect. The same lack of knowledge that makes a D student deficient in the first place also leaves him or her less able to recognize what she or he doesn't know. To make matters worse, we also do poorly at assessing the skills and abilities of other people in areas in which we lack expertise. This means that we can't consistently and accurately identify incompetence in politicians, medical doctors, lawyers, and so on because we don't know enough about their specific fields to evaluate them.

To see the Dunning-Kruger effect in action, just listen in on two typical Americans engaged in a heated discussion about politics. Before long, both are sure to verbalize unjustified expertise in a wide variety of fields and offer ready solutions to every complex problem from the economy to terrorism. This is not a “dumb people problem.” Well-educated and brilliant people also stray from their areas of competence without a corresponding dialing down of confidence. You have done this. You will do this. I state that with confidence because everyone has this problem. No one is immune. We all struggle to recognize our limitations.

I have been aware of this problem for so long that my conversational speech has adapted to guard against it. Over the years, it has become my habit to sprinkle conversations with phrases that qualify and declare the limits of my knowledge and certainty. “I'm not sure, but…”; “According to the data I've seen, it seems to me that…”; “I don't know, but I lean toward…”; “I could be wrong, but my strong hunch is…”; “At the moment it seems reasonable to me that…”; and so on. I do this with such consistency that I sometimes worry I might sound like a paranoid lawyer trying to preempt a lawsuit stemming from the reckless use of a ladder or toaster. This also might seem to some to be wishy-washy or weak language, but it's not. Prefacing the statement of a claim or position with, “I don't know for sure but…” is not weak. It is a mark of strength and honesty. It makes everything to follow more powerful, not less, because it is not shrouded in lies, delusions, or false confidence. Qualifying important communications in this way serves two purposes. (1) It protects me from overreaching and thereby setting myself up for intellectual annihilation in conversation or debate, should I be wrong. (2) It also helps keep me authentic and focused within. It is more difficult to drown in your own hubris when you literally speak aloud the limits of your knowledge. Still, even with my regular linguistic safety catches, I fear that I sometimes slip into fake-expert mode.

Thanks to many years of running and weight training, I tend to feel that I have it all figured out and can come up with a good answer to virtually any question on the topic of fitness. But I can't. I exercise a lot, and I've read a bunch of books on the subject. That's not the same as having earned a university degree in kinesiology or putting in twenty years as a head strength and conditioning coach for an NFL team. Yet for all my internal protocols and verbal speed bumps, I still catch myself speaking sometimes as if I am a credentialed expert in the field. But the reality is that there certainly are specific fitness topics about which I am so deeply ignorant that I can't even recognize how uninformed I am, so I should tread carefully and maintain cautious humility.

The good news is that simply being aware of the Dunning-Kruger effect can make us safer, more efficient, and more rational when online. Humility can and should become a habit, the default setting. Yes, it's hard to be humble when half of your Facebook friends are certain that aliens built the Great Pyramids at Giza, but we must beat back these demons of overconfidence. Humility is the prerequisite to sound skepticism and consistent critical thinking.

Having the mere awareness and understanding that all people—yourself included—struggle to accurately assess competency levels can inspire the crucial and necessary pause, that moment of reflection before speaking, writing, clicking, liking, or swiping. Prior to declaring the “obvious answer” to gun violence, racism, sexism, or poverty—and then digging in to defend it—we must recall that confidence is not the same thing as knowledge. The Dunning-Kruger effect explains much of the loud, proud folly you find in social media. In a paper about their study, Dunning and Kruger write, “when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and unfortunate choices, but their incompetence robs them of the ability to realize it. Instead…they are left with the impression that they are doing just fine.”16

We struggle to know our own ignorance, always have and maybe always will. No one is an expert on everything, so everyone succumbs to the Dunning-Kruger effect at some point. This is an old cognitive challenge, one that will not cease tormenting us anytime soon. Some twenty-six centuries ago the Chinese philosopher Confucius is believed to have written or said: “To know when you know something, and to know when you don't know, that's knowledge.”17 Socrates, regarded by some as the greatest thinker of all time, is believed to have inspired what we call the Socratic paradox: “I know that I know nothing.”18 William Shakespeare seemed to be aware of this problem as well, having written this line into his play, As You Like It: “The fool doth think he is wise, but the wise man knows himself to be a fool.”19

Good thinkers resist the lure of arrogance and are quick to say, “I don't know.” Good thinkers are always willing, even eager, to change their minds when shown enough contrary evidence that collides with a belief or conclusion.

A final warning: Don't allow your awareness of the Dunning-Kruger effect to paralyze you on social media or elsewhere. Be more thoughtful and cautious, of course, but this is not an excuse to shy away from or disengage from interactions and explorations into areas you don't know much about. Plunge forward into new frontiers and grow your mind. Only do it with a wisdom that comes from knowing that you don't know everything.

WHY WE DON'T ROCK THE BOAT—EVEN WHEN IT'S SINKING

Building and maintaining social networks is one of humankind's greatest strengths. These connections allow us to accomplish things no individual could ever achieve alone. However, a significant problem often arises when we bind ourselves together off-line or in cyberspace. This is where groupthink enters the picture. Groupthink is the common and dangerous tendency to fall in line and agree, even when we should know better. In other words, it is the tendency to align our thoughts in pursuit of consensus; this can lead us to make regrettable decisions. In our positive efforts to maintain unity, get along, and avoid alienating others or losing friends, we go along with what we perceive to be the group's desired direction, idea, or outcome. But sometimes we agree with what seems to be the majority position even when our intellect or moral compass might point in a different direction.

Groupthink has long been and continues to be a serious problem for governments, corporations, religious organizations, and maybe for you on social media. Many disasters, near disasters, and just plain bad decisions might have been avoided if one or more people within a collaboration had resisted the dominant winds. This is not the same as someone not speaking up with an opposing viewpoint or idea because they are afraid of repercussions.

Groupthink is different from fear-based obedience. Instead, groupthink happens when people feel that agreeing and going along is the right thing to do. It makes sense that this would be common, because we are taught to value teamwork and loyalty to friends, colleagues, and allies. Saying no when everyone in the tribe is saying yes can be difficult. It not only makes one stand out for possible negative scrutiny within the social unit but also can be seen as abrasive, repugnant, or even a sign of dangerous disloyalty.

Groupthink can be a problem for social media users because platforms like Facebook and Twitter cater to it so well. For example, many Facebook users build a coalition of family members, real-world friends, coworkers, and online acquaintances. How many Facebook users post any and all opinions without any thought given to possible negative reactions from others? I suggest most don't. Within the mostly closed world of a particular Facebook community, groupthink can be a powerful and near-constant presence. What if all, or at least a majority, of someone's Facebook friends post, share, and “like” a report about a dubious herbal concoction said to cure AIDS, diabetes, eczema, and low self-esteem? There is a good chance that this person will either join in by also giving the news story a “like” or react to it with silence, which itself can be a form of agreement and conformity. If groupthink comes into play, as it so often does, the least likely response from most people in these situations is to challenge it. For all of the talk about trolls and flame wars, most people go along to get along. Facebook, Instagram, and Twitter users know that great power is always at their fingertips and the fingertips of others. With a simple click they can banish or be banished from a community forever. Exiling was a common practice in some ancient societies. Some viewed it as a fate worse than death. And here we are again. Today, people are exiled from social media tribes every moment.

Perhaps a helpful way to think about this is to remember that old saying, “monkey see, monkey do.” We're all primates, remember, and, like the monkeys of that cliché, we are prone to imitation and quite good at it. Imitation is one of the primary ways in which we learn new technical and social skills. Imitation has been a means of survival for a long, long time. You are a copycat. Keep this in mind while wading through all of those posts and comments on social media. Again and again, subconscious impulses will encourage you to do as you see your fellow primates doing—even, sometimes, when the behavior or idea is negative, a waste of time, or just plain dumb. “The extent to which imitation is engrained into the brains of humans is easy to underestimate,” warns UCLA neuroscientist Dean Buonomano, “not because it isn't important, but rather because it is so important that, like breathing, it is automatic and unconscious.”20

What social media users must do to avoid getting bogged down in a swamp of groupthink is first to realize and remember that the sensible goal is not consensus for the sake of consensus. The objective is to be correct, to have a clear view of a situation, and to avoid being an idiot. What good does it do you to be a loyal teammate when the team is running full speed toward the edge of a cliff?

What can you do to avoid groupthink? Build diverse social networks. If I encountered a Scientologist or a CrossFit fanatic who asked for advice on how to avoid groupthink, the first thing I would suggest is for him or her to diversify the social circle. Friend a few Buddhists. Connect with a couple of jujutsu enthusiasts. The more diverse the exposure and input, the better. It's difficult to think outside the box if you live your life inside a box.

Finally, understand that there are a variety of ways to raise objections and offer differing opinions within a group that do not involve being a rude, rebellious jerk. Those who are concerned about offending friends, being blocked on Twitter, or instigating arguments need only take a soft but steady approach. Show studies, data, and expert opinions that support a view that contradicts group consensus. To soften the blow, preface it with: “Hey, just to be careful, let's consider another possible answer,” or “Do you think there is anything we should consider about this other idea?” Try to frame the exchange as less of a debate and more of a team effort to get to the truth. Remember that the point is not to win arguments or achieve unanimous agreements one way or another. The goal is to make the right decision. One key to preventing or fixing groupthink on social media is to work hard at being tolerant. It's the same way we can escape filter bubbles. Don't reflexively unfriend, unfollow, or block every person who has an opinion that is different from yours. Ask questions before helping to ignite a fiery debate. Constructive, positive dialogue, at least from your end, has a much better chance of doing more good than immediate censorship and isolation. If, of course, you are connected with someone who is so persistent, stubborn, obnoxious, abrasive, or disappointing that you find the relationship too stressful, then unfriend and block away. There's no sense in suffering when relief is but a click away. Just know that social media works in a way that fuels groupthink. We actively connect ourselves to people who mostly share our backgrounds, experiences, and views. This is human nature. But while it might be comfortable, it is not intellectually healthy or safe in every situation.

BEWARE THE BANDWAGON

We love jumping on bandwagons. Similar to groupthink, this too can seem irresistible at times. Who among us would not enjoy the warm and cozy feeling of riding off into the sunset with a bunch of people who are all on the same wavelength, chanting the same slogans, or perpetually patting one another on the back? It's obviously reassuring and empowering to be with the in-crowd or at least among allies. When surrounded by agreeable voices, we feel confident that we are right and that all is right in our immediate field of vision. But those who wish to be mature, wise, and reality-based in their thinking must acknowledge the obvious: Crowds can be wrong. Popularity is not a reliable measure of truth and reality. One of the most detrimental effects of social media is the way it exploits our natural weakness for assuming a real or perceived majority knows what it's talking about.

The bandwagon effect must be given due consideration while on social media. To fail in this is to ensure taking repeat journeys to fantasyland. It doesn't matter who you are; you will feel a rush of confidence when you think that your beliefs, ideas, or conclusions align with a majority or a substantial number of people. The mind automatically seeks reassurance and refuge in the crowd. Your subconscious will whisper to you: How can all of these people be wrong? But of course we know that large crowds can be wrong, and many are wrong every day. Consider as an example the world's three most popular religions, which account for more than half of the world's population. Based on their most basic, core claims and doctrines, Christianity, Islam, and Hinduism cannot all be correct. The only possibilities are for one of them to be correct or for all three of them to be wrong. It is not even possible for them to be logically compatible on so basic a matter as the number of deities: Islam's one god only/no divine son vs. Christianity's trinity of three-gods-in-one vs. Hinduism's millions of gods.21 Therefore, on this point of religion, at least a few billion people are wrong.

Let's reiterate that point: The crowd is often wrong. It can be wrong even when the crowd numbers in the billions. And if billions of sane, honest, well-intentioned people can be mistaken about their beloved religious belief, how easy then is it for a crowd to be wrong about a political candidate, an alternative medical treatment, or a conspiracy claim? Keep this in mind when exposing yourself to impressive retweet, share, and “like” stats.

The bandwagon effect is not cut-and-dried. Sometimes a bit of bandwagon jumping is okay. We must necessarily do it at times for efficiency's sake. I'm happy to jump on the UNICEF, science, and critical thinking bandwagons, for example. I just know not to strap myself in so tightly that I can't jump off if I should discover that I'm being carried in the wrong direction. The key is to board a particular bandwagon only on the basis of very good reasons to do so and not for reasons of popularity, comfort, coercion, or emotional hunches. For example, “scientific consensus” is a bandwagon of sorts. When the world's scientists within a particular field generally converge on the same conclusion, those of us who value truth and reality can with some confidence jump on board with them. But we do this not because they are a crowd. It is because we understand the scientific method and trust it to a significant degree. We tentatively accept the scientists’ conclusion as probably true—because we know from past experience that science is the most effective means of discovery and figuring things out. But good thinking requires us to be ready to bail out at a moment's notice should better, contradictory evidence turn up.

WELCOME TO THE SHADOW BRAIN22

Most of us have heard of something called “the subconscious mind,” that mysterious other part of ourselves that somehow interacts with our normal mind. What few people grasp, however, is just how prominent and powerful an influence it is on our lives. This subconscious presence, which I call the shadow brain, is not a minor player who follows behind you only to occasionally whisper advice in your ear. No, it is you who follows the shadow most of the time. Believe it or not, “you” are the minor player in your life.

To help understand how important the shadow brain is to you, imagine looking down from high above and spotting a lone swimmer treading water at the center of an enormous lake. The swimmer is you, and the lake is your shadow brain. The swimmer is tiny and alone, with deep waters on all sides. The swimmer is also blindfolded. Currents and waves constantly pull and push one way or another. The blinded swimmer cannot consistently detect or make sense of these forces, because there are no points of reference. Sometimes the currents turn the swimmer a bit left or right. Sometimes they completely reverse the swimmer's course. Often the swimmer will stroke away at full effort but go nowhere. Sometimes the oblivious swimmer doesn't swim at all but makes fast progress across the lake nonetheless due to strong currents. In addition to this, unknown creatures swim around and beneath the swimmer. Sometimes they brush against the swimmer's legs with such a light touch that the swimmer doesn't notice. Some are small, some big. Many times they impact with such force that the swimmer knows something substantial is there but still has no idea what it is or what its intentions are.

The swimmer has no idea how wide or how deep the lake is. The shore could be ten meters away or ten thousand kilometers away. But the swimmer imagines it is nothing more than a small swimming pool because this is a comforting thought. The lake is a thousand times larger than that, however. If only the swimmer knew more about the lake, at least a hint of its size and maybe something about the currents, waves, and creatures as well. But the blindfold prevents the swimmer from understanding anything. In an attempt to make sense of it all and feel less anxiety, the swimmer constantly thinks up reasons, explanations, possibilities, and excuses for why the waters move, why strange things move around below the surface, and why no shore is ever reached. The swimmer keeps swimming, year after year, never realizing the bizarre truth: The lake is the greater part of the swimmer's life. The lake is the swimmer, too.

Good thinking can't take us out of the vast lake that is our shadow brain. But that's okay. We wouldn't want to leave it even if that were possible. We need our shadow brains to be our automatic-reaction force, to keep our involuntary systems functioning, and to find countless mental shortcuts through daily life for us. But good thinking does one invaluable thing for us. It takes off the blindfold.

Good thinking allows us to understand that we are that swimmer and that the lake is us too, influencing and controlling us. No longer blind and ignorant, we can now pause to reason when it is appropriate and possible, such as the moment before we make important decisions. We can pay attention and try to determine if we swam a northerly course for sensible reasons or if it was one of the creatures below that nudged us northward for irrational or unknown reasons. We will always be in the center of the lake; the lake is us as much or more than anything else in our lives. We will always find ourselves treading water and swimming with or against the currents of our subconscious. Simply recognizing this reality, however, gives us greater control and provides more opportunities to make good decisions.

Are you in control of “you”? It is difficult to overstate the influence and impact of the shadow brain on one's life. You don't even live your life in real time. It typically takes a fraction of a second for your shadow brain to receive, process, and act on input from your senses, significantly faster than your conscious mind. So when a friend yells your name from across the street, a cat rubs against your leg, or your phone rings, “you” are the last to know. Weirder still, your other brain is first to act even when “you” decide to do something! Researchers have observed related brain activity occurring before a person makes the conscious decision to act.23 This means that at the precise moment you decide to stand up, for example, your shadow brain has already begun the cerebral processes related to standing up. It knew what you were going to decide and started working on it before “you” actually made the decision. What does this imply about free will? If your shadow brain has already decided to stand up before “you” did, then who is really doing the standing up? Who is in charge? Moreover, this “other you” that is so involved with your life is constantly feeding you input about how to react to the things, events, and people you encounter. Should you buy this or that? Is this person you just met okay or a bit too creepy? Trust me, while you might want to dither and ponder ethics or weigh a long list of pros and cons, the other you has already made the call. More often than not, “you” are relegated to the role of explainer in chief. You have to defend and make sense of decisions and actions your shadow brain is responsible for.

“To make our way in the world,” warns Yale psychologist John A. Bargh, “we must learn to come to terms with our unconscious self.”24 Otherwise we can do little better than flounder in the middle of a deep lake—blindfolded. While in pursuit of good thinking, we have to acknowledge that the shadow brain is always there, looking over our shoulder, paying attention when we are not, working through problems we gave up on, and making countless split-second decisions that we then take credit or blame for. It may be common for us to second-guess the words and deeds of others. But modern science has made it clear that we should be in the habit of second-guessing ourselves as well.

FALSE CONSENSUS EFFECT

It's reassuring and empowering when everyone agrees with us and, surprisingly, even when they don't. All who so much as dip a toe into social media are immediately subject to the false consensus effect. This psychological phenomenon is rampant, virtually unavoidable on Facebook, Twitter, Snapchat, Instagram, and so on. False consensus effect is yet another lie the shadow brain tells us to make us feel more comfortable with ourselves. It is the natural assumption, the stealthy bias that consistently bubbles up from down below to assure us that our thoughts, feelings, beliefs, attitudes, values, views, conclusions, and behaviors are not weird or unpopular—even when they are. False consensus effect is our tendency to overestimate how similar other people are to us. It is remarkable how common this bias is.

People often defend a pet belief or preferred conclusion by declaring that there must be something to it because “everyone believes it.” I've heard this defense tactic countless times on six continents. It inevitably comes up while discussing everything from conspiracy theories and ghosts to astrology and alien abductions. It is clear to me that many of these believers have exaggerated and unrealistic notions of just how many people share their beliefs and are imbued with heightened confidence if not outright certainty as a result. This is the bandwagon effect without the bandwagon.

The key to understanding this challenge is to recognize that we not only seek belonging and acceptance but want others to agree with us. We yearn not merely for a few comrades but for most or all good, sane, and smart people to agree with us. This makes perfect sense, because it makes us uncomfortable when good, sane, and smart people do not agree with us. Their contrary positions call into question the validity of our important beliefs and conclusions. They push us toward doubt and possibly revising or dropping beliefs, a journey many people don't wish to take. Enter the false consensus effect.

With a little help from the subconscious mind, we assume without much thought that everyone, or at least a significant number of people, see the world in the same way we do. Those who don't must be dim or defective in some way. I have fallen victim to the false consensus effect many times, and it can be jarring when reality comes crashing in. Because I have researched, written, and talked about irrational beliefs for so much of my life, many of these claims seem to me nearly impossible to be believed. When you know the real story behind the Roswell myth, for example, it can seem silly to take it seriously.25 And even though I consciously know that millions of people do believe it, my subconscious mind sometimes ignores the data and leads me to just assume the person I'm chatting with couldn't possibly accept the claim that extraterrestrials crash-landed in a New Mexico desert back in 1947, were scooped up by the US military, and are on ice today in some secret facility. As a positive, constructive skeptic who wants to be polite and respectful when it comes to the wild things people believe, I have to consciously push back against the false consensus effect and remind myself that my perspectives are not synchronized with everyone I encounter on the street or on social media, not even when it comes to the most absurd and unlikely beliefs. False consensus effect has tripped me up more than once on the topic of conservation, for instance. I am a lifelong fan of life. I think that biodiversity, the beautiful and vital mix of life on our planet, is the ultimate treasure of our world and that squandering it within the flash of a few generations is a tragic mistake and colossal crime. To me this seems apparent and undeniable, so I tend to automatically assume that others feel the same way. Clearly, however, not everyone does, as I rediscover again and again. What may seem obvious to you is not necessarily so to others. 
Keep this in mind and you will be able to mount some defense, at least, against the false consensus effect.

The false consensus effect runs rampant on social media because of the fractured/filtered nature of online communities. When a group of like-minded people agree and reinforce one another, confidence tends to soar. People within these groups naturally believe that most people outside the group agree with them as well. After all, they figure, everybody I'm communicating with thinks this way, so the same must hold true for all of those I'm not communicating with as well. False consensus effect distorts attitudes even when people know that the majority doesn't align with them, because they may then assume that enough people do agree to validate their ideas and beliefs. Related to this is our habit of thinking for the population. When we have no idea how many people believe a certain claim, for example, we commonly project our own knowledge, experience, values, and beliefs into the imagined minds of hundreds, thousands, millions of others to decide how they think or feel.

WATCH WHERE YOUR ANCHOR DROPS

The anchoring effect can cause our thinking to set sail for the wrong harbor. The human brain is demanding, one could say greedy, in its constant need for blood and oxygen. No other organ comes close to its requirements. The three-pound brain of a 150-pound adult accounts for just 2 percent of bodyweight but demands as much as 25 percent of the body's blood supply. The brain hungers for information in much the same way. The brain's mantra could be “keep it coming”; something is always preferable to nothing. Your shadow brain wants, craves, and covets input of any kind to serve you. Working behind the scenes, it is relentless in trying to help you stay alive and succeed. Most of the time it works out well. Sometimes, however, no good, accurate, or useful information is available. But the subconscious mind doesn't give up on its mission to serve you. So, for better or worse, it will seize the first thing available to work with. The absence of accurate, relevant, or timely information prior to decision making seems to be an uncomfortable or intolerable state for the subconscious mind; so, in a pinch, your shadow brain will make do with almost any extraneous input. The early information that hits a knowledge vacuum matters. It becomes the anchor around which later thinking and decision making grows.

A simple example of the anchoring bias can be found at car dealerships. A dealer puts a price on the car's window. That price is high, too high, but a typical potential buyer doesn't know how much the dealer actually paid for the vehicle wholesale, so a fair price is unknown. However, with that sticker price offered up as an anchor, the first bit of information, the buyer's subconscious has a starting point from which to work, and, the dealer hopes, it will sway the buyer toward a higher estimate of what a fair price would be.

It is remarkable, even scary, how anchoring bias can steer us toward answers and opinions with frivolous information. For example, if I were to mention or show you a large number, say 100,000, and later ask you to guess how many spoons or forks you have in your home, you likely would guess a higher number than if I had earlier exposed you to the number 10, 20, or some other lesser number. Because you didn't know how many silverware items you have, your subconscious mind would instantly scramble for input of any kind that might help you come up with a good answer. Having recently been exposed to a number, any number, could be enough to set your mind off in one direction over another—even though that first number had nothing to do with the silverware count.

To be clear, the anchoring effect is different from the mere exposure effect because it is based on our mind's willingness to rely on an anchor, a specific bit or bits of information, even though it is unrelated to the decision or task at hand. The mere exposure effect, by contrast, betrays our reasoning through nothing more than familiarity, nudging us toward a feeling or belief based on simple repeated exposure. The anchoring effect has been well tested and shown to be remarkably consistent. For example, an experiment by psychologists Amos Tversky and Daniel Kahneman had people estimate what percentage of African countries are in the United Nations, a question at which most people could only guess.26 But before guessing, participants had to spin a Las Vegas–style roulette wheel that was engineered to stop at the number 10 or the number 65. The test subjects were not told that this roulette-wheel spin and “winning” numbers had any connection to the percentage of African UN member nations. But those numbers dropped anchor in the subconscious minds of the wheel spinners anyway. Those who landed on 10 gave an average answer of 25 percent. Those who landed on 65, the higher number, gave an average answer of 45 percent. The meaningless whirl of a wheel shifted answers by twenty percentage points!
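The shape of the Tversky-Kahneman result can be sketched with a toy simulation. To be clear, this is not their procedure: the prior belief of 30 percent, the 0.4 anchor weight, and the noise range are all invented parameters, chosen only so that the toy roughly echoes the 25-versus-45 pattern they reported.

```python
import random

def anchored_guess(anchor, prior=30, anchor_weight=0.4, noise=5, rng=None):
    """Toy model of anchoring: a guess is a weighted blend of a vague
    prior belief and the irrelevant anchor, plus random noise.
    All parameter values here are illustrative assumptions."""
    rng = rng or random
    blended = (1 - anchor_weight) * prior + anchor_weight * anchor
    return max(0, min(100, blended + rng.uniform(-noise, noise)))

def mean_guess(anchor, trials=10_000, seed=42):
    """Average guess across many simulated participants shown one anchor."""
    rng = random.Random(seed)
    return sum(anchored_guess(anchor, rng=rng) for _ in range(trials)) / trials

low = mean_guess(10)   # wheel stopped on 10
high = mean_guess(65)  # wheel stopped on 65
print(round(low), round(high))  # the higher anchor drags estimates upward
```

In this model, anything that raises the anchor raises the average answer, even though the anchor carries no information about the question, which is the essence of the effect described above.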

The direct relevance of anchoring bias to social media comes into play when we consume news, posts, gossip, or images posted by friends and followers. The first input that hits your subconscious mind about a topic you are not reasonably familiar with sets the stage for your conscious thoughts and decisions about it. It may not matter if this information is inaccurate, exaggerated, biased, or completely irrelevant. It is input, and your subconscious mind will happily make use of it. The importance of first impressions cannot be overstated here. There is no cure for this challenge. Anchoring is a standard process of the human mind. So be warned: when you skim that News Feed on Facebook or scroll through tweets, your subconscious mind is sucking up input that may be put to use—beyond your awareness—when it's crunch time and you need to make a decision or form an opinion. The best we can do is know about the anchoring effect, respect it, and question ourselves at every turn. Use your mental faculties to seek out accurate, unbiased, and relevant information whenever and wherever possible to be more sensible and safer.

OTHERS ARE MORE EASILY FOOLED THAN I AM

Another significant challenge for social media users is coping with the third person effect. This effect is our natural tendency to overestimate the impact of information on others while underestimating its impact on ourselves; in other words, it gives us a false sense of superiority. It is important to be aware of this effect because it can make us feel overconfident about our analytical and reasoning skills and, as a result, be more vulnerable to bad information. Who among us hasn't sighed and shaken her head at the sad mental foibles of “those people”? Who hasn't wondered how in the world “they” can be so stupid? “Those guys” fall for anything, but I don't. Once again, a good general education or high IQ can't be relied on to save you from yourself. This is one more human cognitive trap no one can afford to ignore or dismiss, especially while wading around in the diverse content found on social media.

As I am writing this, one of the day's trending topics on social media is the work of “Prophet” Lethebo Rabalago, a South African Christian minister. During services he sprays what he claims is insecticide, “Doom Super Multi Insect Killer” to be specific, in the faces of his congregants to “heal them.”27 Several photos of one insecticide healing event are posted on his Facebook page. As one might expect, many people immediately condemned or mocked Rabalago on social media. Most of those making negative comments probably felt a powerful sense of intellectual superiority over Rabalago and his flock. They're dumb and I'm smart, would be the most basic likely thought. But how many of these condescending critics are themselves believers in outlandish claims and practices? Given the right timing and context, we all are capable of being nudged onto strange paths of delusion. Humility is in order, everywhere and always. It is easy to find the fool without but never so simple to spot the one within.

The advertising industry can teach us a lot about the third person effect. Most people assume that the radio, television, and Web ads that greet us at every turn and fill so much of our daily world have a negligible effect on us, or none at all. It's just visual and auditory clutter to be ignored. Sure, advertising works on some people; it must, or corporations wouldn't spend billions of dollars on it every year. But it doesn't work on you and me, right? We're too sharp for that. The reality, however, is that these ads can and do work on all of us, to one degree or another, as numerous studies have shown.28 Mere exposure—as in the mere exposure effect addressed earlier in this chapter—to a product or brand can make us more likely to buy it in the future. So while we assume advertising seduces, hypnotizes, or tricks all of those other gullible suckers but leaves us unscathed, it turns out that these ads are having an unnoticed effect on us. We like to imagine that we scan ads as cold, calculating robots who are impervious to deceptive words and images. We glance at ads just to make sure there isn't some new product we need or want or to see if there might be a great price to be taken advantage of. In reality, however, we are not cold robots; we are warm emotional pushovers being softened up in a way that makes us more likely to buy later.

One interesting component of the third person effect that everyone should be aware of is that it has been shown to have an even greater impact when we are confronted with information we don't know or care much about.29 For example, if you were to read an article or some Facebook post about a person, topic, or event about which you had thought little before, it would be natural for you to assume that this information would impact others a lot more than it does you. After all, you don't care, right? The third person effect can also be particularly potent when the source has or seems to have a negative bias.30 I suspect that this stems from our natural attraction to gossip. We find stories about others, especially those with a negative slant, to be irresistible. And for good reason. We use gossip to transmit and remember important information, especially warnings. Lisa Feldman Barrett, a Northeastern University professor of psychology, describes gossip as nothing less than a “vital thread in human social interaction.”31 Gossiping is perhaps the most effective means of sharing information about those we cannot trust or are at least suspicious of. Our gossip fixation, I believe, is one of the key reasons why so many absurd conspiracy theory claims are able to infect the minds of millions.

The important thing is to understand and be aware that information assumed to be irrelevant in the moment can have a significant impact on you later, should that person or topic come up again in a more important context. “You,” your conscious self, may not even remember those brief minutes, seconds, or fractions of seconds you spent on that article or post; but there is a good chance that your subconscious mind will remember and reference it toward “advising” you on how to think, act, or decide. In light of this, virtually everything one skims or reads has potential for later subconscious influence. Something to keep in mind.

Real for them, real for me? The conformity conundrum. Several years ago while working as a newspaper reporter, I covered the performance of a marginally popular faith healer. She shared her personal and emotional story of dying, going to heaven, and being sent back to Earth by Jesus to heal the sick. The extraordinary tale didn't come with any evidence, of course, but was fascinating nonetheless. And the audience certainly loved it. I watched as their excitement grew with each emotional high point in the story. At one moment during the presentation, a woman in the audience held up her hands and then reacted as if she had been electrocuted. I was close enough to hear her say after recovering that the “Holy Ghost” had touched her. This electrocution-like behavior spread around the room. Within minutes, several people indicated that they had felt the warm shock of the Holy Ghost.

I can't say for sure if an invisible supernatural entity was or was not in the room with us that night. But I do know enough about normal, natural human psychology to conclude that I most likely witnessed the power of conformity. We are social creatures. We tend to synchronize not only our physical behavior with those around us but even our deepest thoughts and feelings as well. There is no escaping this fundamental aspect of human existence. It's just how we big-brained primates roll. Most of us understand and agree with the obvious truth that family, friends, crowds, and culture influence us, but few recognize just how powerful and persuasive “the group” can be to our thinking processes as individuals.

Given its impact, we don't spend enough time contemplating our place in the hive. Sure, we think about status, rank, belonging, and rejection, but not nearly enough about how relationships and social networks influence our reasoning and decision making. Our connection to so many other minds, whether up close or via social media, has a great impact on our thinking, our perceptions, and our lives. The herd-like nature of social media can be disturbing when viewed from the outside. When I step back and think about the Facebook universe, for example, I see many vast digitized subcultures with their own particular slants, moods, and flavors. One of the primary reasons filter bubbles are such a problem in social media is that we are herd animals, more cow than bear. Social psychologists have shown over and over how easily group membership can influence our thinking—much of it taking place behind the curtains of our awareness. No human should go anywhere near social media without a keen awareness of this.

It is unsettling how social influence can alter perceptions of reality within a person's mind. Please do not underestimate this. Do the work to maintain independent thinking. Be brave enough to doubt and honest enough to change your mind when necessary. Solomon Asch's classic 1951 experiment presented college students with the task of making simple perceptual judgments about the length of lines on a sheet of paper. Unknown to them, however, was the presence of hired actors among them posing as fellow test subjects. The task was simple enough: match the length of a given line to one of three comparison lines. No tricks or optical illusions. The correct answer was obvious, or should have been. But again and again the actors chose the wrong line, and a large percentage of test subjects went along and agreed with them.32 This should disturb and scare all thoughtful people who value truth and reality. There was a bright spot in the experiment, however. When one actor was tapped to play the lone dissenter and choose the correct line, the rate at which test subjects selected the wrong line fell sharply. I take hope and inspiration from that. One sensible voice in a sea of madness can sometimes make a difference.

It is difficult to know whether people who chose the wrong line truly believed their answer due to misperception or were lying to conform. But does it matter? When we are surrounded out on the streets and inside social media by a crowd of people who can't or won't keep themselves grounded in even the simplest of realities, what does it do to our own private perceptions of reality? When we see and admire the emperor's fine clothes, how can we be sure that he's not actually nude and that we have been hoodwinked by the intimidating weight of majority opinion?

FALLING DOWN BY LOOKING UP

Authority bias is the tendency we have to trust in and be influenced by someone perceived to be an authority figure. Its grip can be so tight on us that in some situations we will believe, do, or buy virtually anything. This psychological phenomenon is not a workplace problem exclusively. This is a life problem. It happens everywhere, all the time. It is the reason actors wear white lab coats while telling us how white our teeth can be on TV commercials. Authority bias is an amazing thing to behold. Just about anything becomes credible and can be believed if the person speaking about it has some attachment to the magic of authority. We constantly underestimate the sway this bias has on us, often at the cost of money, time, dignity, or safety.

Authority bias is a significant problem in social media because of the way we endorse and share news, comments, and images. The mere act of receiving a news report from a trusted, intelligent friend on Facebook can infuse that article with the pixie dust of authority. Add to that the additional challenge of seeing through the silent authority claim that comes with information presented in the form of a news article created and published by a person or a company that wants you to believe and accept its product.

The authority bias is a slippery challenge in current times because traditional authority is under attack in so many spheres. But regardless of whether you already hate big government, don't trust the police, or are in the process of rebelling against formal religion, you need to be on top of this problem because authority bias is unbound by social borders or traditions. It comes at you from every direction. For example, Alex Jones, host of the InfoWars podcast and website, is a tireless promoter of some of the most absurd conspiracy claims. But he is nothing less than a trusted authority figure to his fans who don't trust mainstream news sources and government authority figures. I've never been to an anarchist pre-protest planning meeting, but I wouldn't be surprised if it's typical for someone in the room to assume an authoritative role, just to get things going.

Closely related to the problem of placing excessive trust in authority is what I call the bleed-over effect. An impressive title, rank, or credentials in one thing doesn't necessarily make one an expert in other things. Don't let someone's expertise bleed over to other topics without justification. This is a big problem on social media, where I see many people citing and quoting people as definitive voices of authority on something, even though they carry no real weight on that topic. An expert in one thing is not an expert in everything. Vague titles aren't enough. One can get a PhD in virtually anything these days—and from the comfort of one's couch. A chiropractor may wear a white lab coat and be called “Doctor,” but we can't assume that he or she went to an accredited medical school like doctors who work in hospitals do. A security guard may wear a law-enforcement-style uniform and carry impressive tactical gear, but it doesn't necessarily mean he knows the law. All of this may seem obvious to you now, but it's easy to be overconfident in the moment and let down your guard. We forget the danger at key moments because of our innate attraction to and weakness for authority. Your shadow brain is more impressionable than you are. Isaac Newton may well have been the greatest genius of all time. But not everything he claimed can be trusted. For example, he seemed certain that the Earth would end by supernatural means one day. Great as Newton was at mathematics and astrophysics, I don't place any trust in his doomsday projections, because—unlike his work in math and celestial mechanics—he offered no compelling evidence in support of them.

To be clear, I should add here that it is almost always wise to listen to and consider new ideas and claims from anyone, regardless of their expertise. You can make a fair hearing brief, of course, and you don't want to waste time or brainpower going over the same nonsense again and again. But don't reflexively shut out everything that comes at you just because it may lack impressive credentials. Just as experts are sometimes wrong, so too can non-experts be correct. Superior critical thinkers work to maintain open minds.

Giving due respect and attention to those who have earned it with hard work and an impressive track record is necessary for efficiency in seeking good information and coming to the best conclusions. But never should we assume that truth and reality can be accurately assessed by the credentials of the messenger alone. Even when the source of information we are considering is legitimate, with real and obvious authority or expertise, we still can't afford to sleepwalk or go on autopilot. If it is about something important to you—the fair price of a house, which college to apply to, the fate of the planet, and so on—then don't rely on a single expert opinion, no matter how impressive that expert is. Second opinions are important not only for serious medical procedures. They can be valuable in all aspects of life.

It may be deep within us to fixate on authority figures and believe what they say, but you don't have to be an obedient little space monkey and snap to attention every time someone wearing a uniform, lab coat, or funny hat tells you what to believe, do, or buy. Always consider the claim in isolation, if only for a moment, as if it fell from the sky or emerged from the soil, unattached to anyone or anything. Develop this unnatural but necessary habit of seeing through the masks, titles, and crowns. With deliberate effort, listen to a claim, order, pitch, or promise. In your mind, separate it from the source as a way to push back against authority bias. Ask yourself if it would make as much sense coming from a pauper as it would from a prince. If not, there may be a problem.

WHEN FINDING OUT THAT YOU ARE WRONG MEANS KNOWING YOU ARE RIGHT

Have you ever wondered, in a fit of red-faced frustration, why some people, when they are wrong, can't seem to recognize obvious facts and change their minds accordingly? Of course you have. Even worse, some people seem to become more confident about their bad idea or incorrect conclusion, no matter how many facts or how much evidence you present to them. It's like throwing wood on a fire when you are trying to put out the fire. What is wrong with these people? There are two things you need to know about this: (1) Nothing is wrong with “these people” other than the fact that they are people. (2) You are one of “these people.”

There is a name for this problem. It is called the backfire effect, and most people are not only unaware of it but also have no idea how vulnerable they are to this common failure of reasoning. Contradictory as it seems, seeing evidence of a new reality before us that contradicts what we already think does not necessarily clear up our thinking and make us wiser. In fact, it can often make us dumber than we were before we encountered the new knowledge. This natural inclination of the shadow brain will have you feeling more confident about a position you hold, even as you encounter accurate and overwhelming evidence that says you're wrong.

“Giving people factual information isn't as convincing as people often think,” says Brendan Nyhan, a social scientist who researches the confounding backfire effect.33 Having once worked for a fact-checking website, Nyhan saw firsthand how difficult it can be for people to change their minds in the face of compelling evidence that warrants it. Contrary to what you might assume, it is not unusual for us to do an about-face and move in the opposite direction of reason and evidence. In the upside-down world of your brain on the backfire effect, exposure to more knowledge can make you dumber. An encounter with new information that is real and true can push you farther away from truth and reality.

Why would we react to good evidence by maintaining our belief with even greater conviction than we had before discovering the new evidence? Most would agree that teaching, debunking, enlightening, and exposing truth sound like great concepts. But they can cause us to tighten our cognitive grip on a belief or idea that is important to us. The shadow brain does this as a means of keeping us feeling stable and content. Reality be damned, we're not letting go, cries the subconscious brain. This bias raises disturbing thoughts for someone, like me, who for many years has worked hard trying to bring more light and reason to the world. Have my books about critical thinking and science caused some readers to double down on a nonsense belief? Thanks to the backfire effect, this is probably so.

The backfire effect helps to explain why so many online debates are a waste of time. How often does someone on Facebook or Twitter write: “Wow, the evidence and logic you shared has changed my mind. Thanks!” No, in almost every case, heated exchanges about politics, the existence of gods, or the morality of twerking result in nothing more than people on both sides digging in their heels and walking away feeling more confident and committed to their original belief than ever before. Being aware of the backfire effect will help you during engagements with people—friends and foes alike—on social media. Understand that failing to see the obvious is merely human. Much begins to make sense when we comprehend the power and influence of the backfire effect. Your frustration with others is likely to lessen as well. Where you once pulled your hair out and wondered if this person was insane or if that person was maliciously pretending to hold a nonsensical position just to irritate you, now you can accept that this is normal behavior, something to be expected. I have been called names and wrongly accused of bad things a few times on social media but was able to process it and react to it with what I feel is a high degree of maturity and wisdom, because I understand just how easy it is for anyone to become overconfident and feel excessive passion for a lost cause. Yes, brain awareness can make us kinder and more forgiving. Consider this a nontrivial fringe benefit.

An understanding of the backfire effect can also be great motivation to be more accepting of awkward evidence and inconvenient ideas. The simple act of remembering that your shadow brain is consistently trying to “protect” your positions and your feelings about them by infusing your conscious mind with unjustified confidence and a distorted perception of the sensibility of your beliefs can keep you on your cognitive toes.

“As social media and advertising progresses, confirmation bias and the backfire effect will become more and more difficult to overcome,” warns David McRaney, author of You Are Not So Smart.34 “You will have more opportunities to pick and choose the kind of information which gets into your head along with the kinds of outlets you trust to give you that information.” He continues:

Advertisers will continue to adapt, not only generating ads based on what they know about you, but creating advertising strategies on the fly based on what has and has not worked on you so far. The media of the future may be delivered based not only on your preferences, but on how you vote, where you grew up, your mood, the time of day or year—every element of you which can be quantified. In a world where everything comes to you on demand, your beliefs may never be challenged. As information technology progresses, the behaviors you are most likely to engage in when it comes to belief, dogma, politics and ideology seem to remain fixed.35

I have learned through experience over the years that it is not enough to explain to someone why it is not rational to believe in extraordinary claims that aren't backed up with a lot of very good evidence. Based on the best current arguments and evidence, for example, it makes no sense to assert that psychics, demons, alien abductions, Bigfoot, and the Loch Ness monster are real. These and other such claims could be true but probably are not, because to date no one has been able to show that they are true. But deconstructing weak arguments and pointing to the absence of evidence is not enough, not in light of the backfire effect. The best way to counter such staunch support, I think, is to explain a few relevant brain processes that might be helping to support the claim. Knowledge of how human vision and memory work goes a long way, for example, toward explaining why many people believe that they have seen alien spaceships in the sky. It helps to let people know that supernatural or paranormal claims are not the only riddles to be solved. In fact, they are the easiest of all. People are better off, and wiser, when they recognize that the human brain presents the first hurdle and greatest obstacle toward reaching reasoned and intellectually sound destinations.

CAN WE INOCULATE OURSELVES AGAINST BOGUS CLAIMS?

Dartmouth College associate professor Joshua Compton specializes in inoculation theory, a process that allows us to strengthen ourselves against being persuaded to believe in nonsense and accept bad ideas by exposure to limited or weak versions of them beforehand. “Inoculation theory as it applies to persuasion and influence is based on a process that's actually pretty familiar to most people—medical inoculation,” Compton told me.36 “In a typical medical inoculation, we get exposed to a weak version of a threat, like a flu virus, to prepare our body to fight off stronger versions that we might encounter later. And that's how persuasion inoculation works, too. When we get exposed to a weak version of a threat, like a persuasive argument, we get prepared to fight off stronger versions that we might encounter later.”

Compton says that fifty-plus years of research shows that it works. Pre-exposure to weakened counterarguments, an intellectual vaccination, really does give us greater resistance to stronger counterarguments later: “One of the biggest takeaways from inoculation scholarship is that, in terms of resistance, it's usually better to confront counterarguments head on than it is to try to avoid them.”

Compton thinks inoculation theory might prove useful in mitigating problems associated with filter bubbles and fake news. “Inoculation theory would suggest that exposure that is limited to just one side of an issue can result in what some attitude researchers have called a paper tiger effect—a position that looks really strong, but only looks strong because it's never been strongly challenged. These beliefs can crumple under the pressure of persuasive arguments. Inoculation, on the other hand, shores up these positions—challenges and bolsters them prior to exposure to stronger persuasive arguments encountered later.”

“Inoculation theory suggests that attitudes and beliefs must be robust to fend off persuasive attacks on them—and that one way to make an attitude or belief more robust is to challenge it with weaker attacks before exposure to stronger ones,” Compton continued. “Inoculation theory would seemingly work, then, on fake news about specific issues, like climate change, or perhaps even the very genre of fake news itself. The key is to not be taken by surprise by the threat, so that means being exposed to different opinions and beliefs, and study and analysis of these different opinions and beliefs, before exposure to persuasive attacks.”

But wait, can a person be inoculated against good ideas? Unfortunately, inoculation theory is not a perfect fix—because it can work both ways.

“Inoculation doesn't guarantee a rational, informed response to persuasion, of course. The same process that can make a healthy position more robust could make an unhealthy position more robust, too. Some of the evidence does strongly suggest that inoculation elicits more thinking about an issue—and we hope that this thinking is informed by reason and fact, but that's not necessarily so. The one finding that probably gives me the most hope that inoculation leads to better thinking is that inoculation seems to motivate people—and equip them—to talk about the issue with others, in more conversations, with more conversational partners. If this boost to dialogue is leading to exposure to more perspectives, then that seems quite healthy to me. The strong effects of such inoculation—resistance effects, persuasion effects, increased talk about the issue, a boost to involvement and participatory behaviors—and the resilience of these effects—[can last] for weeks, if not months, if not years…. That keeps me hard at work on better understanding not just processes of resistance to influence, but also, the roles of facts and evidence and logic and truth in resistance to influence. Informed resistance seems a better goal than resistance alone.”

IT IS RATIONAL TO FEAR IRRATIONAL FEAR

Imagine a terrorist, some ski-mask-wearing, faith-fueled zealot with an AK-47, running toward you on a sidewalk. Now look back over your shoulder. That's not an approaching freight train; it's a tornado heading your way. See with your mind a ten-foot shark slicing effortlessly around you as you tread water far from the beach. Its dark-gray dorsal and top tail fin just break the surface. Finally, pull up a scene in your head of an intruder in your house, late at night, who has come to murder you. Scary stuff, right? Now imagine three more images: a severely depressed person sitting alone in a room; an automobile; a pack of cigarettes on a table. These three images are not nearly as scary as the first few. But why is this the case? Look at the data in the box below and ask yourself why we allow ourselves to become so easily terrified by some things more than others, even when the numbers don't support our fears.

[Box: comparative death statistics]

The reason fear does not always align well with reality is that fear is a deep, fast emotion. When there is an immediate, grave danger, the luxuries of time and deep intellectual reflection are not practical. If the boulder is falling above your head, you need to jump out of the way now, not after you think about the size and weight of the boulder and what kind of damage it might do to your cranium. Terrorism terrifies us more than car crashes, despite the numbers, because our brains have evolved to panic over the threat of another life-form intentionally killing us. That intense fear of predators made sense for most of human and pre-human existence. Automobiles and cigarettes may kill more people, but it's not personal and they have not been around long enough to spark any similar primal horror within us. Intense fear surges up from a place in the brain that lies far below higher reasoning.

Fear is the necessary monster that lives in the darkest part of our shadow brain. Though it makes us uncomfortable and often causes us to worry needlessly and make bad decisions, we need our capacity for fast and thoughtless fear.

Without instinctual, emotional fear, we all would spend short lives walking across busy highways, throwing knives at each other for sport, and picking fights with UFC champions—and humankind would not last long. Fear evolved to help keep us alive. Yesterday's primates who lacked sufficient fear no doubt were culled by predators, generation after generation. Their genes, for the most part, have been lost to the distant past. It is to this process that we owe our modern anxiety-riddled and fearful minds. Unfortunately, fear is not always useful. Because it must be automatic and fast in order to work for us when it counts, fear is often inappropriate, misguided, and just plain unnecessary. Many of the natural fear-related instincts that were developed so long ago can sometimes be distracting and even destructive today. “Our evolutionary baggage encourages us to fear certain things because they comprised a reasonable assessment of what was harmful to our ancestors millions of years ago,” explains UCLA neuroscientist Dean Buonomano. “But how appropriate are the prehistoric whispers of our genes in the modern world? Not very.”38

Good thinking in the social media universe requires you to strive for a reasonably realistic connection between the degree of fear felt and the actual danger posed by the stimulus causing it. To do this, two things must happen. First, you must recognize when your fear button is being pushed by digital input. If an image, post, or news article alarms you, angers you, or makes you feel intensely uneasy, acknowledge what your deeper brain is experiencing and the message it is sending you. The second step is to do the work of trying to figure out whether or not this fear is based on a credible threat. Take this seriously. Those who do not are soft clay in the hands of all who would use fear to manipulate. I'm a believer in fearing fear. Given how often it is used by governments, demagogues, corporations, religious leaders, and so on to influence or control people, it is rational to have a healthy respect for fear. The solution to our irrational fears is not bravery; it is good thinking. We can defuse irrational fear to feel better and become more sensible by watching out for it, recognizing it, analyzing it, and, when appropriate, rejecting it. We can and should become better at distinguishing credible reasons to be afraid from all of the irrational and bogus stimuli that scare us. The latter, by the way, is often in the service of someone else's selfish agenda.

Bo Bennett, author, educator, and strong advocate for critical thinking, points to the availability heuristic as a key culprit behind many of our fear-based judgment errors. “The availability heuristic is the tendency to overestimate the likelihood of events with greater ‘availability’ in memory,” he said, “which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.”39 He continues:

Most people who are terrified of being killed by terrorists don't lose a moment of sleep about being killed by lightning, even though they are many times more likely to die from being struck by lightning. Why? Because terrorist attacks are emotionally charged events that, as wrong as it sounds, have high entertainment value. When they occur, these events dominate the media for days, and sometimes months, like in the case of the 9/11 attacks. Based on this level of exposure, our emotions overpower our reasoning, and what is factually a statistical impossibility becomes a perceived “strong possibility.” This bias affects our views on other issues such as white cops killing unarmed black men (if you consume more liberal media), or acts of heroism by cops (if you consume mostly conservative media). Besides making sure you get your information from an array of credible sources, realize that all media report on information that is interesting or entertaining, which skews our perception of the world.

The anatomy of fear. Two small but influential bundles of neurons are positioned about four inches directly behind each of your eyeballs. The amygdalae are ground zero for your deepest fears. Because of the significant impact the amygdalae often have on our conscious thinking, everyone should know something about what they are and how they work. About the same size and shape as an almond, an amygdala is panic central, a fear factory, our primal repository of terror. The amygdalae work by receiving input from the senses and making lightning-fast reactions, when deemed appropriate, to put you into a state of fear.

The amygdalae are relevant to social media use because your modern brain doesn't know that it is a modern brain. It never got the memo about the last several thousand years having happened. Civilization, urbanization, and computerization just kind of happened all of a sudden, and your brain hasn't caught up yet. Vast portions of our twenty-first-century brains continue to function as if we were still in the prehistoric world—especially those amygdalae. This is the reason we so often react and think in such bizarre and irrational ways when afraid. Our decision-making abilities are compromised when under the influence of fear. Subconsciously, you may respond to a scary tweet about the general threat of terrorism in a way that is similar to how a Homo erectus individual a million years ago might have responded if she caught a glimpse of a four-hundred-pound crouching predator cat half hidden in the bushes fifty feet away. But these two very different scenarios do not warrant the same automatic reaction. The Homo erectus would likely be helped by an instant fight-or-flight response to being in the presence of a threatening predator. Without time-consuming conscious thought, her heart rate rises in preparation for physical effort (such as running or fighting), metabolism cranks up to high speed, muscles tighten, and pupils widen to maximize vision. This is all good, because being ready for action could be the difference between life and death for her. But a tweet about terrorism in the twenty-first century is a much different scenario.

A general comment about terrorists killing random people may be cause for concern, but the physical impact and mental distractions of the fight-or-flight response are unnecessary. They are worse than unnecessary because the fear response degrades, corrupts, and smothers good thinking. Rational decision making becomes significantly more difficult, sometimes impossible, when your heart is racing, hands are trembling, and amygdalae scream “red alert!” No actual living terrorist is fifty feet away. The tweet cannot kill you. Adrenaline and widened pupils won't help you assess the information's credibility and relevance to you. Potentially important information comes at us constantly in daily life, and as much of it as possible needs to be analyzed, considered, and processed by the prefrontal cortex, the very part of the brain that intense fear relegates to the sidelines. We don't want to miss warnings about real and immediate dangers, nor do we want to waste time and energy reacting to bogus or distant threats. When any kind of information scares us, the best reaction is to remember that words are not a crouching lion with sharp claws and dagger-like teeth. We do not need to run from or physically fight words. The better option is to calm down and defend ourselves with the human brain's higher reasoning abilities.

TEN GOOD BOOKS FOR GOOD THINKING

  • The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan (New York: Random House, 1995).
  • Brain Bugs: How the Brain's Flaws Shape Our Lives, by Dean Buonomano (New York: W. W. Norton, 2011).
  • How to Think about Weird Things: Critical Thinking for a New Age, 6th ed., by Theodore Schick and Lewis Vaughn (New York: McGraw-Hill, 2010).
  • Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School, by John Medina (Seattle: Pear, 2014).
  • The Brain: The Story of You, by David Eagleman (New York: Pantheon, 2015).
  • Caveman Logic: The Persistence of Primitive Thinking in a Modern World, by Hank Davis (Amherst, NY: Prometheus Books, 2009).
  • Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit, by Jonathan C. Smith (Hoboken, NJ: Wiley-Blackwell, 2009).
  • The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths, by Michael Shermer (New York: Times Books, 2011).
  • Thinking, Fast and Slow, by Daniel Kahneman (New York: Farrar, Straus and Giroux, 2013).
  • You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself, by David McRaney (New York: Gotham, 2013).

When a politician stokes fires of panic deep in our brains with exaggerated talk of war or financial ruin, many people pay attention. They worry, and if they react with any kind of action, it is often an ill-conceived and irrational response, because human brains are less analytical and contemplative when fear takes over. This is why it is so important that we consciously monitor our fear and tame or reconsider how we react to it.

Crime, unemployment, economic and moral collapse, terrorism, and various supernatural doomsdays have all been standard fodder for speeches and soundbites by many political candidates. “If a politician can unnerve the electorate, he or she can obscure the facts,” warns Timothy J. Redmond, an adjunct professor at Daemen College. “And if a politician can obscure the facts he or she may be able to mobilize the public to support policies that the latter will come to regret. For as the English poet Samuel Taylor Coleridge noted, ‘In politics, what begins in fear usually ends up in folly.’”40 Don't fall prey to every scary post or news article that comes your way on social media. Quiet the siren and think. It is neither impossible nor even all that difficult for you to step back from the fearmongering, recognize it for what it is, consult with reality in the form of reliable news and credible statistics, and tame the instinctual fear response within that finds such exaggerated warnings so believable and disturbing.

HOW TO THINK LIKE A SCIENTIST

It is a common misconception that science is something reserved for scientists and is therefore off-limits or somehow beyond the reach of mere civilians. Nonsense. Science is a tool, a human invention available to all of us to use down in the trenches of our daily lives. Each of us—regardless of education, income, nationality, subculture, or age—can and should apply the scientific method often. Everyone stands to benefit by maintaining the intellectual stance of a good scientist. Remember that science is by far the best way we have to explore, discover, learn, and confirm the accuracy of claims. Why not use it for yourself?

What follows here is a simple, no-frills guide for doing science in the real world. This is the scientific method stripped down, simplified, and humanized for everyday use by anyone. Become familiar with it and apply some or all of it often so that you may better identify and step clear of mistakes, frauds, and all of those crazy claims that stalk us in daily life. Fortunately, it works as well online as off-line, so keep it in mind while surfing the Web and engaging with all of those friends, followers, trolls, and bots who might lead you astray on social media. Science works, so why not use it?

I. React like a scientist. The most crucial step is the first step. When your travels throughout social media drop something important or strange in your lap, pause and think before making any decisive intellectual moves. Avoid granting immediate and total acceptance to anything that matters. Hold back, at least for a moment. Good scientists do not accept any big claims at first contact, and neither should you. They wait, they doubt, they consider, they check, they withhold judgment. And, even if everything checks out, they only accept the claim or idea to a degree. Everything is provisional in science; every conclusion is held on a tentative basis, no matter how overwhelming the current evidence may be. Revision is always welcome, no matter if it is unexpected, uncomfortable, or even embarrassing. The goal of science—and good thinking—is always to get to what is real and true, whatever that may turn out to be.

Adopting a scientific stance will serve you well on social media.

II. Do your research. Given the right context, a bad idea can take on the look and feel of legitimacy, at least at first glance. Even when we judge a source to be sensible, sober, honest, and reliable, we still cannot immediately accept unusual claims, not before doing some minimal research. Check everything that is weird, important, or new before believing it. The hard truth is that human beings cannot be trusted unconditionally when it comes to sharing information. People lie; people become confused; and even the most sensible and honest among us often get facts wrong. Therefore, we must check to see if the claim holds up to scrutiny.

Researching a social media post, image, or news share is usually an easy process. Most bad ideas and false claims wither and die when exposed to even a tiny bit of light. The only reason so many ridiculous lies and painful mistakes are able to rage through social media like wildfires is because sufficient numbers of people are intellectually lazy and pass them on unopposed.

DOUBT IS GOOD

Doubt everything, more so initially and less later should the claim earn it, but keep doubting to an appropriate degree. Think of your doubts as bricks in a protective wall around you, one capable of holding back all of those barbarians at the gate. It is doubt that prevents us from embracing every bad idea and harmful lie that comes along.

Embracing doubt is not negative. Being skeptical doesn't mean closing your mind or being cynical. Applied properly, doubt is profoundly positive, the best protection we have against deception and delusion. Listening to and considering new and extraordinary ideas is generally a good idea. It does not become a problem so long as you always give a fair hearing to the skeptical voice within. Doubt is a pillar of wisdom. It lies at the core of good thinking and is essential to maneuvering through social media as the most rational, truth-seeking, reality-based person you can be.

One of the first things scientists do when confronted with a new idea is to explore what other sources are saying about it. Why make mistakes that others have already made? Why waste time and energy traveling down dead-end roads when others can tell you what's ahead? Do an online search to find out more information about the claim or news story. Do not, however, simply search the name of the claim or general topic. Doing this may present you with nothing more than sites and articles that stand to gain something from the claim in question. Once again, remember that Google is not your personal infallible, all-knowing god. It's a search engine. So make sure to do searches with keywords such as “investigation,” “skeptic,” “hoax,” “fraud,” “controversy,” and “scam” along with your topic name. It's possible, of course, that this could just as easily leave you with a list of negative sources that also are not credible. Everybody and everything has haters. When you find information both for and against the claim, then the next step is to consider the sources. What kind of sites and people are confirming the topic under consideration, and what kind of sites and people are condemning it? Not always, but often a clear pattern emerges. If two podcasters and a supplement company are pushing some new miracle brain pills, but Consumer Reports, the Food and Drug Administration, and Skeptic and Skeptical Inquirer magazines have declared them a scam or at best an unverified claim, then it should be obvious which way to lean.

After checking for mainstream news articles by reputable journalism sources, do a specific search for articles that have been published in reputable, peer-reviewed science journals. What are scientists saying about the claim? Does it seem as if the scientific community is convinced that there is something to it? If not, ask yourself why not. Even the best scientists are not perfect; science is not free from corruption; and the absence of a consensus does not necessarily disprove anything. But understand that scientists make their living by dealing in reality, figuring out what works and what doesn't. Careers advance by blazing new knowledge trails using the light of evidence to guide them. It's a clear warning sign when scientists have not confirmed something despite it having been around for a while. But what if the claim or idea is so new that no scientist has had an opportunity to study it yet? This too is a reason to be wary.

In the case of little or no scientific research having been published, the claim may still be true, but a wait-and-see approach is probably the best course of action. Rarely do you have to accept or reject an extraordinary claim immediately. It is okay to defer a conclusion, to wait for more data. “I don't know at this time” is a perfectly reasonable position to take when warranted. It's often the only honest and sensible position when confronted with new ideas.

Finally, ask around. Revert to old-school tactics and communicate directly with other people. Isn't this just what social media is supposed to be for? Connect and communicate; ask questions about a claim before you believe it. Of course, this is not completely reliable either, because those you ask could be clueless or deluded about the particular claim or news item. But it's worth trying.

Scientists rely on the work of other scientists to help them save time, work efficiently, and avoid stumbles. Always research a claim before believing it. Do smart searches and reach out to others to see if they know something about it that you don't.

III. Come up with your own hypothesis or theory. Try to explain how a claim might be true or false. Speculate. Imagine. Could that blurry Bigfoot photo your friend just posted to Instagram be his wife in a cheap Halloween costume? Think of possible alternate explanations for the weird claim you just read on Twitter. Maybe the person who tweeted it is sincere but gullible, smart but drunk, or a liar—or maybe he just made an honest error in judgment.

When confronted with an outrageous or weird news article, think it through and consider alternate explanations. Yes, maybe the Loch Ness monster really was hooked and boated in Scotland yesterday. But consider this hypothesis: Maybe this is the product of a snickering, teenaged carnival barker somewhere in Moldova fishing for clicks by any means necessary in order to get paid.

Imagine possible explanations for an idea or claim, then narrow them down to determine which of them is more likely to be true. This mental exercise can help you analyze and deepen your view of the topic, while also better positioning you to reach a well-reasoned conclusion later.

IV. Conduct your own experiment or test. Some people love to talk all day about how confident they are in the validity of ghosts, psychic readings, and other such claims. Believe me, I know. I'm the guy who so often politely listens to the stories and takes notes. But, at the end of the day, talk is just talk. Prove it, says the good scientist. The good thinker asks: How do you know? Sometimes the best way to get to the truth is by running an experiment or two. Test the claim and respect the results. In short, do science. Experiments don't just happen in laboratories, of course. Many people already conduct experiments all the time without thinking much about it. They just need to be more thoughtful and methodical about it. When a typical person is looking to buy a car, for example, it's customary to take it out for a test drive. That can loosely be considered an experiment.

It's not always easy. Figuring out how to test a claim can be difficult, but it is doable more often than not. It is important to keep in mind that most claims we encounter in daily life are positive, meaning they declare that something exists, is true, or works in a certain way. Your challenge is not as hard as you might think, because you don't have to disprove the claim. You just want to see if an experiment can verify the claim as true. If the experiment fails, the claim might still be true, but you now have a reason to be more skeptical. For example, if one of my Twitter followers declares that there is a real goddess who “grants wishes to anyone who sincerely asks for good things,” then I can easily test this claim. If I'm tempted to believe the claim, then it's especially crucial that I conduct an experiment. Accepting it as true without any confirmation would mean I'm just another goofball willing to believe anything. To test this claim, I might recruit some people to help me and instruct them to close their eyes and make a wish to this goddess for an immediate end to global child hunger and malnutrition. If that doesn't happen, then I would have a stronger reason to remain skeptical of the claim and deny it entry into my own personal pantheon of beliefs. However, I still wouldn't go so far as to declare that I have absolute knowledge of the goddess's nonexistence or impotence when it comes to granting wishes. Who knows? Maybe this particular goddess is real and maybe she really does grant wishes, but she was busy that day, or maybe my volunteers didn't project their wishes correctly. I'll keep an open mind, but I won't believe in her, in part because the experiment failed.

Don't forget about the second step in this process: Do the research. Find out as much as you can about what has already been done. The claim you are curious about or tempted to believe in may have already been tested by someone somewhere. Search around. Skeptical Inquirer and Skeptic have excellent, searchable online archives. Survey the work of longtime skeptics James Randi, Ben Radford, Joe Nickell, and others, to see if they have already tackled the claim. If it's a news item, check with Snopes.com and also see what older, respected news media companies are reporting.

If you find relevant studies or experiments published in journals or elsewhere, a key point to consider is sample size. In general, experiments that involve testing something with people benefit from larger sample sizes. A tiny pool of test subjects makes it easier for results to become skewed and lead to false conclusions. One or two outliers can throw the small data set out of whack. For example, a therapeutic touch study with a sample size of four would be suspect based on small sample size alone. Finally, check to see if the experiment was double-blind and placebo-controlled. This is the gold standard for experiments because it reduces the chance of unintentional bias by the researchers.
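The arithmetic behind this warning is easy to check for yourself. Here is a minimal sketch, in Python with invented numbers, of how a single outlier can dominate a four-subject study while barely registering in a forty-subject one:

```python
# Toy illustration of why tiny samples are fragile. All numbers are
# invented: suppose a treatment truly does nothing, and nearly every
# subject reports a relief score of about 2 on a 0-10 scale.

def mean(values):
    """Average of a list of numbers."""
    return sum(values) / len(values)

small_sample = [2, 2, 2, 10]      # four subjects, one enthusiastic outlier
large_sample = [2] * 39 + [10]    # same single outlier in a pool of forty

print(mean(small_sample))  # 4.0 -- the outlier doubles the apparent effect
print(mean(large_sample))  # 2.2 -- the outlier barely moves the needle
```

The identical outlier doubles the apparent effect in the four-person group but shifts the forty-person average by only a fifth of a point, which is one reason researchers treat results from tiny samples with suspicion.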

One experiment is not the final word. Experiments are meant to be repeated. Many things can go wrong. If possible, do it again and again. Have someone else conduct your test and see if the results are the same. Of course, you should weigh the cost in time, energy, and resources and determine whether a claim is worth the trouble. If someone claims to have invented a new exercise that will render me immortal, I'll probably not be willing to invest much time thinking about it because it's so very unlikely to be true. However, if someone tells me that they developed a new pullup technique guaranteed to add five reps to my personal best, I might want to check that out.

Do science. Test unusual claims by conducting experiments when possible. Find out if others have already tested or investigated the topic.

V. Share your conclusion. After pausing, doubting, hypothesizing, analyzing, researching, and testing a claim, it's time to make your decision about accepting or rejecting it. Remember that rejection in the scientific and good-thinking sense does not mean you necessarily have disproved or even denied that the claim is true, only that you are unconvinced at this time.

Once you decide to accept or reject the claim, you still aren't finished. Now it's time to ask other people to look for errors in your reasoning. Share your conclusion with the smartest, most sensible people you know. Let others consider, challenge, and pick apart your conclusion or decision. This is the street version of the tried-and-tested publication and peer-review process that helps make science so productive. Scientists put their work out in the light where other scientists can review it, tear it apart if possible, and repeat experiments toward confirming the result or exposing its flaws. When a scientist searches for errors in the work of another scientist, she or he is not engaging in devious backstabbing. It's just science.

When it comes to your important beliefs, conclusions, and ideas, the more good thinkers you can recruit to double-check your thinking, the better.

Finally, it is vital that you consider whatever input you receive from others and be willing to change your mind if necessary. If someone points out a mistake or a flaw in your reasoning, don't be too irritated or offended. Be grateful that you have been given the chance to move a little closer to truth and reality.

No mind is an island. Sharing a conclusion with smart, sensible people is crucial because none of us is perfect. Asking others for input is not only necessary but also relatively easy to do these days, thanks to social media. Let those online networks work for you. Use them to strengthen your good thinking.

WISDOM, HUMILITY, AND THE EVER-PRESENT BLIND SPOT BIAS

I hope you have ignored that voice you heard in your mind from time to time throughout this chapter. That was your subconscious trying to make you feel secure and comfortable by convincing you that you're much too smart to fall for any of this stuff. That voice is the final cognitive bias we will consider here. This is the one that helps all of the others work so well against us. The blind spot bias reassures us by making us believe that we are rational and logical, that it's other people who are too emotional, too gullible, and just not sharp enough when it counts. Don't believe these lies. Be humble and stay grounded when the blind spot bias shows up. Be aware that we all tend to underestimate our vulnerability to cognitive biases. Research shows that it is natural for us to believe that we are better than most when it comes to critical thinking.41 But this reflexive and unjustified confidence only puts us at even greater risk because it encourages a further lowering of defenses—which, in most cases, are far too weak already. Be diligent. Maintain awareness and admit to your many standard human vulnerabilities. Strive to keep them under control, but know that you never will. No one can tame all of them all of the time. It's impossible, because these automatic mental processes are an intimate part of who we are. But in trying, by working to be more rational and walk a path closer to reality, we open ourselves to living richer, more honest lives.

There is a chance that the avalanche of deception, delusion, and confusion wrought by the Internet and social media today will turn out to be good for humanity in the long run. At some point, it may just be too much. Enough people will be victimized enough times, and frustrations will rise until critical mass is achieved. No one will be able to deny the need for good thinking if the world is drowning in lies. In the meantime, with so much nonsense and propaganda flying at us from every direction every moment, we face the choice now to either lie down as constant fools and hapless victims, or to wake up, stand up, and think critically. Filter bubbles, fake news, deceptive advertising, and so on present difficult challenges; but you and I can take them on and win more battles than we lose. A sharp mind, wielded by a committed and consistent good thinker, can dodge or destroy almost any bad idea.