Why Ideologues Oversimplify Things
How does a zipper work? Rate your understanding on a scale from 0 (no clue) to 10 (easy-peasy). Write the number down. Now sketch out on a piece of paper how a zipper actually works. Add a brief description, as though you were trying to explain it very precisely to someone who’d never seen a zipper before. Give yourself a couple of minutes. Finished? Now reassess your understanding of zippers on the same scale.
Leonid Rozenblit and Frank Keil, researchers at Yale University, confronted hundreds of people with equally simple questions. How does a toilet work? How does a battery work? The results were always the same: we think we understand these things reasonably well until we’re forced to explain them. Only then do we appreciate how many gaps there are in our knowledge. You’re probably similar. You were convinced you understood more than you actually did. That’s the knowledge illusion.
If we mess up with such simple contraptions as zippers and toilets, imagine how ignorant we must be when it comes to the really big questions. Questions like: how much immigration is good for society in the long term? Or: should gene therapy be allowed? Or: does private gun ownership make societies safer?
Even with these unwieldy questions—interestingly, especially with these unwieldy questions—answers come shooting out like bullets from a pistol. But let’s be honest: we haven’t thought them through. Not even on a superficial level. Social issues are far more complex than zippers, toilets and batteries. Why? Because any intervention into social structures has consequences infinitely more far-reaching than flushing the loo. It’s not enough to consider the first wave of effects. You’ve got to account for the effects of the effects of the effects. Thinking through the chain reaction properly would take days, weeks, even months, and who’s got the time or the energy for that?
So we take comfortable shortcuts. At this point something peculiar happens: instead of reading books on the topic or consulting experts, we merely adopt the opinions of our peers. This may be a political party, a profession, a social group, a sports club or a street gang. Our opinions are therefore much less objective than we’d like to believe, being sourced primarily from a “community of knowledge,” as Steven Sloman and Philip Fernbach call it in their book The Knowledge Illusion. Sadly we’re not the independent thinkers we’d like to imagine we are. Instead, we treat our opinions rather like our clothes. We’ll wear whatever’s in fashion—or, more specifically, whatever’s in fashion among our peers.
When this inclination to toe the party line spreads beyond individual topics and starts constituting a whole worldview, it can spell disaster. That’s when we start talking about ideologies. Ideologies are party lines raised to the power of ten, and they come with a pre-packaged set of opinions.
Ideologies are highly dangerous. Their effect on the brain is like a high-voltage current, causing an array of rash decisions and blowing all kinds of fuses. Think, for example, of young European men with university educations who swear loyalty to ISIS and fight to reintroduce the medieval teachings of Islam.
Avoid ideologies and dogmas at all cost—especially if you’re sympathetic to them. Ideologies are guaranteed to be wrong. They narrow your worldview and prompt you to make appalling decisions. I don’t know of a single dogmatist with anything approaching a good life.
So far so clear. The problem, however, is that many people don’t notice that they’re falling for an ideology. How do you recognize one? Here are three red flags: a) they explain everything, b) they’re irrefutable, and c) they’re obscure.
An excellent example of an irrefutable ideology with an explanation for everything is Marxism. If the concentration of wealth in a society increases, the faithful will immediately attribute it to the fundamental evil of capitalism—as described by Marx. If inequality decreases, however, they will explain it as the development of history toward a classless society—as predicted by Marx.
At first glance, such irrefutability appears to be an advantage. Who wouldn’t want a theory on hand that’s so powerful it means you’re always right? In reality, however, irrefutable theories are anything but invulnerable—in fact, they’re very easily exposed. When you meet someone showing signs of a dogmatic infection, ask them this question: “Tell me what specific facts you’d need in order to give up your worldview.” If they don’t have an answer, keep that person at arm’s length. You should ask yourself the same question, for that matter, if you suspect you’ve strayed too far into dogma territory.
To be as unassailable as possible, ideologies often hide behind a smokescreen of obscurantism. That’s the third red flag by which to recognize an ideology. Here’s an example: the ordinarily clear and articulate theologian Hans Küng describes God as “the absolute-relative, here-hereafter, transcendent-immanent, all-embracing and all-permeating most real reality in the heart of things, in man, in the history of mankind, in the world.” All-explaining, irrefutable and utterly obscure!
This type of language is a sure sign of ideological gibberish. Watch out for it—including in your own speech. Try to find your own words to express things. Don’t just mindlessly adopt the formulations and imagery of your peer group. An example: don’t talk about “the people” when your party means only one specific social group. Avoid slogans.
Be especially wary when speaking in public. Defending a dogmatic position in public has been shown to beat it even deeper into your brain. It becomes virtually ineradicable.
Start looking for counterarguments. You could even do as I suggested in Chapter 30 and imagine you’re on a TV talk show with five other guests, all of whom hold the opposite conviction from yours. Only when you can argue their views at least as eloquently as your own will you truly have earned your opinion.
In sum: think independently, don’t be too faithful to the party line, and above all give dogmas a wide berth. The quicker you understand that you don’t understand the world, the better you’ll understand the world.