The Illusion of Explanatory Depth

Adam Waytz
Psychologist; Associate Professor, Kellogg School of Management, Northwestern University

If you asked 100 people on the street whether they understand how a refrigerator works, most would say yes. But ask them to produce a detailed, step-by-step explanation of exactly how it works, and you’d likely hear silence or stammering. This powerful but inaccurate feeling of knowing is what Leonid Rozenblit and Frank Keil in 2002 termed the illusion of explanatory depth (IOED), stating, “Most people feel they understand the world with far greater detail, coherence, and depth than they really do.”

Rozenblit and Keil initially demonstrated the IOED through multi-phase studies. In the first phase, they asked participants to rate how well they understood certain artifacts—such as a sewing machine, crossbow, or cell phone. In the second phase, they asked participants to write a detailed explanation of how each artifact worked and afterward to re-rate how well they understood each one. Study after study showed that self-rated understanding dropped markedly from phase one to phase two, once participants were confronted with their inability to explain how the artifact in question operated. Of course, the IOED extends well beyond artifacts: to how we think about scientific fields, mental illnesses, economic markets, and virtually anything we’re capable of (mis)understanding.

At present, the IOED is pervasive, given that we have abundant access to information but consume it in a largely superficial way. A 2014 survey found that roughly six in ten Americans read news headlines and nothing more. Major geopolitical issues—civil wars in the Middle East, the latest climate-change research—are distilled into tweets, viral videos, memes, “explainer” Web sites, soundbites on comedy news shows, and daily e-newsletters that get inadvertently re-routed to the spam folder. We consume knowledge widely, but not deeply.

Understanding the IOED can help us combat political extremism. In 2013, Philip Fernbach and colleagues demonstrated that the IOED underlies people’s policy positions on issues like single-payer healthcare, a national flat tax, and a cap-and-trade system for carbon emissions. As in Rozenblit and Keil’s studies, Fernbach and colleagues asked people first to rate how well they understood those issues, then to explain how each issue worked, and finally to re-rate their understanding. Participants also rated the extremity of their attitudes on those issues both before and after offering an explanation. Both self-reported understanding and attitude extremity dropped significantly after the attempts at explanation; people who strongly supported or opposed an issue became more moderate. What’s more, reduced extremity also reduced willingness to donate money to a group advocating for the issue. These studies suggest that the IOED is a powerful tool for cooling off heated political disagreements.

The IOED provides us with much-needed humility. In any domain of knowledge, the most ignorant are often the most overconfident in their understanding of that domain. Justin Kruger and David Dunning famously showed that the lowest performers on tests of logical reasoning, grammar, and humor are the most likely to overestimate their test performance. Only by gaining expertise in a topic do people recognize its complexity and calibrate their confidence accordingly. Having to explain a phenomenon forces us to confront this complexity and realize our ignorance. At a time when political polarization, income inequality, and urban-rural separation have fractured us over social and economic issues, recognizing our modest understanding of those issues is a first step to bridging the divides.