9

Decision Making

A bat and ball cost $1.10.

The bat costs one dollar more than the ball.

How much does the ball cost?

You now have a number in your mind. I’m sorry to tell you this, but chances are excellent that your answer is wrong. Don’t despair—over half the students at Harvard, MIT, and Princeton got it wrong too. And they didn’t do any better on the next two problems.

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long will it take for the patch to cover half of the lake?1

These three problems—easy, in hindsight—were devised by Shane Frederick, associate professor of marketing at Yale University, as part of the Cognitive Reflection Test he created in 2005 while working at MIT. He was interested in measuring people’s cognitive reasoning, and in particular how readily they could override the brain’s reflexive decision-making center—what is commonly called intuition.

For years, psychologists have been interested in the idea that our cognitive processes are divided into two modes of thinking, traditionally referred to as intuition, which produces “quick and associative” cognition, and reason, described as “slow and rule-governed.” Today, these cognitive systems are commonly referred to as System 1 and System 2. System 1 thinking is intuitive. It operates automatically, quickly, and effortlessly with no sense of voluntary control. System 2 is reflective. It operates in a controlled manner, slowly and with effort. The operations of System 2 thinking require concentration and are associated with subjective experiences that have rule-based applications.

Although we like to think of ourselves as having sturdy System 2 ability, in fact much of our thinking occurs in System 1. Let's return to Frederick's college students. More than half of them said the ball cost 10 cents. Most surprising of all, it was also the answer the majority stuck with when asked, "Is that your final answer?"

It is clear the college students were stuck in System 1 thinking and could not, or would not, convert over to System 2. If they had taken even just one moment to think, Frederick said, they would have realized that the difference between $1 and 10 cents is 90 cents, not one dollar. The surprisingly high rate of errors among college students indicates two problems. First, people are not accustomed to thinking hard about problems and often rush to the first plausible answer that comes to mind. Second, the System 2 process does a bad job of monitoring System 1 thinking.
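For readers who want to check the arithmetic behind all three puzzles, here is a minimal sketch in Python (the code and variable names are mine, not Frederick's):

```python
# 1. Bat and ball: together $1.10, bat costs $1.00 more than the ball.
#    Solve ball + (ball + 1.00) = 1.10  ->  ball = $0.05, not $0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs(ball - 0.05) < 1e-9
assert abs((bat + ball) - 1.10) < 1e-9
assert abs((bat - ball) - 1.00) < 1e-9   # the intuitive 10-cent answer fails this check

# 2. Widgets: 5 machines make 5 widgets in 5 minutes, so each machine
#    makes one widget in 5 minutes; 100 machines make 100 widgets in 5 minutes.
minutes_per_widget_per_machine = 5
machines, widgets = 100, 100
minutes = minutes_per_widget_per_machine * widgets / machines
assert minutes == 5

# 3. Lily pads: the patch doubles daily and covers the lake on day 48,
#    so it covered half the lake one day earlier, on day 47.
day_full = 48
day_half = day_full - 1
assert day_half == 47
```

The intuitive System 1 answers (10 cents, 100 minutes, 24 days) each fail one of these simple checks; that is exactly the monitoring job System 2 is supposed to perform.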

Frederick also discovered that people who did well on the Cognitive Reflection Test tended to be more patient in answering questions. System 2 thinking is a relatively slow process. When we are forced to answer quickly, we don’t have enough time to engage the rationality that is at the heart of the reflective process.

This is not to say intuition does not have a role in our thinking—far from it. I dare say we could not get through the day without our basic intuitions. When you’re driving a car, if the back end begins to slip sideways, intuition tells you to turn the wheel in the direction of the slide. You have no time to engage System 2 thinking and carefully ponder a list of different options.

Indeed, the role of intuition in our cognitive process has earned the attention of serious scientists. You may remember Daniel Kahneman from our chapter on psychology—the psychologist who won the Nobel Prize in Economics for his studies of human judgment and decision making.

Kahneman believes there are indeed cases where intuitive skill reveals the answer, but that such cases are dependent on two conditions. First, "the environment must be sufficiently regular to be predictable"; second, there must be an "opportunity to learn these regularities through prolonged practice." For familiar examples, think about the games of chess, bridge, and poker. They all occur in regular environments, and prolonged practice at them helps people develop intuitive skill. Kahneman also accepts the idea that army officers, firefighters, physicians, and nurses can develop skilled intuition largely because they all have had extensive experience in situations that, while obviously dramatic, have nonetheless been repeated many times over.

Kahneman concludes that intuitive skill exists mostly in people who operate in simple, predictable environments and that people in more complex environments are much less likely to develop this skill. Kahneman, who has spent much of his career studying clinicians, stock pickers, and economists, notes that evidence of intuitive skill is largely absent in this group. Put differently, intuition appears to work well in linear systems where cause and effect is easy to identify. But in nonlinear systems, including stock markets and economies, System 1 thinking, the intuitive side of our brain, is much less effectual.

Let’s return to our college students for a moment. We can assume they’re all smart, so why did they have trouble solving the problems? Why did they leap to a conclusion based totally on intuition (System 1 thinking), and why didn’t System 2 thinking correct their faulty answers? Because, in a nutshell, they lacked adequate reservoirs of information.

In his own writing about intuition, Kahneman called on a definition developed by Herbert Simon—another psychologist who also won a Nobel Prize in Economics based on his studies of decision making. “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”2 Thus, Kahneman believes, increasing the amount of information stored in memory increases our skill at intuitive thinking. Further, he says, the failure of System 2 to override System 1 is largely a resource condition. “In some judgmental tasks, information (in System 2 thinking) that could serve to supplement or correct the heuristic (occurring in System 1 thinking) is not neglected nor underweighted, but simply lacking.”3

Improving the resource condition of our System 2 thinking—that is to say, deepening and broadening our reserves of relevant information—is the principal reason this book was written.


College students are not investment professionals—at least not for a few more years. So we might say Shane Frederick’s pessimistic view of the thinking talent among undergraduates is premature and will eventually be righted. If Kahneman’s theory is correct, all that is needed is a bit more learning time with hands-on experience, and our young intellectuals will be able to calculate the cost of a baseball, how long it takes to make widgets, and how many days it takes lily pads to cover a lake. Soon these fresh-faced graduates, who are eager to slay the world, will become the next generation of investment experts.

Philip Tetlock, professor of psychology at the University of Pennsylvania, might also have had an optimistic view about their future had he not spent fifteen years (1988–2003) studying the decision-making process of 284 experts. He defined experts as people who appeared on television, were quoted in newspaper and magazine articles, advised governments and businesses, or participated in punditry roundtables. All of them were asked about the state of the world; all gave their prediction of what would happen next. Collectively, they made over 27,450 forecasts. Tetlock kept track of each one and calculated the results. How accurate were the forecasts? Sadly, but perhaps not surprisingly, the experts' predictions were no better than those of "dart-throwing chimpanzees."4

How can this be? According to Tetlock, "How you think matters more than what you think."5

It appears experts are penalized, like the rest of us, by thinking deficiencies. Specifically, experts suffer from overconfidence, hindsight bias, belief system defenses, and lack of Bayesian process. You may remember these mental errors from our chapter on psychology.

Such psychological biases are what penalize System 1 thinking. We rush to make an intuitive decision, not recognizing that our thinking errors are caused by our inherent biases and heuristics. It is only by tapping into our System 2 thinking that we can double-check our initial decisions for these susceptibilities.

Some 2,600 years ago, the Greek warrior poet Archilochus wrote, “The fox knows many tricks, the hedgehog only one.” The quote was later made famous by Sir Isaiah Berlin in his popular essay, “The Hedgehog and the Fox: An Essay on Tolstoy’s View of History.” In it, Berlin divided writers and thinkers into two categories: hedgehogs, who viewed the world through the lens of a single defining idea, and foxes, who were skeptical of grand theories and instead drew on a wide variety of experiences before making a decision. Berlin was surprised by the controversy the essay created. “I never meant it very seriously,” he said. “I meant it as a kind of enjoyable intellectual game, but it was taken seriously.”6

Researchers were quick to grab the analogy to help explain their own research on decision making—Tetlock included. In his Expert Political Judgment study, Tetlock divided the forecasters into hedgehogs and foxes. Despite the overall dismal performance, he was able to discern differences: the aggregate success of the forecasters who behaved most like foxes was significantly greater than that of those who behaved like hedgehogs.

Why are hedgehogs penalized? First, they have a tendency to fall in love with pet theories, which gives them too much confidence in forecasting events. More troubling, hedgehogs are too slow to change their viewpoint in response to disconfirming evidence. In his study, Tetlock found that foxes moved 59 percent of the prescribed amount toward alternate hypotheses, while hedgehogs moved only 19 percent. In other words, foxes were much better at updating their Bayesian inferences than hedgehogs.
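To make "moving a fraction of the prescribed amount" concrete, here is an illustrative Bayesian sketch in Python. The prior, the likelihoods, and the helper functions are invented for illustration; only the 59 percent and 19 percent figures come from Tetlock:

```python
def bayes_posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' theorem: posterior belief in a hypothesis after seeing the evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

def fraction_moved(prior, prescribed, revised):
    """Share of the prescribed Bayesian update a forecaster actually made."""
    return (revised - prior) / (prescribed - prior)

prior = 0.80                                      # strong belief in a pet theory
prescribed = bayes_posterior(prior, 0.20, 0.60)   # disconfirming evidence arrives
# prescribed is about 0.57: Bayes says belief should drop from 0.80 to 0.57.

fox_revised = prior + 0.59 * (prescribed - prior)       # moves 59 percent of the way
hedgehog_revised = prior + 0.19 * (prescribed - prior)  # moves only 19 percent

assert abs(fraction_moved(prior, prescribed, fox_revised) - 0.59) < 1e-9
assert abs(fraction_moved(prior, prescribed, hedgehog_revised) - 0.19) < 1e-9
```

With these invented numbers, the hedgehog's revised belief stays near 0.76 while the fox's falls to about 0.67, much closer to the 0.57 that the evidence prescribes.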

Unlike hedgehogs, foxes appreciate the limits of their own knowledge. They have better calibration and discrimination scores than hedgehogs. (Calibration, which can be thought of as intellectual humility, measures how closely your subjective probabilities correspond to objective probabilities. Discrimination, sometimes called justified decisiveness, measures whether you assign higher probabilities to things that occur than to things that do not.) Hedgehogs, with their stubborn beliefs about how the world works, are more likely to assign high probabilities to things that never occur than to things that actually do.
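One simplified way to see the two scores in action is a toy calculation. The forecasts and outcomes below are invented, and the formulas are deliberately simplified stand-ins for the ones Tetlock actually used:

```python
# Eight forecasts (subjective probabilities) paired with what happened (1 = occurred).
forecasts = [0.9, 0.8, 0.7, 0.3, 0.2, 0.6, 0.1, 0.4]
outcomes  = [1,   1,   0,   0,   0,   1,   0,   1]

# Calibration (intellectual humility): how closely subjective probabilities
# track the objective frequency of events. Smaller gap = better calibrated.
base_rate = sum(outcomes) / len(outcomes)
mean_forecast = sum(forecasts) / len(forecasts)
calibration_gap = abs(mean_forecast - base_rate)

# Discrimination (justified decisiveness): were higher probabilities assigned
# to things that happened than to things that did not? Larger gap = better.
hits = [f for f, o in zip(forecasts, outcomes) if o == 1]
misses = [f for f, o in zip(forecasts, outcomes) if o == 0]
discrimination = sum(hits) / len(hits) - sum(misses) / len(misses)

assert calibration_gap < 1e-9        # mean forecast 0.5 matches base rate 0.5
assert abs(discrimination - 0.35) < 1e-9   # 0.675 for hits vs. 0.325 for misses
```

This invented forecaster is well calibrated and reasonably decisive; Tetlock's hedgehogs tended to fail on both counts, confidently assigning high probabilities to events that never arrived.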

Tetlock tells us Foxes have three distinct cognitive advantages.

1.  They begin with “reasonable starter” probability estimates. They have better “inertial-guidance” systems that keep their initial guesses closer to short-term base rates.

2.  They are willing to acknowledge their mistakes and update their views in response to new information. They have a healthy Bayesian process.

3.  They can see the pull of contradictory forces, and, most importantly, they can appreciate relevant analogies.7

Hedgehogs start with one big idea and follow through—no matter the logical implications of doing so. Foxes stitch together a collection of big ideas. They see and understand the analogies and then create an aggregate hypothesis. I think we can say the fox is the perfect mascot for the College of Liberal Arts Investing.

Keith Stanovich, professor of human development and applied psychology at the University of Toronto, believes that intelligence tests (like the ACT and SAT) measure important qualities but do a very poor job of measuring rational thought. "It is a mild predictor at best," he says, "and some rational thinking skills are totally dissociated from intelligence."8 Intelligence tests typically measure mental skills that have been developed over a long period. But remember, the most common thinking errors have less to do with intelligence and more to do with rationality—or, more accurately, the lack of it.

The idea that people with high IQs could be so bad at decision making at first seems counterintuitive. We assume that anyone with high intelligence will also act rationally. But Stanovich sees it differently. In his book, What Intelligence Tests Miss: The Psychology of Rational Thought, he coined the term “dysrationalia”—the inability to think and behave rationally despite having high intelligence.

Research in cognitive psychology suggests there are two principal causes of dysrationalia. The first is a processing problem. The second is a content problem.

Stanovich believes we process poorly. When solving a problem, he says, people have several different cognitive mechanisms to choose from. At one end of the spectrum are mechanisms with great computational power, but they are slow and require a great deal of concentration. At the opposite end of the spectrum are mechanisms that have low computational power, require very little concentration, and make quick action possible. “Humans are cognitive misers,” Stanovich writes, “because our basic tendency is to default to the processing mechanisms that require less computational effort, even if they are less accurate.”9 In a word, humans are lazy thinkers. They take the easy way out when solving problems and as a result, their solutions are often illogical.

The second cause of dysrationalia is the lack of adequate content. Psychologists who study decision making refer to content deficiency as a “mindware gap.” First articulated by David Perkins, a Harvard cognitive scientist, mindware refers to the rules, strategies, procedures, and knowledge people have at their mental disposal to help solve a problem. “Just as kitchenware consists in tools for working in the kitchen, and software consists in tools for working with your computer, mindware consists in the tools for the mind,” explains Perkins. “A piece of mindware is anything a person can learn that extends the person’s general powers to think critically and creatively.”10

Mindware gaps, he believes, are generally caused by the lack of a broad education. In Perkins’s view, schools do a good job of teaching the facts of each discipline but a poor job of connecting the facts of each discipline together in such a way to improve our overall understanding of the world. “What is missing,” he says, “is the metacurriculum—the ‘higher order’ curriculum that deals with good patterns of thinking in general and across subject matters.”11

How to fix this? At first, Perkins envisioned special courses that focused on the art of thinking. But he realized that adding courses to an already crowded curriculum would prove difficult. Instead, he thought what was needed with each subject matter was a direct injection of thoughtfulness—what he called a mindware booster shot. “I am an advocate of what is often called infusion,” he wrote, “integrating the teaching of new concepts in a deep and far-reaching way with subject matter instruction.”12

So now we come around to the heart of the matter. Perkins’s hope for a new way of learning perfectly aligns with the underlying principle of this book—that people studying the art and science of investing are best served by incorporating the “rules, strategies, procedures, and knowledge” from several different disciplines. In this regard, Investing: The Last Liberal Art is a direct example of a mindware booster shot.

It is rare for the Wall Street Journal and the New York Times to reach the same conclusion, but they both agreed Daniel Kahneman's new book, Thinking, Fast and Slow, was one of the top five nonfiction books of 2011. As of this writing (September 2012), it has appeared on the Times best-seller list for twelve weeks and counting—a remarkable feat for a 500-page book on decision making. I take it as a positive sign. Finally, behavioral finance has become a part of the mainstream.

Kahneman tells us much of the book is about the biases of intuition. "However," he writes, "the focus on error does not denigrate human intelligence any more than the attention to diseases in medical texts denies good health. Most of us are healthy most of the time, and most of our judgments and actions are appropriate most of the time. As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. But not always. We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are." The goal of the book, says Kahneman, "is to improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves."13

My favorite chapter in the book came early. In Chapter 3, “The Lazy Controller,” Kahneman reminds us that cognitive effort is mental work. And as with all work, many of us have a tendency to get lazy when the task gets harder. We simply run out of gas. Several psychological studies show that people who are simultaneously challenged by demanding cognitive tasks and temptation are more likely to yield to temptation. If you are continually forced to do something over and over that is challenging, there is a tendency to exert less self-control when the next challenge arrives. Kahneman tells us that activities that put demands on System 2 thinking require self-control, and continuous exertion of self-control can be unpleasant.

Kahneman is surprised by the ease with which intelligent people appear satisfied enough with their initial answer that they stop thinking. He is reluctant to use the word “lazy” to describe the lack of System 2 monitoring, but lazy is what it seems to be. Kahneman notes that we often say of people who give up on thinking, “He didn’t bother to check whether what he said made sense” or “Unfortunately, she tends to say the first thing that comes into her mind.” What we should be thinking, he says, is “Does he usually have a lazy System 2 or was he unusually tired?” Or, for the second example, “She probably has trouble delaying gratification—a weak System 2.”

According to Kahneman, “Those who avoid the sin of intellectual sloth could be called ‘engaged.’ They are more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, more skeptical about their intuitions.”14 What does it mean to be engaged? Quite simply, it means your System 2 thinking is strong, vibrant, and less prone to fatigue. So distinct is System 2 thinking from System 1 thinking that Keith Stanovich has termed the two as having “separate minds.”

But a “separate mind” is only separate if it is distinguishable. If your System 2 thinking is not adequately armed with the required understanding of the major mental models collected from the study of several different disciplines, then its function will be weak—or, says Kahneman, lazy.

Having been schooled in modern portfolio theory and the efficient market hypothesis, will you quickly and automatically default to this physics-based model of how markets operate, or will you slow down your thinking and also consider the possibility that the market’s biological function could be altering the outcome? Even if the market looks hopelessly efficient, will you also consider that the wisdom of the crowds is only temporary—until the next diversity breakdown?

When you analyze your portfolio, will you resist the almost uncontrollable urge to sell a losing position, knowing full well the angst you feel is an irrational bias—the pain of loss being twice as discomforting as the pleasure of an equal unit of gain? Will you stop yourself from looking at your price positions day in and day out, knowing that the frequency with which you do is working against your better judgment? Or will you bow down to your first instinct and sell first and ask questions later?

When thinking about companies, markets, and economies, will you rest with your first description of events? Knowing that more than one description is possible and the dominant description is most often determined by the extent of media coverage, will you dig deeper to uncover additional, perhaps more appropriate, descriptions? Yes, it takes mental energy to do this. Yes, it will take more time to reach a decision. Yes, this is more difficult than defaulting to your first intuition.

Lastly, with all that you have to read to get through the requirements of your job, will you read a new book that will increase your understanding? As Charlie Munger has said so many times, it is only by reading that you are able to continuously learn.

All this and more are the mental exercises that help close the mindware gap and strengthen your System 2 thinking. They serve to keep you engaged. They work to fully develop your separate mind.

My hope with this book is that it will inspire you to begin thinking about investing in a different way, as something more than a kaleidoscope of shifting numbers. But thinking about investing differently means thinking creatively. It requires a new and innovative approach to absorbing information and building mental models. You will recall from Chapter 1 that to construct a new latticework of mental models, we must first learn to think in multidisciplinary terms and to collect (or teach ourselves) fundamental ideas from several disciplines, and then we must be able to use metaphors to link what we have learned back to the investing world. Metaphor is the device for moving from areas we know and understand to new areas we don’t know much about. To build good mental models, we need a general awareness of the fundamentals of various disciplines, plus the ability to think metaphorically.

The art of model building depends on our skill at constructing building blocks.15 Think of the classic children’s toy, Lincoln Logs. To build a model of a cabin, children construct, using various logs, a replica of what they think a log cabin looks like. Now, the set comes with many different logs. Some are short and some are long; some are used for connecting the roof, others are used to frame the doors and windows. To build a good log cabin, the builder has to combine the logs together in such a way as to create a good model.

Constructing an effective model for investing is very similar to building a log cabin. We have, throughout this book, provided a number of different building blocks. Good model building is very much about combining the building blocks in a skillful, artful way. Properly combined, these building blocks will give you a reasonable model of how markets work and, I hope, add some insight that will help you become a better investor. Of course, what we can quickly appreciate is that if you have only a couple of building blocks, it will be very difficult to construct an exact model of a log cabin. This is also true of investing. If you possess only a few building blocks, how will you ever be able to construct a useful model?

The first rule in building an effective model, then, is to start with enough building blocks. To build our all-encompassing model of the market—a meta-model, if you will—we will use as building blocks the various mental models described in this book, the key ideas taken from individual disciplines. After we have collected enough building blocks, we can start to assemble them into a working model.

One critical difference between building a model of a log cabin and building a model of market behavior is that our investing model must be dynamic. It must have the ability to change as the circumstances change. As we have already discovered, the building blocks of fifty years ago are no longer relevant because the market, like a biological organism, has evolved.

A model that changes shape as its environment changes may be difficult to envision. To get a sense of how that could work, imagine a flight simulator. The great advantage of a simulator is that it allows pilots to train and perfect their skills under different scenarios without the risk of actually crashing the plane. Pilots learn to fly at night, in bad weather, or when the plane is experiencing mechanical difficulties. Each time they perform a simulation, they must construct a different flight model that will allow them to fly and land safely. Each one of those models may contain similar building blocks but assembled in a different sequence; the pilot is learning which building blocks to emphasize for each of the scenarios.

The pilot is also learning to recognize patterns and extrapolate information from them to make decisions. When a certain set of conditions presents itself, the pilot must be able to recognize an underlying pattern and to pull from it a useful idea. The pilot’s mental process goes something like this: I haven’t seen this exact situation before, but I saw something like it, and I know what worked in the earlier case, so I’ll start there and modify it as I go along.

Building an effective model for investing is very similar to operating a flight simulator. Because we know the environment is going to change continually, we must be in a position to shift the building blocks to construct different models. Pragmatically speaking, we are searching for the right combination of building blocks that best describes the current environment. Ultimately, when you have discovered the right building blocks for each scenario, you have built up experiences that in turn enable you to recognize patterns and make the correct decisions.

One thing to remember is that effective decision making is very much about weighting the right building blocks, putting them into some hierarchical structure. Of course, we may never fully know what all the optimal building blocks are, but we can put into place a process of improving what we already have. If we have a sufficient number of building blocks, then model building becomes very much about reweighting and recombining them in different situations.

One thing we know from recent research by John Holland and other scientists (see Chapter 1) is that people are more likely to change the weighting of their existing building blocks than to spend any time discovering new ones. And that is a mistake. We must, argues Holland, find a way to use productively what we already know and at the same time actively search for new knowledge—or, as Holland adroitly phrases it, we must strike a balance between exploitation and exploration. When our model reveals readily available profits, of course we should intensely exploit the market’s inefficiency. But we should never stop exploring for new building blocks.

Although the greatest number of ants in a colony will follow the most intense pheromone trail to a food source, there are always some ants that are randomly seeking the next food source. When Native Americans were sent out to hunt, most of those in the party would return to the proven hunting grounds. However, a few hunters, directed by a medicine man rolling spirit bones, were sent in different directions to find new herds. The same was true of Norwegian fishermen. Each day most of the ships in the fleet returned to the same spot where the previous day’s catch had yielded the greatest bounty, but a few vessels were also sent in random directions to locate the next school of fish. As investors, we too must strike a balance between exploiting what is most obvious while allocating some mental energy to exploring new possibilities.
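Holland's balance between exploitation and exploration has a standard toy formulation in computer science, the epsilon-greedy rule: mostly follow the best-known option, but scout a random one a small fraction of the time, just as the colony sends out a few wandering ants. The sketch below is my own illustration with invented numbers, not anything from Holland's work:

```python
import random

def choose(estimated_value, epsilon, rng):
    """Exploit the best-known option; with probability epsilon, explore a random one."""
    if rng.random() < epsilon:
        return rng.randrange(len(estimated_value))        # explore: random scouting
    return max(range(len(estimated_value)), key=estimated_value.__getitem__)

rng = random.Random(42)
true_value = [0.3, 0.5, 0.8]    # payoffs of three "food sources", unknown to the chooser
estimated = [0.0, 0.0, 0.0]
counts = [0, 0, 0]

for _ in range(2000):
    i = choose(estimated, epsilon=0.1, rng=rng)
    reward = true_value[i] + rng.gauss(0, 0.1)            # noisy payoff
    counts[i] += 1
    estimated[i] += (reward - estimated[i]) / counts[i]   # running average of payoffs

# Steady exploration lets the chooser discover that the third source is best,
# even though greedy exploitation alone might have locked onto an inferior one.
assert max(range(3), key=estimated.__getitem__) == 2
```

Pure exploitation (epsilon of zero) would stay with whichever source happened to pay off first; pure exploration would never cash in on what has been learned. The small, steady epsilon is the medicine man rolling the spirit bones.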

By recombining our existing building blocks, we are in fact learning and adapting to a changing environment. Think back for a moment to the description of neural networks and the theory of connectionism in Chapter 1. It will be immediately obvious to you that by choosing and then recombining building blocks, what we are doing is creating our own neural network, our connectionist model.

The process is similar to genetic crossover that occurs in biological evolution. Indeed, biologists agree that genetic crossover is chiefly responsible for evolution. Similarly, the constant recombination of our existing mental building blocks will, over time, be responsible for the greatest amount of investment progress. However, there are occasions when a new and rare discovery opens up new opportunities for investors. In much the same way that a mutation can accelerate the evolutionary process, so too can newfound ideas speed us along in our understanding of how markets work. If you are able to discover a new building block, you have the potential to add another level to your model of understanding.

It’s important to understand that you have the opportunity to discover many new things and add new building blocks to your mental models without ever taking undue risk. You can throw a lot of theories and ideas into your thinking mix, assemble them into a model, and, like a pilot in a flight simulator, try them out in the marketplace. If the new building blocks prove useful, then keep them and give them the appropriate weight. But if they appear to add no value, you simply store them away and draw them up again some day in the future.

But remember, none of this will happen if you conclude that you already know enough. Never stop discovering new building blocks. When a corporation cuts its research and development budget to focus on the here and now, that may produce greater profits in the short term, but more likely it places the company in competitive jeopardy at some point in the future. Likewise, if we stop exploring for new ideas, we may still be able to navigate the stock market for a while, but most likely we are putting ourselves at a disadvantage for tomorrow’s changing environment.

At the center of the University of Pennsylvania campus, where Locust Walk crosses the Thirty-Seventh Street walkway, a life-size bronze statue depicts Benjamin Franklin sitting on a park bench. He wears a ruffled shirt and knickers, a long coat and vest, and square-buckled shoes. A pair of round bifocals sits on the very tip of his nose, and he is reading a copy of the Pennsylvania Gazette. Of the forty-one statues of Benjamin Franklin in Philadelphia, this one, designed by George W. Lundeen, is by far my favorite. The bench, underneath a beautiful shade tree, is a comfortable spot for a person to sit and reflect about a latticework of mental models, next to the man who so passionately advocated the value of a liberal arts education.

The Thirty-Seventh Street walkway is a major thoroughfare on the Penn campus. Each morning when class is in session, students spill out of the dormitory building called The Quadrangle and head uphill on Thirty-Seventh. When they reach the intersection with Locust Walk, they splinter off into separate groups, each group heading in a different direction toward the classes in their chosen discipline.

The physics and math majors turn right and head over to the David Rittenhouse Laboratory on Thirty-Third Street. Biology majors turn left and walk to the Leidy Laboratories on University Avenue. Sociology majors turn left for the Sociology Building located on Locust Walk. Psychology majors continue straight on Thirty-Seventh to the Psychology Building on Walnut Street. Philosophy majors turn right onto Locust and walk down to Logan Hall. The English majors walk a few more steps to Fisher Bennett Hall.

The finance students at Penn, who study at the famous Wharton School of Business, have the shortest distance to travel. As Benjamin Franklin watches silently, they turn right at the intersection and walk just a few steps to Steinberg Hall, Dietrich Hall, and Huntsman Hall. There they will spend the next four years taking courses on economics, management, finance, accounting, marketing, business, and public policy. At the end of the four years, with college degree in hand, most will seek a job in the financial services industry. A few will attend graduate school and earn an MBA degree for intensely studying for two more years what they have already learned in the previous four.

Sitting next to Benjamin Franklin one spring afternoon, I wondered to myself what opportunities these hard-charging finance students will have when they graduate, and what additional advantages they would receive if they had spent more of their college experience studying other disciplines. With just one course in physics, they would have learned about Newton’s principles, thermodynamics, relativity, and quantum mechanics. They might have been exposed to wave motion, turbulence, and nonlinearity. They might have realized that the same laws that describe the flow of magma beneath the earth’s crust or demonstrate how small-scale shifts in plate tectonics cause large earthquakes also govern the forces in financial markets.

Biology majors at Penn spend four years studying molecular biology and evolution, microbiology and genetics, neurobiology, and the biology of invertebrates and vertebrates as well as botany and plant development. But a finance major who took but one course, The Molecular Biology of Life, would have learned about the genetics of animals, bacteria, and viruses, with particular attention to the ways in which modern cellular and molecular genetic methods contribute to our understanding of evolutionary processes. From that one course, in one semester, a perceptive student might have recognized that the patterns that exist in biology look very similar to the patterns that occur in companies and markets.

Students at the Wharton School will have spent a great deal of time studying the theory and structure of financial markets, but what additional insights could they have learned by taking Social Problems and Public Policy, Technology and Society, Sociology at Work, or Social Stratification? To be a successful investor, you need not spend four years studying sociology, but even a few courses in this discipline would increase their awareness of how various systems organize, operate, thrive, fail, and then reorganize.

Today there is little debate over the fact that psychology affects investing. How much added benefit would finance students derive from some basic courses in psychology? Consider, perhaps, a class on Physiology of Motivated Behaviors, where students seek to learn the links between brain structure and behavioral function. Or Cognitive Psychology, which investigates the mental processes in humans, including how people use pattern recognition to determine action. Surely no finance student would pass up the opportunity to take Behavioral Economics and Psychology, which applies psychological research to economic theory to examine what happens when agents with limited cognitive capacities make strategic decisions.

Finance is, at bottom, a job about making decisions, so how could finance majors pass up courses in modern philosophy, logic, and critical thinking? What mental tools might they acquire by studying the theories of knowledge, mind, and reality expressed by Descartes, Kant, Hegel, James, and Wittgenstein? Think of the competitive advantages they could gain from a course on critical thinking, which would give them techniques for analyzing arguments in both natural and statistical language.

Yes, I know there is a lot to read in college, but why not use one of your three unrestricted electives and take Nineteenth Century American Literature, where you will read the outstanding literary treatments of American culture from the early Federalist period to the beginnings of the First World War? Better yet, take the Creative Nonfiction Writing class, a workshop course in writing expository prose; you will learn to write formal essays on topics such as autobiography, review, interview, analysis of advertising, and popular culture.

Of course, as a finance major you will have a great deal of math to contend with in your accounting and economics courses, but what about adding Mathematics in the Age of Information? You would learn about mathematical reasoning and the media. Often there are mathematical assumptions embedded in stories printed in the media, and this course will teach you how to recognize and question the different mathematical postulates.

Watching the students pass one by one on their way to classes in their chosen major, I can’t help but wonder where they will all be in twenty-five years. Will their college education have adequately prepared them to compete at the highest level? Once they reach retirement age, will they be able to look back and judge their life’s work a success, or will they see it as something less than that?

These are the same questions Charlie Munger asked his classmates at the fiftieth reunion of the Harvard Law School class of 1948.16 “Was our education sufficiently multidisciplinary?” he asked. “In the last fifty years, how far has elite academia progressed toward attainable best-form multidisciplinarity?”

To make his point about single-focus thinking, Charlie often employs the proverb “To a man with only a hammer, every problem looks pretty much like a nail.” Now, said Charlie, “One partial cure for man-with-hammer tendency is obvious: if a man has a vast set of skills over multidisciplines, he, by definition, carries multiple tools and therefore will limit bad cognitive effects from the ‘man with a hammer’ tendency. If ‘A’ is a narrow professional doctrine and ‘B’ consists of the big, extra-useful concepts from other disciplines, then, clearly, the professional possessing ‘A’ plus ‘B’ will usually be better off than the poor possessor of ‘A’ alone. How could it be otherwise?”

Charlie believes that the broad-scale problems we as a society face can be solved only by placing them on a latticework that spreads across many disciplines. Therefore, he argues, educational institutions should promote fluency in multidisciplinary education. Admittedly, Charlie is quick to add, “We don’t have to raise everyone’s skill in celestial mechanics to that of Laplace and also ask everyone to achieve a similar level in all other knowledge.” Remember, he said, “it turns out that the truly big ideas in each discipline, learned only in essence, carry most of the freight.” Furthermore, he continued, to attain broad multidisciplinary skills does not require us to lengthen the already expensive commitment to a college education. “We all know individuals, modern Benjamin Franklins, who have achieved a massive multidisciplinary synthesis with less time in formal education than is now available to our numerous brilliant young and thus become better performers in their own disciplines, not worse, despite diversion of learning time to matters outside the normal coverage of their own disciplines.” It is Charlie’s belief that society would be better off if more college courses across a broader spectrum were made mandatory rather than elective.

So as we near the end of this book, we find we have come back full circle to its beginning. The challenge we face as investors and very much as individuals has less to do with the knowledge that is available than with how we choose to put the pieces together. Similarly, the main problem in education revolves around assembling the pieces of curriculum. “The ongoing fragmentation of knowledge and resulting chaos are not reflections of the real world but artifacts of scholarship,” explains Edward O. Wilson in Consilience: The Unity of Knowledge.17 Consilience, which Wilson describes as the “jumping together” of knowledge from various disciplines, is the only way to create a common framework of explanation.

One of the principal goals of this book is to give you a broader explanation of how markets behave and in the process help you make better investment decisions. One thing we have learned thus far is that our failures to explain are caused by our failures to describe. If we cannot accurately describe a phenomenon, it is fairly certain we will not be able to accurately explain it. The lesson we are taking away from this book is that the descriptions based solely on finance theories are not enough to explain the behavior of markets.

The art of achieving what Charlie Munger calls “worldly wisdom” is a pursuit that appears to have more in common with the ancient and medieval periods than with contemporary studies, which mostly emphasize gaining specific knowledge in one particular field. No one would disagree that over the years we have increased our baskets of knowledge, but what is surely missing today is wisdom. Our institutions of higher learning may separate knowledge into categories, but wisdom is what unites them.

Those who make an effort to acquire worldly wisdom are beneficiaries of a special gift. Scientists at the Santa Fe Institute call it emergence. Charlie Munger calls it the lollapalooza effect: the extra turbocharge that comes when basic concepts combine and move in the same direction, reinforcing each other’s fundamental truths. But whatever you decide to call it, this broad-based understanding is the foundation of worldly wisdom.

The Roman poet Lucretius writes:

Nothing is more sweet than full possession
Of those calm heights, well built, well fortified
By wise men’s teaching, to look down from here
At others, wandering below, men lost,
Confused, in hectic search for the right road.

For many, many people, the financial markets are confusing, and investing has become a hectic search for the right road. But traveling more quickly down well-worn roads is not the answer. Rather, looking down from the calm heights of knowledge gained from wise men’s teaching is. Those who constantly scan in all directions for what can help them make good decisions will be the successful investors of the future.

Seated on the campus park bench, Benjamin Franklin and I watch as the last of the finance students, now late for class, rush past us. I can’t help but wonder if he too is thinking about their education and their future. Does he wonder if they have read broadly enough to develop “the connected idea of human affairs” he so eloquently advocated in his 1749 pamphlet? If they have begun to cultivate the habits of mind that will permit them to make connections and link ideas? If they are set on a course of lifelong learning?

He must be thinking about those things, for I think I can hear him quietly read aloud the headline on the Gazette he is holding: “The good education of youth has been esteemed by wise men in all ages as the surest foundation of happiness.” It is a simple formula for personal and societal success, as valid today as it was 250 years ago. It is also a timeless road map for achieving worldly wisdom.