4
The Power of Numbers
Stories create connections and get remembered, but numbers convince people. They give a sense of precision to even the most imprecise stories, and putting a number on a judgment call makes you feel more comfortable when dealing with uncertainty. In this chapter, I begin by looking at the history of numbers, tracing their origins from ancient civilizations to the quantitative models of today. I then look at the power that numbers have over us, why we use them, and how developments in the last three decades have made data easier to collect, analyze, and disseminate. I close the chapter by examining the dangers of trusting numbers too much and how they can lead you into mistakenly thinking that you are being objective and in control when in fact you are neither.
A History of Numbers
The very first number systems go back to prehistoric times and were tallying systems that you can see depicted in cave paintings. The ancient civilizations all had their own number systems, with the Babylonian system built around a base of 60. The Egyptians are believed to have invented the base 10 system that underlies mathematics today, and the numbers we use today, while called Arabic numerals, were first used by Indians. Along the way, Indian mathematicians discovered the magical properties of zero, which Arab scholars later carried west, and the Chinese explored the possibility of negative numbers.
In spite of these advances, for much of human existence, the use of numbers was restricted to a few, because data was difficult to get and save, computation was time intensive, and analytical tools were limited. In the Middle Ages, the birth of the insurance business and the strides made in statistical theory expanded the use of numbers in business. It was the development of financial markets in the nineteenth century that supercharged the use of numbers and saw the growth of number crunching as a profession, with actuaries, accountants, and stockbrokers joining in.
The invention of the computer in the middle of the last century changed the game again; the scale of number crunching expanded as machines replaced human labor. Until the personal computer was invented in the 1970s, though, those with access to large, expensive computer systems (generally large corporations, universities, and research units) had a decided advantage over the rest of us. The personal computer has democratized access not only to data but also to the tools needed to analyze it, allowing far greater numbers of businesspeople, investors, and journalists to do what only a select few could have done in a prior generation.
The Power of Numbers
As machine power continues to expand at an exponential pace, there are clear trends toward using numbers more in decision making. Businesses talk about using big data to guide what products they should produce, who they should sell them to, and at what price. Investors have also become more number oriented, with a subset of investors (the quants) putting their trust entirely in data and sophisticated tools for analyzing the data. In this section, I would like to focus on what it is about numbers that draws people to them.
Numbers Are Precise
Early in this book, I referenced Moneyball, the book about Billy Beane, general manager of the Oakland A’s, a professional baseball team.1 Baseball is a sport with a long history in the United States, and ironically for a sport that generates mounds of statistics about players, it has largely been run based on storytelling: by scouts about young prospects, by managers about the right situational moves to make during a game, and by players on how to hit or pitch. Billy Beane revolutionized the sport by putting his faith in numbers, using the voluminous statistics that came out of games to determine whom he would play on his teams and how the game would be played on the field. His success in creating a world-class team on a bare-bones budget not only made him a managerial star but also led others in baseball to imitate him. In many ways, Michael Lewis encapsulates the tension between storytelling and numbers and makes the argument for numbers when he describes traditional baseball’s reaction to Beane’s efforts as “an example of how an unscientific culture responds, or fails to respond, to the scientific method.”2
The notion that numbers are scientific and more precise than stories is deeply held, and as a result of this belief, the revolution that Billy Beane brought to baseball has spread far and wide. The field of sabermetrics, named by Bill James, the baseball statistician and Beane’s intellectual mentor, has now found a place in other sports, with managers and players drawing from it. Nate Silver, a statistician by training, has upended the political punditry business, using numbers to challenge what he considers the fluffy stories told by conventional political experts. Not surprisingly, the field that has been upended the most by the data revolution has been business, partly because there is so much data available to analyze and partly because the payoff to using those data well can be immense.
In chapter 2 I pointed out how social media has created a platform for storytelling, but it is interesting that social media has also shown us how much we all care about numbers. You measure the impact of a Facebook post by the number of likes and the reach of a tweet by its retweets, and there is evidence that you sometimes change what you write and say on social media to attract a larger audience.
Numbers Are Objective
At some point in time during our educational lives, we were taught (and often forgot) the scientific method. At least as described in a high school classroom, the essence of the scientific method is that you start with a hypothesis, conduct experiments or collect data, and then accept or reject the hypothesis based on the data. Implicit in this description is the message that a true scientist is unbiased and that it is the data that provides the answer to a question.
In chapter 2, when I looked at the dangers of storytelling, I noted how biases creep into stories and how difficult it is for listeners to push back in a storytelling world. One reason people are so attracted to numbers is the perception, fair or unfair, that numbers are unbiased and thus agenda-free. This presumption is not true, as you will see in the next section, but it remains undeniable that although listeners may feel less connected with a person who makes a case primarily with numbers than with one who tells stories, they are also more likely to view that person as objective.
Numbers Indicate Control
In The Little Prince, a children’s book, the Prince visits an asteroid and meets a man who counts the stars, insisting that if he were able to count them all, he would own them. That children’s tale has resonance, since many people seem to feel that measuring something or putting a number on it will allow them to control it better. Thus, even though a thermometer can only tell you that you have a fever and a blood pressure monitor provides a reading of your blood pressure at the time you take it, both seem to give you a sense of control over your health.
In business, the mantra has become: if you cannot measure it, you cannot manage it. That slogan is music to the ears of the firms that build, supply, and support measurement tools. There are areas of business in which being able to measure output and progress more accurately has led to significant progress. In inventory control, being able to track how much you have of each item in inventory in real time has allowed companies to simultaneously reduce their inventory and meet customer needs more promptly. In many segments of business, though, the reality is that the mantra has been modified to: if you measure it, you have already managed it. In other words, many businesses seem to have replaced serious analysis with more numbers.
CASE STUDY 4.1: THE POWER OF QUANT INVESTING
The power of numbers in investing is best seen in the growth of quant investing, whose promoters are open about the fact that their investing is based only on the numbers. In fact, they compete with one another in explaining how much they have turned their investment processes over to the data and the power of their data-analytic tools. The roots of quant investing go back to a surprising source, Benjamin Graham, considered by many to be the father of modern value investing. Graham defined multiple screens for finding undervalued companies, and while applying these screens was difficult in his time, when data was often collected by hand and the screening was manual, screening for stocks today is easy and almost costless.
The Markowitz revolution that gave birth to modern portfolio theory also was a contributor to quant investing. The approach to finding efficient portfolios, that is, portfolios that delivered the highest returns for a given level of risk, that Harry Markowitz developed in the 1950s, was a computational nightmare at that time, given the limitations of both data access and analysis. Today it is possible for an individual investor, armed with a personal computer and online data, to generate efficient portfolios on stock samples that would have taken weeks to create a few decades ago.
In the late 1970s as historical returns data and accounting data became more accessible, a new strand of academic research emerged, in which researchers pored over past data, looking for systematic patterns. The initial findings from these studies, that small market capitalization stocks earned higher returns than larger market capitalization companies and that low price earnings (PE) stocks outperformed the market, were labeled anomalies by academics, because they did not fit the classical risk and return model predictions. For investors and portfolio managers, they became opportunities, market inefficiencies to exploit to generate higher returns.
In the last decade, as more data became available, some of it in real time, and computational power exploded, quant investing morphed into new and potentially troublesome forms. In his book Flash Boys, Michael Lewis looks at a subset of investors called high-frequency traders, who use high-powered computers to scan real-time price data for mispricing and trade on it. These strategies are almost entirely number driven and are the logical end product of a purely number-driven investment process.
The Dangers of Numbers
Just as the strengths of storytelling become its weaknesses, the strengths of numbers can very quickly become weaknesses that can be exploited by number crunchers to push their agendas.
The Illusion of Precision
I used to use the words “precise” and “accurate” interchangeably, until a mathematician pointed out to me that the two words measured different things. He used a dartboard to illustrate the difference, noting that the precision of a model is captured by how close results from the model are to each other, given the same inputs, whereas the accuracy of a model is best measured by looking at how the results of the model compare with the actual numbers (figure 4.1).
image
Figure 4.1
Accuracy versus precision.
Put differently, you can create precise models that are inaccurate and accurate models that are imprecise. That contrast is worth noting, since the number-crunching disciplines often wrongly value precision over accuracy.
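The dartboard contrast can be made concrete with a small numerical sketch. The two “models” and the true value below are entirely hypothetical, chosen only to separate the two ideas: a tight cluster of estimates is precise, and a small distance from the truth is accurate.

```python
import statistics

# Hypothetical illustration: two "models" estimating a true value of 100.
true_value = 100.0

# Model 1: precise but inaccurate -- estimates cluster tightly, far from truth.
model_1 = [80.1, 79.9, 80.0, 80.2, 79.8]

# Model 2: accurate but imprecise -- estimates scatter widely around the truth.
model_2 = [85.0, 115.0, 95.0, 105.0, 100.0]

for name, estimates in [("Model 1", model_1), ("Model 2", model_2)]:
    spread = statistics.stdev(estimates)            # low spread = high precision
    bias = statistics.mean(estimates) - true_value  # small bias = high accuracy
    print(f"{name}: spread (imprecision) = {spread:.2f}, bias (inaccuracy) = {bias:+.2f}")
```

Model 1 has almost no spread but misses the truth by twenty; model 2 scatters widely but centers on the truth, which is exactly the trap of valuing precision over accuracy.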
The more you work with numbers, the sooner you come to the realization that while numbers look or can be made to look precise, they are anything but precise, especially in the context of forecasting the future. In fact, statistics tries to make this imprecision explicit through its estimation process, wherein you are taught that when you make an estimate, you should also reveal the potential error in that estimate in the form of a standard error. In practice, and especially so in business and investing, that advice is ignored, and estimates are treated as facts, often leading to disastrous consequences.
There is a final aspect of numbers that adds to the imprecision. One of the key findings in behavioral economics is that our response to numbers depends not only on their magnitude but also on how they are framed. That is the weakness that retailers exploit when they mark up the price on an item to $2.50 and take 20 percent off, since shoppers seem more inclined to buy that item than a similar one priced at $2.00. In one of the more famous examples of this framing bias, subjects in an experiment were asked to choose between two treatments for 600 people affected by a deadly disease with the results for each of the treatments as shown in table 4.1. With positive framing, treatment A was chosen by 72 percent of the subjects over treatment B, though they had exactly the same numerical end results. With negative framing, treatment A was chosen by only 22 percent of the subjects over treatment B, again with the same end results. In the context of business, the analogies would be making money (positive) and losing money (negative) and businesses surviving (positive) and businesses failing (negative), with the implication that framing the same numbers differently can lead to different responses.
Table 4.1
The Effect of Framing
Framing Treatment A Treatment B
Positive Saves 200 people 33.33% chance of saving all 600 people, 66.67% chance of saving no one.
Negative Kills 400 people 33.33% chance that no one will die, 66.67% chance that everyone will die.
CASE STUDY 4.2: THE “NOISY” HISTORICAL EQUITY RISK PREMIUM
The equity risk premium, simply put, is the price investors charge for investing in equities (which are a risky investment class) as opposed to keeping their money in a riskless investment. Thus, if investors can earn an annual, guaranteed (making it riskless) return of 3 percent, the equity risk premium is what they will demand over and above that number for investing in stocks. Intuitively, you would expect the equity risk premium to be a function of how risk-averse investors are, with more risk aversion translating into a higher premium, and how risky they perceive equities to be as an investment class, with the perception that there is more risk leading to a rise in the equity risk premium.
Given that the equity risk premium is a key input for both corporate finance and valuation, how can you estimate this number? Most practitioners turn to history, looking at what investors have earned in the past on stocks relative to a riskless investment. In the United States, that database goes back a century or longer, though the stock market has expanded and matured over that period. If you assume that the U.S. Treasury cannot default and that the securities that it issues (Treasury bills and Treasury bonds) are thus guaranteed, risk-free investments, you can estimate historical equity risk premiums from past data. For instance, in the 1928–2015 time period, U.S. equities earned 11.41 percent on average each year, and the annual return on Treasury bonds was 5.23 percent over the same period. The difference of 6.18 percent is labeled a historical equity risk premium and used by practitioners as the estimate for the future.
Probing that number a little more, it should be noted that this average comes from stock returns that are volatile, with returns ranging from a high of almost 50 percent in 1933 to a low of close to −44 percent in 1931. Figure 4.2 captures this volatility in stock returns.
image
Figure 4.2
Annual returns on U.S. stocks and Treasury bonds from 1928 to 2015.
Source: Damodaran Online (http://pages.stern.nyu.edu/~adamodar).
The equity risk premium estimate of 6.18 percent now comes with a warning label in the form of a standard error of 2.30 percent. What does that mean? Loosely speaking, it suggests that your estimate could be wrong by as much as 4.60 percent in either direction, meaning that your true equity risk premium could be as low as 1.58 percent or as high as 10.78 percent.3
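The arithmetic behind that warning label is simple enough to sketch, using the estimate and standard error quoted above. The two-standard-error band is the conventional rough 95 percent interval around an estimated average:

```python
# The arithmetic behind the "warning label": a two-standard-error band
# around the historical equity risk premium estimate (1928-2015).
mean_premium = 6.18    # historical average premium, in percent
standard_error = 2.30  # standard error of that average, in percent

low = mean_premium - 2 * standard_error   # 6.18 - 4.60 = 1.58
high = mean_premium + 2 * standard_error  # 6.18 + 4.60 = 10.78
print(f"Rough 95% band for the true premium: {low:.2f}% to {high:.2f}%")
```

A band that runs from under 2 percent to almost 11 percent is a reminder that the 6.18 percent figure is an estimate with substantial noise, not a fact.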
The numbers get even shakier if you bring in the fact that your estimation choices affected your estimate. Rather than use the 1928–2015 time period, you could have used a shorter period (say the last ten or fifty years) or a longer one (since you have some databases that go back to 1871). Instead of using the ten-year Treasury bond, you could have used a three-month Treasury bill or a thirty-year bond. Finally, you could have replaced the arithmetic average with a compounded or geometric average. Each of these choices would have yielded different estimates of the equity risk premium, as evidenced in table 4.2.
Table 4.2
Estimates of the Annual Equity Risk Premium for the U.S. Under Different Estimation Choices
  Arithmetic average Geometric average
Stocks—Treasury bills Stocks—Treasury bonds Stocks—Treasury bills Stocks—Treasury bonds
1928–2015 7.92% 6.18% 6.05% 4.54%
1966–2015 6.05% 3.89% 4.69% 2.90%
2006–2015 7.87% 3.88% 6.11% 2.53%
Thus, using a different time period, a different measure for a riskless investment, and even a different manner of averaging returns can yield very different estimates of the equity risk premium for the United States. The equity risk premium is definitely an estimate, not a fact.
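A toy example, with a made-up two-year return series, shows why the averaging choice alone moves the answer: the arithmetic average treats each year independently, while the geometric average asks what constant return would compound to the same end value.

```python
import math

# Hypothetical two-year return series illustrating why the averaging choice matters.
returns = [0.50, -0.30]  # +50% in year one, -30% in year two

# Arithmetic average: (0.50 - 0.30) / 2 = 10% per year.
arithmetic = sum(returns) / len(returns)

# Geometric average: the constant annual return that compounds to the
# same ending value. 1.50 * 0.70 = 1.05 over two years.
growth = math.prod(1 + r for r in returns)
geometric = growth ** (1 / len(returns)) - 1  # about 2.47% per year

print(f"arithmetic: {arithmetic:.2%}, geometric: {geometric:.2%}")
```

The geometric average always sits at or below the arithmetic average when returns are volatile, which is why the geometric entries in table 4.2 are uniformly lower than their arithmetic counterparts.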
The Illusion of Objectivity
The fact that the way you frame numbers can change the way people respond to them provides a segue into the second delusion about numbers—numbers are objective and number crunchers have no agendas. Really? As you will see in detail in the next chapter, the process of collecting, analyzing, and presenting data provides multiple opportunities for bias to enter the process. To make things worse, in the hands of a skilled number cruncher, this bias can be hidden far better with numbers than with stories.
From the listeners’ perspective, there are different biases that come into play, in which the way you look at the numbers and the ones you choose to focus on will depend on your prior beliefs. To provide an example, I estimate the effective tax rates paid by publicly traded companies in the United States at the start of every year on my website. In the interests of providing comprehensive statistics, I report the average tax rates for each sector using three different approaches for averaging: a simple average of the tax rates across companies in the sector, a weighted average of the tax rates across companies in the sector, and a weighted average of the tax rates across only money-making companies in the sector. Each year there are journalists, politicians, and business trade groups that use my tax rate data, often to support very different agendas. The business trade groups, intent on showing that they pay their fair share of taxes, pick the tax rate measure that yields the highest value to make their case. Advocacy groups that believe U.S. corporations don’t pay their fair share in taxes look at the same table and find the tax rate measure that yields the lowest value to bolster their arguments. Both sides argue that the facts (and numbers) are on their side and neither will admit to the existence of bias.
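The three averaging approaches can be sketched as follows. The sector data here is invented purely for illustration, but it shows how the same underlying companies can yield three very different “average” tax rates:

```python
# Hypothetical sector data: (effective tax rate, taxable income in $ millions).
# Negative income marks a money-losing company.
companies = [(0.35, 500.0), (0.10, 2000.0), (0.25, -300.0), (0.40, 100.0)]

# 1. Simple average of tax rates across all companies in the sector.
simple = sum(rate for rate, _ in companies) / len(companies)

# 2. Income-weighted average across all companies.
weighted = (sum(rate * income for rate, income in companies)
            / sum(income for _, income in companies))

# 3. Income-weighted average across money-making companies only.
profitable = [(r, inc) for r, inc in companies if inc > 0]
weighted_profitable = (sum(r * inc for r, inc in profitable)
                       / sum(inc for _, inc in profitable))

print(f"simple: {simple:.1%}, weighted: {weighted:.1%}, "
      f"profitable-only weighted: {weighted_profitable:.1%}")
```

With this invented data, the simple average is roughly double the weighted averages, so an advocate on either side of the tax debate can pick the measure that suits the argument.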
CASE STUDY 4.3: NUMBERS AND BIAS WITH THE EQUITY RISK PREMIUM
In case study 4.2, I explained how making different estimation choices can yield very different estimates of the equity risk premium, ranging from a low of 2.53 percent (the ten-year geometric average premium for stocks over Treasury bonds) to a high of 7.92 percent (the arithmetic average premium for stocks over Treasury bills from 1928 to 2015). That should not surprise you, given the standard error of 2.30 percent that I estimated for the risk premium from 1928 to 2015.
The equity risk premium that you choose to use has consequences, and one venue where the effects can be large is in the regulation of utilities (power, water) in the United States. For decades, the companies in these businesses have been allowed to operate as monopolies in their regional domains, but in return, regulatory commissions determine how much utilities can increase the prices of their products. In making this judgment, these commissions look at what a fair rate of return should be for investors in these companies and then allow the product price to rise to deliver this return. That fair rate of return for much of the last few decades has been computed with the equity risk premium as a key ingredient, with the rate of return increasing as the equity risk premium goes up.
Not surprisingly, the regulated companies and the regulatory authorities have very different perspectives on which number they should be using from table 4.2. The companies push for the highest premium they can get away with, perhaps even the 7.92 percent, because the higher premium translates into a higher rate of return and more substantial price increases. The regulatory commission, on the other hand, would prefer the use of a lower premium, since that will then keep a lid on product price increases and make consumers happier. Each side claims that its estimate of the premium is a fact and it is often left to legal forums or arbitration panels to split the difference.
The Illusion of Control
Measuring something does not mean that you are controlling it. Just as a thermometer can tell you that you have a fever but cannot treat it, measuring the standard deviation of a portfolio only tells you that it is risky but does not protect you from that risk. That said, it is true that you may feel more in control when you are able to measure something and that the more time you spend with numbers, the more you may use measurement tools as a crutch.
In corporate finance and valuation, areas in which I spend most of my time, I notice this phenomenon play out in many places. The first is in the use of what if or sensitivity analysis, often as an addendum to a valuation or a project analysis. In most cases, these analyses happen after the decision has already been made, and the only explanation I can provide for why analysts spend so much time on them is that it makes them feel more in control. The second is in the attention that analysts give to small and often irrelevant details. I say, only half jokingly, that when in doubt, I add decimals to my final numbers, whether they be valuations of companies or rates of return for projects.
The danger with deluding yourself that you are in control, just because you have a sophisticated measurement tool, is not only that you may let the numbers overwhelm your common sense but that you will not prepare yourself properly for the dangers ahead. That, unfortunately, was what happened at banks around the world during the banking crisis in 2008. In the two decades prior to the crisis, these banks had developed a risk measure called “value at risk” (VAR), which allowed them to put a number on their worst-case losses from their businesses. In the intervening period, risk-management experts and academics refined VAR to make it more powerful and more complex, with the stated intent of making it more effective. As bank managers became increasingly reliant on VAR, they also let down their guard and concluded that if the computed VAR was within their defined safe limits, their risk taking was also under control. In 2008 that delusion fell apart, as the weaknesses in VAR’s core assumptions were exposed and banks that thought they were protected from catastrophic risk found that they were not.
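A minimal historical-simulation version of VAR can be sketched in a few lines. Actual bank models were far more elaborate, but the backward-looking core is the same: rank past profit-and-loss outcomes and read off a loss threshold at a chosen confidence level. The function name and the daily P&L series here are hypothetical.

```python
# A minimal historical-simulation VaR sketch: the loss level exceeded on
# only (1 - confidence) of past trading days.
def historical_var(pnl_history, confidence=0.99):
    """Return the historical loss threshold at the given confidence level."""
    losses = sorted(-p for p in pnl_history)  # losses as positive numbers
    index = int(confidence * len(losses))     # position of the percentile
    return losses[min(index, len(losses) - 1)]

# Hypothetical daily P&L (in $ millions) for a trading desk.
pnl = [1.2, -0.8, 0.5, -2.1, 0.9, -0.3, 1.5, -4.7, 0.2, -1.0]
print(f"99% one-day VaR: ${historical_var(pnl, 0.99):.1f}M")
```

Note what the sketch makes visible: the measure is built entirely from past outcomes, so if the future produces losses outside the historical record, as it did in 2008, the computed number offers no protection at all.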
CASE STUDY 4.4: THE SAD (BUT TRUE) STORY OF LONG-TERM CAPITAL MANAGEMENT
If you trust numbers too much, you should pay heed to the experiences of Long-Term Capital Management (LTCM). The firm, which was founded in the early 1990s by ex–Salomon Brothers trader John Meriwether, promised to bring together the best minds in finance to find and take advantage of mispricing opportunities in the bond market. Delivering on the first part of the promise, Meriwether lured the best bond traders from Salomon and brought on board two Nobel Prize winners, Myron Scholes and Bob Merton. In the first few years of its existence, the firm also lived up to the second part of the promise, earning extraordinary returns for the elite of Wall Street. In those years, LTCM was the envy of the rest of the Street as it used low-cost debt to augment its capital and earn substantial returns on mostly safe investment opportunities.
As the funds at their disposal got larger, the firm had to widen its search to include riskier investments, though it did find those investments by analyzing the data. By itself, this would not have been fatal, but the firm continued to use the same leverage on these riskier investments as it did on its safe investments. It did so because the complex models it had built told it that while the individual investments were risky, based on their history, they would not move together and that the portfolio was therefore a safe one.
In 1998, the strategy unraveled as a collapse in one market (Russia) spread into other markets as well. As the portfolio dropped in value, LTCM found itself facing the downside of its size and high leverage. Unable to unwind its large positions without affecting market prices and facing pressure from its lenders, LTCM faced certain bankruptcy. Fearing that LTCM’s failure would bring down other investors in the market, the Federal Reserve engineered a bank-led bailout of the firm.
What are the lessons we can learn from the fiasco? Besides the cynical one that it is good to have friends in high places, you could argue that the fall of LTCM teaches you that having access to the most brilliant minds, the most up-to-date data, and the best models in investing or business does not equate to success.
The Intimidation Factor
If you are a corporate financial analyst, a consultant, or a banker facing a skeptical audience, one simple technique to silence the room is to open up a complex spreadsheet filled with numbers. This works particularly well if your audience is not comfortable with numbers, but even if you have a numerically literate audience, the human mind is generally incapable of looking at a hundred numbers on a page and making sense of them.
The fact that numbers can be intimidating is not a secret to either number crunchers or their audiences. For number crunchers, that intimidation works to cut off debate and prevent probing questions that might uncover large and potentially fatal weaknesses embedded in the numbers. For those in the audience, the numbers offer an excuse for not doing their homework. When things fall apart, as they did with VAR in 2008, both the number crunchers and the number users blame the model for their failures.
I know that I am capable of using numbers to bludgeon those who disagree with me on my valuations or investment judgments. When asked a question that cuts to the heart of my investment thesis, perhaps exposing its weakness, I feel the urge to pull up an equation that will either deflect the question or leave the questioner uncertain about the basis for his or her question, but I also know that doing so will only make my judgments less sound.
The Imitation Problem
If the numbers are all that drive your decisions, as some purist number crunchers claim, you are in big trouble as the decision maker, for two reasons. The first is that you have positioned yourself perfectly for being outsourced, replaced not just by a cheaper number cruncher in a different location but by a machine. After all, if your strong point is that you can be machine-like in your decision making, objective and driven just by the numbers, a machine would be better at that task than you will ever be. That, of course, is the promise of young financial technology firms that offer robo-investing advice: they ask investors for numbers (age, income, financial savings, and retirement plans) just as a financial advisor would, and the computer then generates an investment portfolio based upon the numbers.
If your defense against the outsourcing is that you have better data and more powerful computers than most other people, you open yourself up to a second problem, which is that a purely number-driven decision process is easy to imitate. Thus, if you are a “quant hedge fund” and build an elaborate quantitative model to find the best stocks to buy and sell, all I would need to do is be able to see the stocks you buy and sell, and with a powerful enough computer of my own, I should be able to replicate your strategy.
The Lemming Problem
Let’s assume that you live in big data heaven, where you and everyone else have huge databases and powerful computers to analyze and make sense of the data. Since you all share the same data, and perhaps even use the same tools, you are going to highlight the same opportunities, generally at about the same time, and seek them out for profit. That process is going to create “herding,” with everyone buying and selling the same stocks at the same time. So what? That herding will create momentum, which will reinforce your decisions at least in the short term but will also cause you to be collectively wrong if there is a structural shift in the underlying process (business, market, or economy). The data, after all, comes from the past, and if the future differs from the past as a consequence of that structural shift, the predictions based upon the data will come apart.
The implications are sobering. As we move increasingly to a data-driven world, and more and more people have access to that data, it stands to reason that we will see more booms and busts than we have historically. Bubbles in markets will be bigger than they used to be, and when these bubbles burst, as they inevitably will, the carnage will be greater as well.
Storytelling as Antidote
If numbers are dangerous because they come with the illusions of control, precision, and objectivity and can be easily imitated, how will adding stories to numbers reduce those problems? First, the nature of stories is that they are fuzzy and remind us that precise as the numbers look, changing your story will change the numbers. Second, that recognition also will dispel the notion that you somehow can deliver the numbers you have forecast, since stories can be changed by forces out of your control. Third, when you are forced to unveil the story that backs your numbers, your biases are visible not just to the rest of the world but to yourself. I also believe that the capacity to combine stories with numbers makes it more difficult for others to imitate you, if you are successful. Unlike models that can be easily copied, storytelling is more nuanced, personal, and difficult to replicate.
The one problem that adding stories to numbers will not solve, at least in the near term, is herding. The groupthink that leads people to pile into the same stocks and investments because the numbers lead them there will also lead them to reinforce one another’s stories. There is an argument to be made, though, that the best way to break the madness of crowds is with a combination of an alternative (and more realistic) story, backed up by numbers to give it credibility.
CASE STUDY 4.5: THE FALL OF QUANT INVESTING
In case study 4.1, I presented quant investing as the positive culmination of the data revolution, a version of Moneyball for financial markets, in which number crunching replaced the hand waving and storytelling of an earlier age. In this follow-up, I want to look at how the dangers of numbers—they are imprecise, are vehicles for bias, and provide the illusion of control—have played out in the downfall of at least some aspects of quant investing.
Let’s start with the imprecision of the numbers. The good news, if you are playing Moneyball in financial markets, is that those markets create immense amounts of data, some from companies’ financial filings but far more from the market itself (price changes, trading volume). The bad news is that the data is extraordinarily noisy, as you can see, even at the market level, in my computations of standard errors for the equity risk premium in case study 4.2. Almost every quant strategy is built on past data, and its promise (usually taking the form of an alpha or excess return) should come with the qualifier that the past is not a predictor of the future and that even if it is, there is a great deal of uncertainty about outcomes.
As for bias, much as you try not to, it is impossible to keep your biases from creeping into not only how you crunch the numbers but also how you read the data entrails. Once you create a quant strategy, attach your name to it, and sell it to clients, you are irrevocably on a biased path, where you will find confirmation that your strategy works, even when it is on the verge of collapse.
Finally, it took the market crisis in 2008 to reveal to hedge funds how little control they had over investment outcomes. As developed markets went through contortions that had never been seen in recent history, models that had been built carefully on historical data not only gave false signals but did so for lots of investors at the same time.
I am not ready to bury quant investing yet, since the forces that brought it to the forefront are still around us, but I think that its successes and failures reveal both the promise and peril of numbers. Quant investing, to succeed and prosper, has to find a place for storytelling and narrative in conjunction with the numbers, and if it does, it will not only be more successful but it will also be more difficult to imitate and outsource.
Conclusion
I am naturally drawn to numbers, but one of the ironies of working with them is that the more I do, the more skeptical I become about purely number-driven arguments. In my work with financial data, both accounting and market driven, I have learned how much noise there is in that data and how difficult it is to make predictions based on it. I believe in the scientific method, but I don’t believe there are many pure scientists out there. All research is biased, with the only questions being the direction and magnitude of the bias. Thus, it is my job, when presented with a numbers-driven argument, to probe for the biases of the person making the argument and, once I find them, to adjust the numbers to reflect that bias. Finally, I have learned that it takes hubris on my part to believe that just because I put a number on a process or variable, I control it or even understand it. Thus, I can offer you a dozen different numerical measures for risk, most with great academic pedigrees, but I struggle on a daily basis to understand what exactly risk is and how it affects us as investors.