
IF ONLY…PROBABILITY AND COINCIDENCES, GOOD, BAD, AND UGLY

Most people have a very poor feel for probability; their vocabulary for the notion is impoverished and, for some, restricted to variants of three terms: miraculous, 50-50, and absolutely certain. This indifference to the topic reveals itself in all areas of life, including the way people talk, write, and think about their lives. I sometimes ask students which self-proclaimed psychic is more probabilistically impressive—the one who correctly predicts 54 of 100 coin flips or the one who correctly predicts 26 of 100 coin flips? Most say the first one and don't see how impressively unlikely the second one's predictions are. But anyone who could predict so few correct answers would be invaluable; just negate everything he or she said, and you'd have a true clairvoyant.
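The arithmetic behind that claim is a routine binomial calculation. A minimal sketch in Python (my own illustration, assuming fair coins and independent guesses) makes the contrast vivid: getting 54 or more flips right happens by chance roughly a quarter of the time, while getting 26 or fewer right happens roughly once in a million tries.

```python
from math import comb

def prob_exactly(k, n=100):
    """Probability of guessing exactly k of n fair coin flips correctly."""
    return comb(n, k) / 2**n

# The "impressive" psychic: 54 or more correct out of 100
p_high = sum(prob_exactly(k) for k in range(54, 101))

# The seemingly unimpressive psychic: 26 or fewer correct out of 100
p_low = sum(prob_exactly(k) for k in range(0, 27))

print(f"P(54 or more correct) = {p_high:.3f}")  # about 0.24
print(f"P(26 or fewer correct) = {p_low:.1e}")  # roughly one in a million
```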

This cluelessness about the pervasive mathstuff of chance is accentuated when the events in question are not clear-cut like coin flips but are more nebulous. For example, let's assume something awful occurs, say, a fatal car accident. It's hard not to review the events just prior to the accident and conclude that had any one of them been different, the accident would not have occurred. Furthermore, we can wonder about how probable each of the preceding events was. What if he had bought milk earlier? What if the grocery store had longer hours and he didn't decide to go to the closer convenience store? What if he hadn't taken the shortcut to the store? What if he hadn't made the traffic light and arrived at the next intersection a minute later? What if the driver of the other car had not been fired that morning? What if he hadn't been extended credit at the bar? What if he hadn't been kicked out of the bar at the time he was? And on and on.

We might try to assign a probability to each of these more-or-less independent events. Recall a couple of very useful basic ideas of probability: (i) events are independent if the occurrence of one doesn't affect the probability of the others occurring and (ii) the probability of a number of independent events occurring is obtained by multiplying their individual probabilities. That is, if some event A occurs 1/4th of the time and an independent event B occurs 1/3rd of the time, then B occurs in 1/3rd of those instances in which the event A has occurred, so the events A and B occur together 1/3 × 1/4 or 1/12th of the time. Using these ideas, we conclude that the probability of all the accident's antecedents occurring is almost impossibly minuscule.
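In code the rule is nothing more than multiplication. A tiny sketch with Python's exact fractions (my own illustration of the numbers just mentioned):

```python
from fractions import Fraction

p_A = Fraction(1, 4)  # event A occurs 1/4th of the time
p_B = Fraction(1, 3)  # independent event B occurs 1/3rd of the time

# For independent events, P(A and B) = P(A) * P(B)
print(p_A * p_B)  # 1/12
```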

If, to illustrate, we roll a pair of dice two times, flip a coin three times, and spin a roulette wheel twice, the probability of obtaining a 7, an 11, three heads, and a 27 and a 31 is, since the events are independent, equal to the product 6/36 × 2/36 × 1/2 × 1/2 × 1/2 × 1/38 × 1/38, which equals 1/1,247,616 or about .0000008. This minuscule probability is correct but misleading in the sense that any sequence of independent events all occurring will have a vanishingly small probability.
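The same multiplication can be carried out mechanically for the dice, the coins, and the roulette wheel (a quick check of the product above, assuming an American wheel with 38 pockets):

```python
from fractions import Fraction
from math import prod

# Probabilities of the individual independent events
events = [
    Fraction(6, 36),  # a 7 with a pair of dice
    Fraction(2, 36),  # an 11 with a pair of dice
    Fraction(1, 2),   # heads
    Fraction(1, 2),   # heads
    Fraction(1, 2),   # heads
    Fraction(1, 38),  # a 27 on the roulette wheel
    Fraction(1, 38),  # a 31 on the roulette wheel
]

p_all = prod(events)
print(p_all)         # 1/1247616
print(float(p_all))  # about 0.0000008
```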

We can, of course, also carry out the above thought experiment for positive events of interest, say, meeting one's future spouse at a protest in Madison, Wisconsin, or having one's book make the New York Times bestseller list. What if you hadn't gone to that demonstration or hadn't written that piece for the book review? And if we ask for the probability of the sequence of events leading up to that significant event, the answer will always be that it's tiny, a different tiny number, but still quite tiny. If long enough, any particular sequence of events will be very, very improbable, so the wonderment that is often a consequence of discovering this improbability is natural, but isn't really justified.

Another example: I mentioned earlier that there are almost 10 to the 68th power—a one with 68 zeroes after it—orderings of the 52 cards in a deck. Truly an unfathomable number, but it's not hard to devise even everyday situations that give rise to much larger numbers. Now if we shuffle this deck of cards for a long time and then examine the particular ordering of the cards that happens to result, we would be justified in concluding that the probability of this particular ordering of the cards having occurred is approximately 1 chance in 10 to the 68th power. This certainly qualifies as minusculely minuscule. Still, we would not be justified in concluding that the shuffle's ordering could not have possibly resulted in this particular ordering because its a priori probability is so very, very tiny. Some ordering had to result from the shuffling, and this one did. Incidentally, the vast, vast majority of orderings of the deck that occur during shuffling have never appeared in the history of human card playing.
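That "almost 10 to the 68th power" is simply 52 factorial, and a two-line computation confirms it (my own aside):

```python
import math

orderings = math.factorial(52)  # number of orderings of a 52-card deck
print(f"{orderings:.3e}")       # about 8.066e+67
print(len(str(orderings)))      # 68 digits, i.e., a bit under 10**68
```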

Resistance to these sorts of considerations stems, I think, from a religious belief that the event in question, whatever it is, with its trail of antecedents was somehow ordained, “meant to be,” cosmically special, or a “miracle.” If the issue concerns playing cards, most people will buy the probabilistic analysis. Not so with more emotion-laden matters. I've always found it hard to argue with people making such claims with the glaze of certainty in their eyes and their child or a loved one in their thoughts. (More than hard, it's sometimes cruel.)

No one in that frame of mind is here now, so I'll simply make an observation. If an event or sequence of events gives rise to a supposed miracle that is deemed a divine intervention, the one deeming it so must face some obvious questions. Why, for example, do so many people refer to the rescuing of a few children after a destructive tornado as a miracle when they chalk up the death of perhaps dozens of equally innocent children in the same disaster to a meteorological anomaly? It would seem either both are the result of divine intervention or both are a consequence of atmospheric conditions. The same point holds for other tragedies. If a recovery from a disease after a long series of struggles and treatments is considered a miraculous case of divine intervention, then to what do we attribute the contracting of the disease in the first place?

Incidentally, this religious nonsense seems at odds with what the book of Ecclesiastes wisely says: “Time and chance happeneth to them all.”1

A related factor in our reaction to fortuitous events is how easily imaginable or psychologically available seemingly “nearby” outcomes are. It's easy to imagine missing the traffic light and the accident not occurring. In the other direction, it's also easy to imagine not going to the event and not meeting one's future spouse. The word nearby is in quotes because the parallel sequences of events whereby, say, the accident occurs or doesn't occur or you meet or don't meet someone are not necessarily close except in a psychological sense.

Your lottery ticket may differ slightly from the multimillion-dollar winning ticket in that 4 or 5 of the numbers you picked are the same or maybe differ by only 1 from the numbers on the winning ticket. You consider yourself unlucky even though your ticket wasn't close to the winning ticket in any but a psychological sense, which makes winning easier to imagine. Your ticket might as well have had every number numerically distant from the numbers on the winning ticket. There is no reasonable metric with respect to which one ticket is “close” to another or is “almost” the winning ticket.

(I was once on a TV show devoted to science that attempted to make clear how unlikely winning the lottery really is. With cameras running behind me I went into a New York convenience store crowded with ticket buyers, bought a ticket from the clerk, and with a flourish tore it up in front of everyone. You would think that I had kicked a puppy to judge from people's surprise and outrage.)

Finally, I note that this matter of accidental yet life-changing occurrences is, of course, related to the sheer ubiquity of coincidences. Most people don't realize that the number of possible coincidences we might observe grows exponentially with the ever-increasing torrent of numbers, acronyms, facts and factoids, blogs, tweets, ads, and Big Data in general that wash over us every waking moment. The vast, vast majority of this stupendous number of coincidences are meaningless despite the widespread tendency to read personal significance into just about everything.

A standard example of how easily coincidences can arise is provided by the first letters of the planets in order of their distances from the sun and the first letters of the months in calendar order. Is the SUN in MVEMJSUNp significant? (The lower case p reflects Pluto's demotion.) Is the JASON in JFMAMJJASOND significant? Of course not. Neither is the notable (but only notable) biographical fact that Mark Twain was born on the day Halley's comet appeared in 1835 and died on the day it returned in 1910.

Another example is illustrated by the diagram below, a random sequence of 250 simulated coin flips, the Hs or Ts each appearing with probability 1/2. Note the number of runs (strings of consecutive Hs or Ts) and the way there seem to be coincidental clusters and other patterns. If you felt you had to account for these, you would have to invent convoluted explanations that would of necessity be quite false. Random sequences rarely look completely random.

[Figure: a simulated random sequence of 250 coin flips (Hs and Ts), showing the runs and apparent clusters described above.]
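A sequence like the one in the diagram takes only a few lines to generate. The sketch below (my own illustration) produces 250 simulated fair flips and reports the longest run, which is typically a surprisingly long 7 or 8 flips:

```python
import random
from itertools import groupby

# 250 simulated fair coin flips
flips = "".join(random.choice("HT") for _ in range(250))

# Lengths of the runs, i.e., maximal strings of consecutive Hs or Ts
run_lengths = [len(list(group)) for _, group in groupby(flips)]

print(flips)
print("number of runs:", len(run_lengths))
print("longest run:", max(run_lengths))  # typically around 7 or 8
```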

Finally, let me illustrate with a more extended instance I've written about before. It involves 9-11, a numerical phrase that is happily and finally appearing less and less frequently. On Wednesday, September 11, 2002—9/11/02—the New York State lottery numbers were 911, an eerie coincidence that set many people to thinking or, perhaps more accurately, to not thinking. A natural question comes to mind: How likely is this? After all, the lottery took place in New York State on the anniversary of the mass murder exactly one year before. Remember, however, that on any given day, each of the 1,000 possibilities—000, 001,…, 233,…, 714,…, 998, 999—is as likely to come up as any other. This is true of September 11 as well, so the probability that 911 would come up on that date is simply 1 in 1,000. This probability is small, but not minuscule.2

This leads to what I like to call the Fundamental Confusion of Coincidences: the probability of an unusual event or sequence of events is usually very small, in fact often minuscule, yet the probability of some event or sequence of events of the same vaguely defined general sort is usually quite high. People regularly confuse these two probabilities.

Thus the broader, more appropriate question that should come to mind regarding this 911 story is: What is the probability that some event of this same general sort—something that is resonant with the date or is likely to stimulate us to think of it—would occur on September 11? The answer is impossible to say with any precision, but it is quite high. First off, there are two daily drawings in the New York State lottery, so there were two chances for 911 to come up that day, increasing the probability to (a bit under) 1 in 500. More importantly, there were innumerable other ways for an uncanny coincidence to occur.
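The "a bit under 1 in 500" figure follows from the complement rule. A quick check, assuming the two daily drawings are independent:

```python
# Probability that neither of the two independent daily drawings comes up 911
p_neither = (999 / 1000) ** 2

# Probability that at least one of them does
p_at_least_one = 1 - p_neither

print(f"{p_at_least_one:.6f}")      # 0.001999, a bit under 1 in 500
print(f"{1 / p_at_least_one:.2f}")  # about 500.25
```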

How many addresses or license plates, for example, have 911 in them? At each of these addresses and for each of these vehicles, something could have occurred that caused people to think of September 11. Possibilities include an accident, murder, or arrest of someone suspected of terrorism, related to a victim of the attack, or otherwise associated with it. Or consider sports scores and statistics. The stock market is also a major producer of numbers. Another “close” example is the September 10 closing value of the September S&P 500 futures contracts. It was 911.

But this is all too easy to do. There are an indeterminate number of ways for such events to come about even though the probability of any particular one of them is tiny. Furthermore, after such an event occurs, people glom onto its tiny probability and, falling victim to the Fundamental Confusion, neglect to ask the much more pertinent question: How likely is something vaguely similar to this to occur?

Keep this in mind when you read the following excerpt from the great science fiction author Arthur C. Clarke. In his 1973 novel, Rendezvous with Rama, Clarke wrote: “At 0940 GMT on the morning of September 11 in the exceptionally beautiful summer of the year 2077, most of the inhabitants of Europe saw a dazzling fireball…. Somewhere above Austria it began to disintegrate…. The cities of Padua and Verona were wiped from the face of the earth, and the last glories of Venice sank forever.”3

Similar coincidences occur in our personal lives and biographies, where their insignificance may be even harder to acknowledge because of our familiarity with everyday situations. Our always active pattern-seeking brains lead us to see meaning and an agent behind almost everything. This may be a consequence of our evolutionary development in information-poor environments, but whatever the cause our brains have been primed to see patterns that aren't there, to infer agency where there is only randomness, and to invent nonexistent, seemingly explanatory entities. Sadly, one very important realm in which the assertion of randomness is false is natural selection, which many creationists continue to proclaim is a completely random process.

As I've noted elsewhere, surely the most amazing coincidence of all would be the complete absence of all coincidences.

INNUMERACY, A MATHEMATICIAN READS THE NEWSPAPER, AND THEIR AFTERMATH

For a while after coming to Temple University from the University of Wisconsin I taught an outside course on basic mathematics, mainly arithmetic, to nurses at a local medical school for a little extra money. The people in my class—mostly women, a couple of men—seemed temperamentally suited to the profession, appearing to be compassionate, empathetic, and caring. Unfortunately most of them were quite innumerate. Try as I did to get across the rudiments of percentages, numerical prefixes, proportions, simple calculations, and unit conversions, many never managed to distinguish 2 percent from .02 percent nor could they, despite my pleading, believe it was even very important to do so. The following semester I was scheduled to teach the course again and told the program supervisor that this level of mathematical naïveté was dangerous and that I wouldn't want to be attended to by many of the soon-to-be nurses were I to be hospitalized. The next day I received a letter informing me that my services would no longer be needed.

Although I'd been aware of widespread innumeracy for a long time, this incident and the general dismissal of the relevance and importance of mathematics, even by some very accomplished people in organizations (including writers’ groups) to which I belonged, were exasperating and catalyzing. I wrote a piece on the subject for Newsweek, and at the behest of Rafe Sagalyn, who noted it and soon became my agent, I expanded it into my book Innumeracy, which was published by Farrar, Straus and Giroux in 1989. A book on the mathematical aspects of various topical issues and the consequences of mathematical illiteracy, it became a national bestseller that remained on the New York Times bestseller list for four and a half months.4 Although I'd had the experiences related above and others of the same type, its success was a surprise and something of a turning point in my and my family's life. I don't want to be falsely modest; it is a very good book that promulgated an important message, imparted some essential mathematical ideas, and popularized a most useful word (humblebrag). I was surprised at the wonderful reviews the book received, among them one by the aforementioned Douglas Hofstadter, author of Gödel, Escher, Bach, who enthused, “Our society would be unimaginably different if the average person truly understood the ideas in this marvelous and important book.”5

Still, an important point I'd like to emphasize is that, as with many such turning points, a huge element of luck was involved. Probably essential to the book's success is that the date of publication just happened to coincide with the release of the first widely heralded report on the mathematical failings of American students. This news hook gave warrant to people wishing to mention the book in places and contexts other than the book section. Besides the Newsweek piece mentioned above, I had also recently written a piece on innumeracy and literature for the New York Times, and these two articles led to an appearance on NBC's Today show, which would not have occurred except for a last-minute cancellation by another guest. After these fortuitous events, I witnessed the exhilarating effects of the media's hall of mirrors, whereby one appearance leads to another, which spawns several more. No online social media or blogs existed in those antediluvian times, so the appearances were on local television and radio across the country and, as the book became better known, in the national media as well, from NPR and talks at the Smithsonian, the National Academy of Sciences, and Harvard's Hasty Pudding Club, to Larry King Live, People magazine, and Late Show with David Letterman. The book helped jump-start a new wave of discussions about mathematical pedagogy (culminating in the so-called Common Core and a slew of other STEM-related initiatives stressing a more conceptual understanding of mathematics) and was even the answer to a question on the popular game show Jeopardy.

One appearance was particularly memorable. I was being interviewed on a local morning show in Texas by a former beauty queen who seemed miffed that she had been dragooned into interviewing a mathematician with unruly hair. Her assistant stood behind me holding up posters from which she read the questions in a singsong, perfunctory way, her eyes glazing over until the next poster went up behind me. Exhausted from the book tour and a bit annoyed, I waited until the assistant put the poster down and asked the interviewer to repeat the question. As I suspected, she wasn't even aware of what she had asked me and stumbled badly. After the shortened interview she hurried off the set, and the segment producer thanked me for underlining what she had been telling the station management for months.

The whole experience was fun. I had caught a wave and felt like a passive passenger on a jerky Ferris wheel as the interviews, profiles, and speaking invitations poured in. The publicity has, of course, subsided, but it led to a change in my self-conception from mathematician to writer. This transition was facilitated by the aforementioned fact that my educational background at Wisconsin was a bit anomalous for a mathematician. My choice of a possible undergraduate major changed from classics and English to philosophy and physics and ultimately to mathematics. I have remained throughout a mathematics professor at Temple University, a state university in Philadelphia, where I still enjoy teaching a wide variety of courses to a wide variety of students. I have, however, devoted much of my time to writing eight (this is the ninth) books ranging from A Mathematician Reads the Newspaper6 to I Think, Therefore I Laugh,7 as well as columns, articles, and op-eds for the Guardian (in the United Kingdom), the New York Times, and a host of other publications on issues ranging from mammogram false-positives to the mathematics of the BP oil spill. For ten years I also wrote the monthly “Who's Counting” column for ABCNews.com.8 It dealt with a variety of mathematical issues, especially mathematical aspects of stories in the news, and was a task that greatly increased my respect for columnists who must write something topical and insightful two or three times a week.

A Mathematician Reads the Newspaper was published a few years after Innumeracy, and it also fared quite well, even making the Random House list of the 100 best nonfiction books of the century. (I've resisted my natural tendency to debunk the list.) Because of the book, Innumeracy, my columns, and other writings I was asked by the dean of the Columbia School of Journalism to develop and teach a course on quantitative literacy for aspiring journalists. I did so but was disappointed, haunted even, at the mathematical naïveté of the students in the course, who made it necessary to begin with remedial arithmetic instead of the problems and oddities associated with numbers in the news. Such a course should be a requirement at Columbia and at all schools of journalism, but unfortunately it is not.

Another consequence of the newspaper book was that I was asked to write weekly op-eds and editorials for the Philadelphia Daily News. I enjoyed writing them, but the letters to the editor the paper received in response to some of them were a little dispiriting, at least at first. I felt a little like Nathanael West's anonymous male columnist Miss Lonelyhearts, although the title Miss Emptyheads would have been more apt. After a short while, however, I found the letters and misunderstandings humorous in something like the way I still find it amusing to give the name Ludwig Wittgenstein (or Baruch Spinoza or René Descartes) to snooty restaurant maître d's and later hear “table for Ludwig Wittgenstein, right this way.”

As a writer on many topics, most but not all tangentially related to mathematics, I've met a number of interesting people whom I likely would not have met otherwise had I not been lucky enough to receive the publicity and recognition I did. A minor drawback associated with my writings, however, is that I was, and to an extent still am, asked the same questions repeatedly. For a short while the question was: What is David Letterman really like? Answer: How am I supposed to know? Letterman seemed like a cool professional with a quick wit and a riveting gaze.

Perennial questions are: Is there anything new in math to be discovered? What can you do if math was your worst subject and you don't have that kind of brain? How long did it take you to write this or that book? What is the probability of this or that coincidence? (Allow me to indulge once again my professorial tendency to reiterate and note that though the probability may be quite small, it is very often the answer to the wrong question. The right question is usually: What is the probability of something of this vaguely defined general sort occurring? And that probability is almost always much larger.)

The most common pedagogical question is and was: Should students be able to use calculators, web apps, and related aids? Answer: The short one is yes. The question, however, suggests a very narrow conception of mathematics as essentially glorified computation. The truth is that computational skills aren't much more essential to mathematical understanding and modeling than typing skills are to reading and writing. As I've stressed repeatedly, mathematics is no more computation than literature is typing. I repeat: mathematics is no more computation than literature is typing. (And nobody says you're a slow typist, so you had better forget that novel you're working on.) Often what is crucial in both journalism and mathematical applications is context, a bogus etymology for which is conte, the French word for story, and x and t, the most common variable names used in mathematics.

A somewhat darker aspect of writing, at least for me, was the many vituperative letters and e-mails I received over the years because of my writings, especially because of Innumeracy, in which I made a few innocuous comments, one being that I always use my middle name in public contexts to distinguish myself from the then pope, John Paul. My book Irreligion,9 although not particularly snarky, and my columns have also elicited hate mail. Simple multiplication offers a partial explanation. Even if the percentage of very angry, deranged, or self-righteous people is tiny, if your work reaches a large enough number of people, the product of these two numbers will generate a fair amount of nasty and nutty stuff: self-proclaimed Nazis denouncing this or that, pictures of me with a bull's-eye on my forehead, schizophrenic ramblings, and much more.

A minor, less alarming downside to the publicity my writings have received is that I'm often termed a “crusader” for mathematics. I am flattered by the metaphorical use of the term, but it also makes me wince a little. Mathematics doesn't need any human crusaders; not only is it a beautiful subject and critical for the development of science and technology, but it is also an imperialist discipline that, broadly interpreted, can invade and occupy almost every other discipline, arguably including the writing of biographies. To demonstrate its all-encompassing range, I've added to whatever mathematics course I have taught over the years odd examples, seeming paradoxes, everyday applications, probability and logic puzzles, and open-ended problems. I've also made liberal use of a bit of pedagogical wisdom suggested by April Fools’ hoaxes: in class I sometimes intentionally talk BS to encourage skepticism and critical thinking. I hate students’ mindless nodding at whatever it is Herr (or in my case Hair) Professor says, no matter how nonsensical.

This approach and much of the material, usually in the form of little vignettes rather than formulas or equations, have made it into my books and columns. Still, exploring the ever-increasing salience of mathematics, expounding some mathematical notions that belong in everyone's intellectual tool kit, and pointing out the consequences of innumeracy does not make me a “math crusader” brutally converting the innumerate infidels but rather simply a mathematician who writes. I do, however, take a tiny smidgen of credit for the now larger number of very talented authors devoted to communicating mathematical ideas and applications.

Finally, the crusader characterization does suggest a complexity-theoretic point: being typecast is unavoidable. Necessarily we all are to one degree or another, and the reason is that we're only capable of handling a limited amount of complexity and nuance, and labels and stereotypes are often useful ways of summarizing data sets or people that are beyond our complexity horizons. The bottom line is that I'm that still rather odd creature, a mathematician who writes, and therefore I'm perhaps a bit like a dog who plays chess. What's remarkable is not that the dog isn't very good at the game but that he plays at all.