However large the number of physical things (remember, the number of particles in the universe is way, way below 10^100), the numbers appearing in analyses of the universe and its parts are much, much greater.
In refuting the Austrian physicist and philosopher Ludwig Boltzmann’s claim that the entropy of molecules in an enclosed space will forever increase, the mathematician Henri Poincaré showed that the molecules will eventually return arbitrarily close to their original state, thereby decreasing entropy. For a gas of just forty liters, this will happen within a brief 10^(trillion trillion) seconds. (The universe is only 10^17 seconds old.¹)
In his delightful book The Secret Lives of Numbers, mathematician George G. Szpiro reminds us of the distinction between convergent and divergent series. Convergent series approach a limit. For example: 1 + 1/2 + 1/4 + 1/8 … approaches 2.
Divergent series, on the other hand, go wandering off to infinity. However, they may do so very slowly. For example, the “harmonic series” 1 + 1/2 + 1/3 + 1/4 … needs about 272 million terms before its sum exceeds 20.
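If you would like to see just how slowly for yourself, here is a minimal Python sketch (my addition, not Szpiro’s), using the standard approximation that the nth partial sum is close to ln(n) plus the Euler–Mascheroni constant:

```python
import math

# Partial sums of the harmonic series grow like ln(n) + gamma, where
# gamma ~ 0.5772 is the Euler-Mascheroni constant. Solving
# ln(n) + gamma = 20 estimates where the sum first exceeds 20.
gamma = 0.5772156649015329
print(f"crossing near n = {math.exp(20 - gamma):,.0f}")  # ~272 million

# Spot-check the approximation on a sum small enough to add up directly.
n = 1_000_000
print(sum(1 / k for k in range(1, n + 1)), math.log(n) + gamma)
```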
You have a half-dollar piece and a silver dollar. I have a quarter. We toss all three coins. Each tail counts zero. Each head counts its value in points. The player with the greater number of points in each round wins the value of the other player’s coin(s).
Thus, if I toss a head with my quarter and you toss a head with your half-dollar and a head with your silver dollar, you win 25 cents (my quarter).
I would be a fool to play. Right?
Wrong. The results will balance out.
Each time you win, you win only 25 cents. Each time I win, I win $1.50. You will win six times as often, but I will win six times as much each time I win.
ME | YOU | RESULT |
---|---|---|
H | H H | You win 25 cents. |
H | H T | You win 25 cents. |
H | T H | You win 25 cents. |
T | T H | You win 25 cents. |
T | H H | You win 25 cents. |
T | H T | You win 25 cents. |
T | T T | No one wins. |
H | T T | I win $1.50. |
As long as the coins are adjacent on the doubling sequence (25 × 2 = 50; 50 × 2 = 100), no coin is duplicated, and one player has at least one coin and the other at least two, the game will always come out even: each player’s expected winnings are equal.
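A brute-force enumeration of the eight equally likely outcomes, sketched below in Python, confirms that the winnings balance exactly:

```python
from itertools import product

# Every equally likely outcome of the three tosses: heads score face
# value, tails score zero; the winner collects the loser's coin value(s).
my_winnings = your_winnings = 0
for mine in (0, 25):                                 # my quarter
    for half, dollar in product((0, 50), (0, 100)):  # your two coins
        yours = half + dollar
        if mine > yours:
            my_winnings += 150        # I take your 50 + 100
        elif yours > mine:
            your_winnings += 25       # you take my quarter
print(my_winnings, your_winnings)     # 150 and 150: perfectly even
```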
Consider this: Primes of the form 3n + 2 are more numerous than primes of the form 3n + 1 until you get to 608,981,813,029, at which point primes of the form 3n + 1 take the lead. The lead switches back and forth, and for infinitely many values 3n + 1 is ahead.
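A sieve small enough to run at home shows the early bias; the cutoff of one million below is arbitrary, and far below the famous crossing point:

```python
# Sieve the primes up to a million and count the two camps.
N = 1_000_000
is_prime = bytearray([1]) * (N + 1)
is_prime[0] = is_prime[1] = 0
for p in range(2, int(N**0.5) + 1):
    if is_prime[p]:
        is_prime[p * p :: p] = bytearray(len(range(p * p, N + 1, p)))
camp_1 = sum(1 for p in range(2, N + 1) if is_prime[p] and p % 3 == 1)
camp_2 = sum(1 for p in range(2, N + 1) if is_prime[p] and p % 3 == 2)
print(camp_1, camp_2)  # the 3n + 2 camp is ahead at this scale
```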
John Derbyshire, author and novelist, makes a fascinating point. Did you know you can put the Empire State Building into a 1-inch cube? You just need enough dimensions.
Longest corner-to-corner diagonal in a one-inch 2-cube (i.e., square): square root of 2
Longest corner-to-corner diagonal in a one-inch 3-cube (i.e., cube): square root of 3
Longest corner-to-corner diagonal in a one-inch 4-cube (i.e., “hypercube”): square root of 4
Longest corner-to-corner diagonal in a one-inch 5-cube: square root of 5
And so on.
So the long diagonal of a one-inch n-cube is √n inches. If n is about 200 million, the one-dimensional version of the Empire State Building will slide in nicely!
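The arithmetic behind that figure is a one-liner; I am assuming the roof height of roughly 1,250 feet (measure to the tip, about 1,454 feet, and the answer grows to roughly 300 million):

```python
# The long diagonal of a one-inch n-cube is sqrt(n) inches, so the
# building fits once sqrt(n) reaches its height.
height_in_inches = 1250 * 12     # 15,000 inches
n = height_in_inches ** 2        # smallest n with sqrt(n) >= the height
print(f"{n:,} dimensions")       # 225,000,000 -- "about 200 million"
```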
A “perfect number” is a number whose proper divisors (the integers that divide evenly into it, other than the number itself) add up to the number itself. The smallest perfect number is 6: it is evenly divisible by 1, 2, and 3, and 6 = 1 + 2 + 3. In addition, 6 = 1 × 2 × 3, making it what might be termed a “perfect, perfect number.” (I bet that it can be easily proven that 6 is the only “perfect, perfect number,” but not so easily that I can do it.) The next perfect number is 28 (28 = 1 + 2 + 4 + 7 + 14). It is, obviously, not a “perfect, perfect number,” as 1 × 2 × 4 × 7 × 14 = 784, not 28.
There are only about forty known perfect numbers. The largest known has over 130,000 digits. All are even numbers. Is there an odd perfect number? No one has discovered one or proven that there could not be one.
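A naive scan, sketched below (the helper name is mine), turns up the first four perfect numbers almost instantly:

```python
def divisor_sum(n):
    """Sum of the proper divisors of n (every divisor except n itself)."""
    total, d = 1, 2
    while d * d <= n:
        if n % d == 0:
            total += d + (n // d if n // d != d else 0)
        d += 1
    return total

# A perfect number equals the sum of its proper divisors.
print([n for n in range(2, 10_000) if divisor_sum(n) == n])  # [6, 28, 496, 8128]
```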
Incidentally, a “sublime” number is one for which both the number of its divisors (those that leave no remainder) and the sum of those divisors are perfect. There are only two known sublime numbers. One is 12: its six divisors (1, 2, 3, 4, 6, and 12) sum to 28, and 6 and 28 are the first two perfect numbers. The other is 698,655,567,023,837,898,670,371,734,243,169,822,657,830,773,351,885,970,528,324,860,512,791,691,264.
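Here is a quick, deliberately naive check that 12 passes both tests:

```python
def divisor_sum(n):
    """Deliberately naive sum of the proper divisors of n."""
    return sum(d for d in range(1, n) if n % d == 0)

divisors_of_12 = [d for d in range(1, 13) if 12 % d == 0]  # [1, 2, 3, 4, 6, 12]
count, total = len(divisors_of_12), sum(divisors_of_12)    # 6 and 28
# Both must be perfect, i.e., equal to the sum of their proper divisors.
print(divisor_sum(count) == count, divisor_sum(total) == total)  # True True
```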
As Euler proved, the only solution in distinct positive integers for a^b = b^a is 2^4 = 4^2 (both equal 16).
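A brute-force search over a small range (the bound of 100 is arbitrary) finds no other pair:

```python
# Look for a^b == b^a with a < b (a == b works trivially).
print([(a, b) for a in range(1, 100)
              for b in range(a + 1, 100) if a**b == b**a])  # [(2, 4)]
```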
Most people know that the probability of tossing a coin and having it come up heads is 1 in 2 (0.50); of rolling a four with a six-sided die is 1 in 6 (0.166+); and of landing on an eight on a 1–10 wheel is 1 in 10 (0.1). However, when you ask these people the probability of rolling at least one four in six rolls or landing at least once on eight in ten spins, they will usually answer 1 in 2 (or 50 percent or fifty-fifty or “even money”—these are all the same thing).
As you can see below, when there are four or more possibilities, the probability of attaining the 1-in-n event at least once in n attempts is always roughly 2 in 3, falling from about 68 percent toward a floor just above 63 percent. In other words, the probability of rolling at least one four in six rolls of the die (a one-in-six chance), or of having the spinner land on “8” in ten spins (a one-in-ten chance), is a bit below two-thirds. And even if the wheel has a million numbers and you win only if the spinner lands on number 562,194, if you spin the spinner a million times, there is about a 63 percent chance that you will land on 562,194 at least once.
If there are only two or three possibilities, then the likelihood is somewhat higher. Take a coin as an example. The likelihood of tossing a head is, of course, one in two. If you toss the coin twice, the possibilities are
head-head
head-tail
tail-head
tail-tail
As you can see, the chance of tossing at least one head in two tosses is 3 in 4 (0.75); only tail-tail fails to get a head. Once we get to a 1-in-6 case, like the die, we fall below the 2-in-3 likelihood and soon approach the 63+ percent likelihood, but never go below it.
The table below shows the probability of at least one occurrence of a 1-in-n event in n attempts: for example, selecting the Jack of Hearts at least once in fifty-two attempts, with the deck being reshuffled each time.
ODDS | PROBABILITY |
---|---|
1 in 2 (coin) | 0.75 |
1 in 3 | 0.70+ |
1 in 4 | 0.68+ |
1 in 5 | 0.67+ |
1 in 6 (die) | 0.66+ |
1 in 10 (1–10 wheel) | 0.65+ |
1 in 12 | 0.65− |
… | … |
1 in n | 0.63+ |
With every increase in n, the probability decreases, but it decreases by a smaller amount each time. The limit is 1 − 1/e ≈ 0.632, which is greater than 0.63 but less than 0.64. (The e is our old friend from compound interest.) Thus, if you try an event a million times that has a one-in-a-million probability, the probability of hitting the event at least once is greater than 0.63 and less than 0.64. No matter how large n is, there will always be a better than 0.63 probability of hitting a 1-in-n event in n tries.
You will have a fifty-fifty chance of hitting the 1-in-n event in just about 0.7n tries (more precisely, n × ln 2 ≈ 0.693n). Thus, if a wheel is numbered 1–100 and you want the spinner to land on number 64, you have about a fifty-fifty chance of landing on number 64 at least once if you spin the wheel seventy times and a 0.63+ chance if you spin it a hundred times.
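All of these figures come from the single formula 1 − (1 − 1/n)^k, where k is the number of tries; a short sketch (the function name is mine) reproduces the table, the limit, and the 0.7n rule of thumb:

```python
import math

def at_least_one(n, k):
    """Probability of at least one hit of a 1-in-n event in k tries."""
    return 1 - (1 - 1 / n) ** k

for n in (2, 3, 4, 5, 6, 10, 12, 52, 1_000_000):
    print(f"1 in {n}: {at_least_one(n, n):.4f}")   # matches the table above
print("limit:", 1 - 1 / math.e)                    # 0.6321...

# The fifty-fifty rule of thumb: about n * ln(2) ~ 0.7n tries.
print(at_least_one(100, 70))                       # ~0.505 for the 1-100 wheel
```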
This is perhaps the personal anecdote most often heard in informal mathematical circles. I have to admit that it never struck me as being as spellbinding or hilarious as it strikes others, but here it is for the sake of completeness.
When the great mathematician Srinivasa Ramanujan was on his deathbed, he was visited by the number theorist G. H. Hardy. Hardy told Ramanujan that he had hoped the number of his taxi would be an interesting one that would cheer Ramanujan up. Alas, he had to report that the number was an uninteresting 1,729.
“Oh, no,” cried Ramanujan, as he leaped from his deathbed, “that is a very interesting number; it is the smallest number that is expressible as the sum of two cubes in two different ways” (1^3 + 12^3 = 1 + 1,728 = 1,729 and 9^3 + 10^3 = 729 + 1,000 = 1,729).
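A brute-force hunt confirms the claim; the search bound of 50 is arbitrary but safe, since any number below 1,729 would need cube roots of 12 or less:

```python
from collections import defaultdict
from itertools import combinations_with_replacement

# Collect every sum of two positive cubes with parts below 50, then find
# the smallest total that arises in two different ways.
sums = defaultdict(list)
for a, b in combinations_with_replacement(range(1, 50), 2):
    sums[a**3 + b**3].append((a, b))
taxicab = min(t for t, pairs in sums.items() if len(pairs) >= 2)
print(taxicab, sums[taxicab])  # 1729 [(1, 12), (9, 10)]
```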
Ramanujan was a scientist of numbers. His fascination with their nature led him to care more about discoveries about numbers than proof that a discovery was, in fact, a discovery of a “true fact” of numbers. His astonishing intuition and intelligence did indeed lead to many discoveries of the highest order.
The largest number I have ever seen in a discussion of the physical world is 10^70,000,000,000,000, which John Barrow (The Constants of Nature: The Numbers That Encode the Deepest Secrets of the Universe) estimates as the number of possible wiring connections between the neurons of the brain. This is, of course, the number of possibilities, not a number referring to any actual physical reality, and it dwarfs the number of possible combinations of DNA (10^3,480,000,000).
Consider this: You are given a medical test for a dreaded disease—say, Floogle’s Sliding Nose Disease—and you get a positive result. This test is 98 percent accurate for both positives and negatives. Do you kiss your nose “good-bye”?
Do not do it. You are more likely to lose your nose doing that than from Floogle’s Disease. Here’s why.
Say 10,000 people take the test for Floogle’s, and 50 of them actually have the disease. Of the 50 who have Floogle’s, 49 will correctly get positive results (i.e., 98 percent of 50 = 49), and 1 will incorrectly get a negative result (i.e., 2 percent of 50 = 1).
Of the 9,950 who do not have Floogle’s, 9,751 will correctly get negative results (i.e., 98 percent of 9,950 = 9,751), and 199 will incorrectly get positive results, false positives (i.e., 2 percent of 9,950 = 199). (These numbers have been rounded off very slightly.)
Thus, of the 248 people who get positive test results (49 + 199), only 49 actually have Floogle’s disease. About 80 percent of those people who got positive results do not have Floogle’s disease.
This does not mean that the test is of little use. With the assumptions we have made (i.e., that the test is 98 percent accurate, for both positives and negatives, and 50 of 10,000 have Floogle’s), a positive result tells us little, but a negative result makes it very, very likely that one does not have the disease.
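Written out as a few lines of Python, with the same assumed base rate and accuracy, the whole calculation is:

```python
population = 10_000
sick = 50
accuracy = 0.98   # for both positive and negative results

true_positives = sick * accuracy                        # 49
false_positives = (population - sick) * (1 - accuracy)  # 199
positive_means_sick = true_positives / (true_positives + false_positives)
print(round(positive_means_sick, 3))  # 0.198 -- about one in five
```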
Bertrand Russell, the philosopher, historian, mathematician, and logician, created a logical paradox that struck at the heart of all logic.
In the town of Whatsgoingonhere (not the name that Russell used), the barber shaves all those and only those who do not shave themselves. Who shaves the barber?
If the barber shaves himself, he breaks the rule that he shaves only those who do not shave themselves.
If the barber does not shave himself, he must, according to the rules, shave himself.
Where do integers come from? Well, there are a lot of different answers to this question, but here is an especially elegant one.
A number is a set. For example, 4 is the set containing four things. Whether these things are chairs, beliefs, planets, or left-handed bowlers under three feet tall is irrelevant. It is the “fourness” that is the set that is the number 4.
Now, an empty set is merely a set with nothing in it—no chairs, no beliefs, no planets, no left-handed bowlers under three feet tall. Call this set “zero” and represent it as { }.
Where sets of 4 differ in their being four chairs or four beliefs or four planets or four left-handed bowlers, there is only one empty set; whether it is empty of chairs or beliefs or planets or left-handed bowlers is irrelevant. It is empty.
Now consider a set that contains the empty set. Represent it as {{ }}. Call this set “1.”
The set that contains the set that contains the empty set—{{{ }}}—we will call “2.”
And this goes on, forever. Out of nothing (the empty set), we (well, actually, the great mathematician Ernst Zermelo, whose construction this is; John von Neumann’s better-known variant takes each number to be the set of all smaller numbers) create all the integers.
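Here is a sketch of the construction, with Python’s frozenset standing in for a mathematical set (the function names are mine):

```python
# Zero is the empty set; each successor is the set containing its predecessor.
zero = frozenset()

def successor(n):
    return frozenset([n])

one = successor(zero)    # {{}}
two = successor(one)     # {{{}}}

def as_int(s):
    """Recover the ordinary integer by counting nesting depth."""
    return 0 if not s else 1 + as_int(next(iter(s)))

print(as_int(zero), as_int(one), as_int(two), as_int(successor(two)))  # 0 1 2 3
```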
You have two coins, and you are going to toss each once. What is the probability that you will toss matching coins (i.e., two heads or two tails)?
It is simple. Note that there are four equally probable possibilities:
head-head
head-tail
tail-head
tail-tail
Thus, there is a 50 percent chance that you will throw matching coins.
The following is not so simple; it is simply tricky. You have four loose socks in a bag, two blue and two red. You pull out two socks. What are the chances that you will pull out two matching socks?
The problem seems identical to the coin toss. There are four possibilities:
blue-blue
blue-red
red-blue
red-red
Two of the four possibilities give two socks of the same color. So this is the same problem as that of the coins and has the same 50 percent probability, right? Well, actually, no.
In the coin problem, the events were independent (i.e., unconnected). Tossing, say, a head on the first toss does not affect the probability of the second toss coming up heads or coming up tails. The possibilities for the second toss remain at one head or one tail.
However, in the sock problem, your selecting a blue sock changes the possibilities for the second pull. It is no longer two blues and two reds, but one blue and two reds, because you have used up one of the blues. Whatever the first sock, only one of the three remaining socks matches it, so the chance of a match is 1 in 3, not 1 in 2. In the coin problem, tossing a head did not use up a head.
Incidentally, the coin problem is analogous to roulette, in which each spin of the wheel is independent (so that there can be no system that wins at roulette). The sock problem is analogous to blackjack, in which cards are used up, making a system possible, but very, very difficult.
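An exact enumeration, labeling the socks so that every ordered draw is equally likely, gives the 1-in-3 answer directly (the labels are mine):

```python
from itertools import permutations

# Label the four socks so every ordered draw of two is equally likely.
socks = ["blue1", "blue2", "red1", "red2"]
draws = list(permutations(socks, 2))                  # 12 ordered pairs
matches = [d for d in draws if d[0][:-1] == d[1][:-1]]
print(len(matches), "of", len(draws))                 # 4 of 12 -> 1 in 3

# Coins, being independent, behave like drawing WITH replacement.
coin_draws = [(a, b) for a in "HT" for b in "HT"]     # 4 outcomes
print(sum(a == b for a, b in coin_draws), "of", len(coin_draws))  # 2 of 4
```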
Random numbers play a crucial role in science. Like “time,” “cause,” and other basic conceptions of the constituents of reality, the meaning of “random” seems too obvious to require thought—until we think about it.
When we think of random numbers, we think of numbers that possess no pattern, sequences for which we can do no better than guessing when we predict the next number in the sequence. So, for example, it must be true of such a number that each digit tends to appear about one-tenth of the time (in no predictable order) and each pair of digits about one-hundredth of the time (if this is not the case, then there is a patterned bias built in). A number with this quality is called “normal.” (Note that no rational fraction possesses this quality, because such fractions always repeat: 1/3 is 0.3333 …, and 1/97 seems random for 96 digits and then keeps repeating those 96 digits. Also, the constraints we have mentioned must take into account the number of digits in the string; a string of ten digits is clearly not likely to be random if it is 2222222222. However, in a string of a quadrillion digits, 2222222222 will appear many times.)
However, there is another quality that we tend to automatically associate with randomness: not merely is there no pattern, but there is no easy algorithm. Thus, while pi has no pattern (at least that we can discern in pi’s first billion digits), the sequence is generated by an easy rule (an algorithm for pi), and this enables us to predict every digit.
The two forms of “randomness” are in conflict. As Martin Gardner suggests, write down ten one-digit numbers. If you wrote down any number more than once, the sequence fails to meet the first criterion (i.e., it is biased toward the number you wrote down more than once). However, if you write each number just once, you satisfy the first criterion, but fail to meet the second (because, given nine digits, one can predict the tenth). This sort of conflict is inherent in all attempts at randomness, and there is no such thing as a sequence that meets both criteria.
There is, in other words, a limit precluding our attaining that which we tend to assume is true of what we mean by “random.”
Say you have a bunch of packages of various sizes and shapes, and you have a number of identical bins to put them in. Surprisingly, no known efficient algorithm guarantees that you attain the most efficient packing (i.e., the minimum number of bins). This is known as an NP-complete problem, and, as in the case of the traveling salesman problem (see page 2), it is probable that no efficient algorithm exists (the number of packages grows arithmetically, but the number of possible packings grows exponentially). To get the ideal packing, you pretty much have to try all the possibilities, and if there are more than a few bins, this would take all the computers in the world a billion, a trillion, or more years.
However, if you simply put each package into the first bin that has room for it (paying no attention to package size), opening a new bin only when none does, you will need only about 70 percent more bins than the ideal. If you pack in order of decreasing size (put the largest remaining package in a bin, then the next largest that will fit, and so on), you will come within about 20 percent of the minimum number of bins, as the sketch below shows.
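Here is a minimal sketch of the second heuristic, first-fit decreasing, with made-up package sizes and unit-capacity bins:

```python
def first_fit_decreasing(sizes, capacity):
    """Largest package first; each goes into the first bin with room,
    and a new bin opens only when none has room."""
    bins = []
    for size in sorted(sizes, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])
    return bins

packages = [0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1, 0.6]
print(first_fit_decreasing(packages, capacity=1.0))
# [[0.7, 0.2, 0.1], [0.6, 0.4], [0.5, 0.5], [0.5, 0.2]] -- four bins,
# which happens to be the minimum here (the sizes total 3.7).
```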
1. Paul Davies, About Time: Einstein’s Unfinished Revolution.