Chaos, Catastrophe and Randomness
CHAOS THEORY WAS DISCOVERED IN THE 1970S AND 80S, and became something of a household term. But the word was used long before that, as a 1938 article makes clear. The article, perhaps a prime example of the dismal state of science writing early in the century, used the word “chaos,” although not in its current sense. “A new mathematical definition of chaos which brings ‘utter confusion’ for the first time under the control of man,” was the way it was described. And the researcher who described a mathematical analysis of chaos? None other than Norbert Wiener, a famous child prodigy who became a professor at the Massachusetts Institute of Technology and founder of the field he named cybernetics, which he defined as the science of control and communication. His paper, though, despite the confusing Times article, was not particularly notable or influential.
But toward the end of the century, in part because of a series of articles in the Times by James Gleick, and his best-selling book, Chaos, the notion of chaos left the rarefied realm of mathematics and captured the popular imagination. It was sometimes even confused with another buzzword of the time, catastrophe theory. Can you predict disasters, experts asked, if they occur only when a sequence of random events happens to arise? While chaos theory still lives, catastrophe theory fizzled and died, though related questions did not. What is a random event? What are coincidences anyway and how should people interpret them?
Along the way, researchers also started to ask about data mining, the attempt to sift through data and find meaningful patterns. The problem is that there are always patterns, just by chance, in large data sets. So how do you know which are real and which are random noise?
It’s an issue that arises over and over again. A cluster of cancers emerges in a town. Does it signal the presence of a potent carcinogen? Or is it a predictable statistical quirk? How would you know?
A sort of mirror image of the search for patterns is the quest for true randomness, streams of numbers that actually have no pattern. Malcolm W. Browne, in an article, explained why that is important: “Mathematical ‘models’ designed to predict stock prices, atmospheric warming, airplane skin friction, chemical reactions, epidemics, population growth, the outcome of battles, the location of oil deposits and hundreds of other complex matters” rely on a statistical method that requires—absolutely depends on—a string of truly random numbers. The challenge is to find a way of generating those numbers and of showing that they really are random. Not so easy, as Browne explains, and some computer programs that were used to generate strings of random numbers actually were generating strings that had patterns; they were not random.
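The flawed generators Browne describes can be made concrete. The sketch below (an illustration, not from the article) reproduces RANDU, a linear congruential generator once in wide use, whose badly chosen multiplier means every triple of consecutive outputs satisfies an exact linear relation. Numbers with so rigid a hidden pattern are useless for the statistical methods Browne mentions.

```python
# RANDU, a historically real generator with a famous flaw: because its
# multiplier is 65539 = 2**16 + 3, consecutive outputs obey an exact
# linear relation, so the stream is far from random.
def randu(seed, n):
    xs, x = [], seed
    for _ in range(n):
        x = (65539 * x) % 2**31
        xs.append(x)
    return xs

xs = randu(1, 1000)
# Every consecutive triple obeys x[k+2] = 6*x[k+1] - 9*x[k] (mod 2**31):
flaws = sum((6 * xs[k + 1] - 9 * xs[k] - xs[k + 2]) % 2**31 == 0
            for k in range(len(xs) - 2))
print(flaws)  # all 998 triples satisfy the relation
```

Geometrically, the same relation forces RANDU's points, plotted in three dimensions, onto just fifteen planes, which is how the defect was eventually noticed.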
The answers to all these questions about randomness and chance can be surprising, revealing, and can make you see the world in a different way. At the very least, they may encourage a bit of skepticism. And they may encourage respect for the enormous difficulties that beset scientists venturing into this uncertain territory.
Chaos Is Defined by New Calculus
A new mathematical definition of chaos, which brings “utter confusion” for the first time under the control of man, was reported to the Fourth International Congress for Applied Mechanics here today.
The definition is a new form of calculus. It enables scientists to predict what will happen in states of complete confusion. Practical uses are many; examples are the air turbulence that hampers airplane flights and the flow of liquids in pipes.
This calculus was reported by Dr. Norbert Wiener of the Massachusetts Institute of Technology. The mathematics first card indexes sample kinds of chaos. An engineer can select the sample which most nearly resembles his problem.
The steps of a drunken person illustrate the samples. The wabbly walk was one of the problems in chaos which first interested Dr. Wiener. Each step has no relation to previous steps. But calculus can show how far the “drunk” is likely to go in a given time.
Problem in Planets’ Motions
“Perturbations” of the planets, their slight, irregular wanderings from their orbits, are a practical example of chaotic movement. Astronomers have been figuring them for centuries, and an explanation by Dr. Albert Einstein of one unexpected motion helped build his fame.
The new definition, Dr. Wiener explained, is entirely mathematical. It defines “pure” chaos. Practical examples of the “pure” state are difficult, he said, but closest to 100 percent chaos are the noises you hear in a vacuum tube, like the radio. Electrons make them, by striking a target. But no single electron makes a sound. The racket comes from “mass,” that is, when a lot of electrons hit simultaneously.
There are three steps in the new calculus, said Dr. Wiener. The first was the “ergodic” (from the Greek word “ergon,” meaning work) theorem of Willard Gibbs, American scientist of the last century. It declared that a chaotic system would run through all possible phases of confusion.
Scientists, said Dr. Wiener, found that the systems did not run the entire gamut.
The next step was by Dr. G. D. Birkhoff of Harvard, with new mathematics by which the time when chaotic events would happen could be averaged.
Adds Factor of Space
Dr. Wiener’s work adds “space” to chaos, so as to tell not only the time but the place where a change is likely.
In airplane flights one of the “chaos” troubles is to figure when and where the smooth flow of air over the top of a streamlined wing will break up into eddies, and what kind of eddies. The eddies interfere with the plane’s lifting power.
Engineers by experiment measure some of these eddies. But it is impossible to measure them all. Designers have to guess on the basis of their sample eddies. The new mathematics is designed to take the guesswork out of these experiments by showing whether the samples are well chosen.
Dr. Norbert Wiener, professor of mathematics at M.I.T. since 1932, received his bachelor’s degree from Tufts College in 1909 at the age of 14 and won his Ph.D. at Harvard when he was 18.
September 14, 1938
Experts Debate the Prediction of Disasters
An increasingly bitter debate is spreading among experts in many fields of science over “catastrophe theory”—a mathematical concept intended to explain and predict events as disparate as chemical reactions, dog bites, stock market crashes and the outbreak of war.
General interest in the principle has reached the point that a lecture on the subject at the University of California attracted 1,000 people, and the supply of papers on catastrophe theory is exhausted almost as soon as they are published. Catastrophe theory is a concept that says the probability that sudden events will occur can be accurately forecast by plotting developments mathematically.
Called Mathematically Unsound
Its proponents say that catastrophe theory promises to predict all kinds of sudden events, even from very small amounts of data, and even if some of the data are wrong.
Critics of catastrophe theory contend that it is mathematically unsound, that it has little or nothing to do with the real world and that predictions drawn from it are so vague and general as to be worthless.
There are eminent mathematicians on both sides of the argument, although the tide seems to have turned in the last year in favor of the critics.
The dispute among mathematicians has spread to many fields—biology, social and political science, psychology and other “soft sciences”—where “catastrophists” see potential applications of the principle.
Proponents of catastrophe theory say that it represents the greatest revolution in mathematical thought since the discovery of calculus three centuries ago. Catastrophe theory is based on the idea that pictorial representations can symbolize real events.
One of the pictures is the simple line graph, drawn as a two-dimensional figure, which shows how something changes between one state or point and another.
More complex graphs can be drawn on three-dimensional surfaces—spheres, cylinders and paraboloids, for example. Multi-dimensional graphs contain much more information than two-dimensional graphs, and can represent more subtle changes and relationships.
Topology, the mathematical study of the shapes of surfaces, is the science from which catastrophe theory developed in the mid-1960s, largely because of the work of René Thom, the French mathematician.
Dr. Thom’s particular interest was in a kind of surface that folds on itself, changing abruptly from a smooth form to a discontinuous break of some kind. He postulated that there are seven basic forms of such a curve, and he called the discontinuous break characteristic of all of them a “catastrophe.”
Dr. Thom and his followers, especially E. Christopher Zeeman, the English mathematician, proposed that such surfaces, mathematical abstractions in themselves, might serve as suitable models for describing real events, and they set out to prove the idea.
Dr. Zeeman suggested that many real events—a sudden change in a chemical reaction, the sudden buckling of a steel girder, the sudden decision of a dog to attack, the sudden differentiation of growing cells into an embryo—could be represented as “catastrophes” on suitable mathematical surfaces.
Prison Officials Interested
He even asserted that if the right surface were chosen to symbolize the variable factors affecting an event, only a small amount of information would be needed to predict the shape of an entire surface, and from this the event it represented could be predicted.
Among the groups that sought applications in Dr. Zeeman’s work was the British prison system. Prison authorities helped Dr. Zeeman in a study he made of the events that had led to a 1972 riot at Gartree Prison, which, he said, could have been plotted on a “catastrophe” surface.
By refining the technique it should be possible, he reasoned, to identify the times and circumstances under which a riot would be likely and to take preventative measures.
Dr. Zeeman published other papers contending that catastrophe surfaces may represent heartbeats, the onset of certain kinds of mental illness and social unrest. For several years, the scientific community appeared to accept these ideas without comment.
But lately, the work of Dr. Zeeman and others has come under increasingly heavy challenge.
A volley of scientific papers by Hector J. Sussman and Raphael S. Zahler, associate professors of mathematics at Rutgers University, was the first concerted attack on the ideas of catastrophe theory. Science, a respected American journal, soon followed this with a strong criticism entitled “Catastrophe Theory: The Emperor Has No Clothes.”
Dr. Sussman and others accused the catastrophists of trying to intimidate non-mathematicians (and even mathematicians unfamiliar with their work) with such phrases as “deep mathematics.”
Catastrophe theory, Dr. Sussman wrote, “is tantamount to claiming that the world can be deduced from pure thought—a claim few scientists would accept.”
He added: “Catastrophe theory offers to mathematicians the hope of applying mathematics without having to know anything but mathematics. An appealing dream for mathematicians, but a dream that cannot come true.”
Dr. Zeeman has not replied in print to the growing criticism. In an interview with The New York Times, he said that his published papers spoke for themselves and that Dr. Sussman’s critiques contained “many misquotations, misrepresentations and mathematical mistakes.”
Responding to the charge that catastrophe theory is too vague and general to be useful, he asked, “What would you say of numbers, one, two, three, and so forth? They are also general and vague, but would you contend they are not useful?”
“Close Friend for Years”
He said that he was not attempting to prove the utility of catastrophe theory in any field of research, but merely to offer it to others as a tool that they might find useful. In three or four years, he said, social scientists may begin making predictions based on the theory.
But many mathematicians reject even this possibility. Among them is Dr. Stephen Smale, a professor at the University of California and winner of the Fields Medal, the highest recognition that a mathematician can receive. (There is no Nobel Prize in mathematics.)
“I’ve been a close friend of Chris Zeeman for years,” Dr. Smale said in an interview, “and he awarded me an honorary degree at the University of Warwick in Coventry, where he is director of mathematical research.
“But in a sense I reject catastrophe theory completely. It is more of a philosophy than mathematics, and even as a philosophy it doesn’t explain the real world. Zeeman’s assertions constitute a heavy-handed argument he just cannot justify,” Dr. Smale concluded.
Despite such criticism, the number of Dr. Zeeman’s supporters appears to be growing in some fields. Dr. Robert T. Holt, professor of political science at the University of Minnesota, is one of them.
“In the social sciences it is terribly difficult to obtain and use exact data of the kind produced by the natural sciences,” he said in an interview, “and the catastrophe theory work of Thom and Zeeman offers us a way to get around this.
“For example,” he said, “both the beginning and end of the First World War came as big surprises to everyone. Gradual processes suddenly and unexpectedly resulted in the outbreak of the war and the ending of the war. We believe these can be plotted on catastrophe curves of the types proposed by Thom and Zeeman.
“We believe, also, that it would be possible to identify times at which a war would be most probable or least probable using the theory.
“This gives us a mathematical tool which is qualitative as well as quantitative, and I think you have to appreciate the spirit of the theory. Zeeman’s papers are clues, not proofs, and we should use them in that spirit.”
November 19, 1977
Solving the Mathematical Riddle of Chaos
A premature summery Sunday takes Cornell University by surprise, and students head off to sun themselves at Treman State Park. The waterfall trail is closed, supposedly because of hazardous winter conditions. But not everyone is playing by the rules. High above the falls, a man stands at streamside, just where the smooth flow of water begins to speed and shudder. He is sweating slightly in sports coat and corduroys and puffing on a cigarette. Suddenly, in what might be a demented high-speed parody of a tennis spectator, he starts turning his head from side to side, over and over again.
His companions have walked ahead toward the quieter pools upstream, but Mitchell Feigenbaum is totally absorbed. “You can focus on something, a bit of foam or something,” he says. “If you move your head fast enough, you can all of a sudden discern the whole structure of the surface, and you can feel it in your stomach.” He takes another pull on his cigarette. “But for anyone with a mathematical background, if you look at this stuff, or you see clouds with all their puffs on top of puffs, or you stand at a sea wall in a storm, you know that you really don’t know anything.”
Here, where the water flashes over the rocks in indistinguishable eddies and cascades, chaos begins, and traditionally that is where science stops. For as long as the world has had physicists inquiring into the laws of nature, it has had a sense of deep ignorance about chaos—disorder, turbulence, in water, in the atmosphere, in the erratic fluctuations of wildlife populations, in the fibrillation of the human heart. The mathematics has simply not existed.
But Feigenbaum, a 39-year-old physicist at Cornell, has become a midwife for a new scientific discipline that is exploring turbulence and disorder of a kind that a decade ago seemed impenetrable. It is built on the discovery that of all the possible paths to disorder, nature favors just a few. It has its own technique of using computers and its own special language. Its practitioners—physicists, mathematicians, chemists, biologists—number no more than a few hundred scientists in the United States and Europe and a few dozen in the Soviet Union and Japan. At the moment, this new discipline doesn’t even have a name, just a nickname: chaos.
“Chaos is asking very, very hard questions,” says Joseph Ford, Regents Professor at the Georgia Institute of Technology, who organized an influential early conference on the subject in 1977. “It offers the possibility that the answers are going to severely modify our view of the universe. There is a notion that we are beginning to get the microscopic details of how the universe may work.”
The growing numbers of researchers in the field hope that chaos will ultimately suggest ways of predicting weather and earthquakes, designing optical computers and jet engines, explaining economic trends and the physiology of the heart. Feigenbaum and his colleagues try to be cautious about predicting real-world applications, and in their cooler moments they emphasize that true turbulence is in some ways as murky as ever. The whole business has a trendy appeal, perhaps trendier than some physicists consider justified. The very notion of order amid chaos, after all, has long been a cliche, applied to all kinds of scientific endeavor.
Academic caution has not discouraged stock-market analysts from showing up expectantly at chaos conferences. Nor, for that matter, has it discouraged program managers in the Departments of Energy and Defense from financing a variety of theoretical projects in the new discipline. Los Alamos National Laboratory has created a Center for Nonlinear Studies to coordinate work on chaos and related problems, and similar institutions are being set up on several university campuses.
“The notion that these phenomena really are very common has led to a whole new birth of investigation of varieties of ways in which systems behave chaotically,” says Paul C. Martin, dean of Harvard University’s Division of Applied Sciences.
Feigenbaum, a product of Brooklyn and Los Alamos who had trouble getting his early discoveries published, made his first key breakthrough in 1976 and 1977, and in the last few years the word has been spreading in a remarkable round of conferences and symposiums and workshops. This year’s Nobel Symposium, beginning tomorrow in Goteborg, Sweden, is devoted to chaos, with an opening address by Feigenbaum. Even so, awareness is just now penetrating the nontechnical segments of the academic world. “The folk in the academic community are beginning to recognize that there’s something called chaos,” Ford says. “They’re not sure if they like it or if they’d prefer it would go away, but they know something’s afoot.”
Chaos seems to be everywhere. It’s in a rising column of cigarette smoke that suddenly breaks into wild swirls. It’s in a dripping faucet that goes from a steady pattern to a random one. “It’s in the behavior of the weather, the behavior of an airplane in flight, the behavior of oil flowing in underground pipes,” says Kenneth G. Wilson, professor of physics at Cornell. They behave in strikingly similar ways. The actual stuff that is disorderly seems to matter less than the disorder itself.
Wilson’s Nobel Prize–winning work in physics in the 1960s provided a springboard for Feigenbaum, who made his own discoveries at Los Alamos National Laboratory. Using a calculator and then a series of desktop computers, Feigenbaum found regularities in the behavior of some simple mathematical equations when they were worked over and over again, with the results being fed back into the equations as input. The equations could be manipulated to produce, first, orderly strings of numbers—series with recognizable logic—and then disorderly ones. As Feigenbaum watched the numbers turn from orderly to disorderly, he began to see patterns in the transition, and the patterns were pretty, the way such things are pretty to a mathematician with an eye for unexpected simplicity.
At first, Feigenbaum and his colleagues assumed that the patterns had something to do with the particular equations he was examining. But to their enduring surprise, completely different equations seemed to produce the same patterns in the transition to disorder. A mathematical description of these patterns always seemed to involve certain particular numbers, now known informally as Feigenbaum numbers. The patterns seemed to have universality, and universality was magic.
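One of those universal numbers shows up in the spacing of the transitions themselves. The sketch below uses standard published values (not computed here, and not from the article) for the parameters at which the logistic map's cycle doubles; the ratio of successive intervals converges to the Feigenbaum constant δ ≈ 4.6692, regardless of which equation produced them.

```python
# Successive period-doubling thresholds for the logistic map; these are
# standard textbook values (periods 2, 4, 8, 16, 32 appear at these r's).
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Ratio of each bifurcation interval to the next; the ratios approach
# the universal Feigenbaum constant delta = 4.669201...
ratios = [(r[i + 1] - r[i]) / (r[i + 2] - r[i + 1]) for i in range(len(r) - 2)]
print(ratios)
```

The same limiting ratio appears in completely different systems, which is what made the universality so striking.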
Even then, Feigenbaum’s work might have made no more than good dinner-table conversation, without lasting significance. Indeed, the scientific journals wouldn’t touch the articles he offered them. But the state of things in a half-dozen different fields made the time right for his discovery. The implication was that if these regularities were universal, they would appear in real systems governed by mathematical equations far more complicated than the ones Feigenbaum had explored—equations for the motion of fluids or the change in animal populations. And sure enough, Feigenbaum’s computer patterns turned up in 1978 experiments with fluids in France. The news traveled fast.
“You get these subjects that become popular when one person makes a breakthrough that makes it possible to do a lot of research in an area,” Wilson said. “Here the breakthrough was Feigenbaum’s work, finding a way to study the borderline between organized behavior and chaotic behavior.”
A few of Feigenbaum’s colleagues now believe they are seeing the beginnings of a true course change for physics. The field has been dominated for two generations by the glittering abstractions of high-energy particles and quantum mechanics. The achievements have changed the 20th-century landscape, beginning with the atom bomb and nuclear power. But to some—especially younger physicists—progress year to year is beginning to seem slow, the naming of new particles is beginning to seem empty, the body of theory is beginning to seem cluttered. There has long been a feeling, not always expressed openly, that theoretical physics has strayed far from human intuition about the world.
Whether this will prove to be fruitful heresy or just plain heresy, no one knows. But some of those who think physics may be working its way into a corner are looking to chaos as a way out. They like the feel of it. For one thing, it has a crisp mathematical quality that recalls Newton’s revolution in physics. For another, it opens a window on the great gulf between knowledge of what one thing does—one water molecule, one bit of heart tissue, one neuron—and what millions of them can do.
Watch two bits of foam flowing side by side at the bottom of a waterfall. What can you guess about how close they were at the top? Nothing. As far as mathematics is concerned, God might just as well have taken all those water molecules under the table and shuffled them personally.
Physicists always assumed that when they saw such a random relationship between what goes into a system and what comes out, the randomness had to be part of the system, in the form of noise or error. In a way, the modern study of chaos began with a creeping realization in the 1960s that quite simple mathematical equations produced results every bit as violent as a waterfall. Tiny differences in input could quickly become overwhelming differences in output—a phenomenon given the name “sensitive dependence on initial conditions.” In weather, for example, this translates into what is only half-jokingly known as the Butterfly Effect—the notion that a butterfly flapping its wings today in Peking might affect the weather next month in New York. It is not a notion designed to give comfort to long-range forecasters.
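Sensitive dependence is easy to see in a simple iterated equation. The article names no formula, so the sketch below uses a standard textbook illustration, the logistic map x → r·x·(1−x) in its chaotic regime: two starting points differing by one part in ten billion soon bear no resemblance to each other.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map in its chaotic regime (r = 4). The choice of map and
# starting values is illustrative, not from the article.
def iterate(x, r=4.0, steps=50):
    traj = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        traj.append(x)
    return traj

a = iterate(0.3)
b = iterate(0.3 + 1e-10)  # perturb the start by one part in ten billion
gap = [abs(u - v) for u, v in zip(a, b)]
print(gap[0], gap[-1])  # the microscopic initial difference grows enormously
```

Roughly speaking, the gap doubles with each iteration, so after a few dozen steps the two trajectories are as different as two unrelated random sequences: the Butterfly Effect in miniature.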
Understanding of the Butterfly Effect began with Edward N. Lorenz of the Massachusetts Institute of Technology, who worked out many of the unimagined properties of these systems. But his work was published mainly in meteorology journals, and few physicists appreciated the consequences of it even five years ago. For scientists who had not actually played with the equations themselves, the notion could be deeply implausible.
“You’d get up and give a talk about sensitive dependence on initial conditions in front of a physics department,” recalls J. Doyne Farmer, now a Los Alamos physicist, “and people would say, ‘Well, are you sure this isn’t just numerical error?’ You’d get a lot of, even, hostility.” Farmer was one of a group of graduate students in physics at the University of California at Santa Cruz who got so caught up in the excitement of chaos that they dropped their earlier specialties cold, to the dismay of the faculty. They thus entered a small but growing body of chaos lore as the “Santa Cruz Dynamical Systems Collective.”
For biologists, among others, all this was interesting. Consider for example, a population of fish. In a limitless ocean, it might grow smoothly from year to year, and biologists could describe the change with a mathematical model using a linear equation. But to make the model one step more realistic, biologists might look at fish in a pond. There, if one begins with just a few fish, the population grows; when there are too many for the amount of food, it declines. That one twist makes the system nonlinear. The waxing and waning of the fish population can still be described quite simply by an equation, repeated over and over again to calculate the change from generation to generation, but the equation can do startling things.
It could be that, no matter how many fish you start with in the pond, the population eventually settles down to a stable number, like a pendulum making shorter and shorter swings until it comes to a halt. But a few biologists, attuned to the mathematical developments of the 1960s and 70s, saw the other possibilities. One of them was Robert M. May, professor of biology at Princeton University.
“Most of our intuitions formed in mathematics or physics courses are that, if you have a fairly simple description of something, then it’s going to describe fairly simple behavior,” May says. “But if you take a population that is more boom-and-busty in its dynamics, you find that it doesn’t do something as simple as you thought.”
Working out the results on a computer, a scientist could alter the model, slowly changing one variable in an equation to make it represent a more boom-and-busty fish population, like turning a knob to raise the heat under a pot of boiling water. After a certain point, the population might start to oscillate, settling down not to one stable number but to two different numbers in alternating years. Then, as the boom-and-busty variable changed further, the period of oscillation might change to four years, then eight, and so on until suddenly the population became completely erratic, changing year by year in an unpredictable way. That pattern, called “period doubling,” is now recognized as one fundamental kind of transition to chaos.
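The knob-turning described above can be reproduced in a few lines. The article never names the fish-pond equation; the sketch below assumes the logistic map, the standard choice following May's 1976 Nature paper, and counts how many distinct values the population cycles through once transients die out.

```python
# Period doubling in the logistic map x -> r*x*(1-x), the standard
# model behind the boom-and-busty fish-pond story (the equation is an
# assumption here; the article does not name it).
def attractor(r, x=0.5, settle=1000, sample=128):
    for _ in range(settle):        # let the transient die out
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(sample):        # then collect the long-run cycle
        x = r * x * (1.0 - x)
        seen.add(round(x, 6))      # round so a finite cycle registers as finite
    return len(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, attractor(r))
# 2.8 -> 1 value (steady state), 3.2 -> 2 (alternating years),
# 3.5 -> 4 (period doubled again), 3.9 -> many (erratic)
```

Raising r plays the role of making the population more boom-and-busty: one stable value splits into two, then four, and finally into the unpredictable year-to-year wandering the article describes.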
The real world, of course, does have randomness and noise that don’t figure in an idealized calculation of this kind—a fish population might depend on a hundred different variables, beginning with the weather. But period doubling did turn up in experiments with fluids. A half-inch cube of liquid, for example, is slowly heated from below. The warm liquid rises in a smooth rolling motion. Then, as the temperature difference is increased, wobbles appear in the rolls, and then wobbles in the wobbles. A probe, measuring the speed of the liquid at a given point, finds first a constant speed, then an oscillation between two speeds, each repeating itself, say, once a second. Then the two speeds split to four, each showing up half as often, and so on, until chaos sets in.
“The world of biology out there outside the laboratory really has so much environmental noise and so much interaction among species that I don’t look to see the crisp patterns, the Feigenbaum numbers, appearing the way they do in physics,” May says. “And yet I think that many of the oscillations you see in the real world are oscillations of the kind that are latent in these nonlinear sorts of equations.” Some of his colleagues are even more optimistic, expecting chaos to help in understanding cycles in measles epidemics or predator-prey populations.
May published an intriguing article about all this in Nature in 1976, exhorting his colleagues to look at the odd behaviors that develop when simple equations act upon themselves over and over again, and to teach their students about them. “It was biologists who said, ‘Hey, this is really important—deterministic models do mad things,’” he says. “Mitch went one step further and said, ‘Not only do they do mad things but there are important regularities you might look for and take seriously.’
“It’s not enough to have an idea, you’ve got to understand how important it can be. And even if you’ve noticed how important it is, if there isn’t an audience of people ready to listen to you, it doesn’t matter.” In 1976, the audience was not quite ready.
Sometime in 1974, the physicists in the Theoretical Division at Los Alamos became aware that their new colleague was working 26-hour days, which meant that his waking schedule was slowly rolling in and out of phase with theirs. It also meant that he was taking long, lonely walks in the starlight that hammers down through the thin air of the mesas, and this sometimes distressed the local police. He was an unusual case. He had exactly one published article to his name, he was working on nothing that seemed to have any particular promise, yet, at the age of 29, he had already become a sort of resident savant among the savants.
“He acquired the reputation of being the superbright person you went to see when you couldn’t figure things out,” says Peter A. Carruthers, a senior fellow of Los Alamos who was then head of the division. “He became the perfect consultant, and when people could find him, they would bring him problems about anything at all.”
Some of his friends, though, were wondering whether he was ever going to produce any work of his own. As willing as he was to do impromptu magic with their problems, he didn’t seem interested in devoting his own research to any question less cosmic than, say, What is time?
Mitchell Jay Feigenbaum was born Dec. 19, 1944, in Philadelphia, where his father was a chemist in the Navy Yard, eradicating cockroaches on ships bound for the Pacific. After the war, his parents moved back to the Flatbush section of Brooklyn, his father working for the Port of New York Authority and then for Clairol and his mother working as a teacher in the public schools. Soon enough, the mystery of the universe announced itself to Feigenbaum through a Silvertone radio sitting in the living room.
“Every morning at 5:30 or 6 it would play,” he says. “For some reason it struck me as strange that it could make music with nothing coming in. Was remarkable. Because I understood phonograph at that point, when I was 4 or 5 my grandmother gave me a special dispensation that I could put on the 78 records.” When Feigenbaum talks, his passionate eyes opening like electronic shutters in sudden darkness, his hair sweeping stochastically back from his brow in the style of busts of German composers, he tends to start dropping certain articles and pronouns. “But the radio made music with nothing coming in, and it struck me to understand that was deepest thing in the world. It’s not the stuff of wisdom, but maybe one can be struck by anything. In fact, even now, I don’t know the answer.”
Growing up smart, through Samuel J. Tilden High School and City College, was in some measure a matter of steering an uneven course between the world of mind and the world of other people. “First I was immensely social when I was little. People would come wanting to beat me up; it’s a matter of course living in Brooklyn, and I would convince them that they should become my friend. I never fought when I was little, never, I completely avoided. And then suddenly in second grade, something clicked, and I realized I could learn things. But somewhere I started losing all my friends.
“By college I had none, working at night and totally preoccupied, and I completely missed puberty and adolescence and everything. But my last term, I realized I had some improvement to do, so I started showing up in the cafeteria. I would listen to people talking about incredible nonsense. There was this fellow talking about how his shaver could never give him a close shave, but when he took the head off it, it worked much better. I just sat myself down and forced myself to listen to that, and after a few months, I began to be able to talk to people. It was a serious thing. I realized I had completely lost contact with the world.”
His choice of specialties began pragmatically. “When I was 10 or 11, it was known in the air in Brooklyn, first of all that electrical engineers worked on radios, and second that an electrical engineer could maybe get a job and earn a salary. In college, it grew clear that what I wanted to know about a radio was more apt to be in physics.
“I always had a feeling that all things of humanities were pleasant things, but somehow they were what you’re supposed to know just by being a person. But the mathematical and scientific stuff, that was work, and that was how you contributed to the human race. I don’t know how I thought such a mishegoss.”
Feigenbaum graduated in 1964 and went on to M.I.T., where he got his doctorate in elementary particle physics in 1970. Then he spent a fruitless four years at Cornell and at the Virginia Polytechnic Institute—fruitless, that is, in terms of the steady publication of work on manageable problems that is essential for a young university scientist. In his own mind, he still believes this was a time when he did the best thinking of his life, on the most profound questions, but it led nowhere. He had nothing to show—it was all, as he says, loose and floppy. He believes that his career was almost lost. But Peter Carruthers, who knew him at Cornell as perhaps the most talented young student he had ever seen, brought him to Los Alamos.
When inspiration came to Feigenbaum, it was in the form of a picture, a mental image of two small wavy forms and one big one. That was all. He had been working for months on the problem of period doubling, running numbers through a Hewlett-Packard calculator for hours on end, then moving on to a big Control Data computer, and he had discovered his first universal constant. It is an expression of exactly how rapidly a system undergoes period doubling on its way toward chaos—always. It is 4.669201609 ….
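The discovery can be replayed in miniature. A minimal sketch, assuming the textbook logistic map x → rx(1−x) (the article does not say which equation Feigenbaum actually ran on his calculator): locate the "superstable" parameter values where the cycle of period 2^n passes through x = 1/2, then take the ratios of the gaps between successive values.

```python
def logistic(r, x, k):
    # apply the logistic map x -> r*x*(1-x) k times
    for _ in range(k):
        x = r * x * (1 - x)
    return x

def F(r, n):
    # zero when x = 1/2 lies on a "superstable" cycle of period 2**n
    return logistic(r, 0.5, 2 ** n) - 0.5

def superstable(count):
    # R0 = 2 and R1 = 1 + sqrt(5) are known in closed form
    R = [2.0, 1 + 5 ** 0.5]
    for n in range(2, count):
        r = R[-1] + (R[-1] - R[-2]) / 4.669   # seed with the approximate ratio
        for _ in range(50):                    # Newton's method, numeric derivative
            h = 1e-9
            slope = (F(r + h, n) - F(r - h, n)) / (2 * h)
            r -= F(r, n) / slope
        R.append(r)
    return R

R = superstable(8)
deltas = [(R[i - 1] - R[i - 2]) / (R[i] - R[i - 1]) for i in range(2, len(R))]
print(deltas)   # the ratios close in on 4.669...
```

Each gap between doublings is smaller than the one before by very nearly the same factor, and that factor, the same for a whole family of such maps, is Feigenbaum's constant.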
When Feigenbaum discovered this number, he did what any mathematician would do. He calculated it to great precision and tried multiplying and dividing different combinations of other well-known constants, pi and e and others, to see if his number was related somehow. It didn’t seem to be, and for months he couldn’t shake a suspicion that somehow he was still missing the point. He felt as though fate had dropped a jewel in his lap and he didn’t know what to do with it.
Then, at lunch one day, that vague wavy image came to him—no more, perhaps, than the visible top of a vast iceberg of mental processing that had taken place below the waterline of consciousness. It had to do with scaling, the way the small features of a thing relate to the large features, and it gave him the path he needed. Scaling, as Kenneth Wilson had discovered a decade before, can let you throw out a lot of unnecessary information and focus on what truly describes the essence of a structure. For period doubling, scaling showed not only when one value—a total population or a fluid speed—would break into two, but also just where the new values would be found. Scaling was an intimate feature of the peculiar world Feigenbaum was beginning to explore.
It is a world that depends on the existence of computers as no discipline ever has before. For Feigenbaum, to begin with, the computer was a way to bridge a professional gap between theory and experimentation. For particle physicists, any experiment is costly in time, money and technical sophistication. But to explore the realm of partial differential equations, Feigenbaum and others have made the computer their own accelerator, cloud chamber and camera all in one.
Early users of computers discovered that problems which could not be solved directly could often be simulated, and simulation has become a standard technique for things like weather forecasting. Given the weather at one time at thousands of points, and given equations for how weather changes, a computer can simulate the changes and see what happens. That is useful for certain problems; it is a powerful technique. Unfortunately, it is a complicated world. No existing computer, and no computer that will be built soon, can calculate enough points with enough speed to handle a rich system for long, which is why you are almost as well off flipping a coin as reading the forecast in today’s paper for Thursday’s weather.
Even if computers could be made unimaginably powerful, someone would still have to gather information more nimbly than is possible with satellites and balloons. And even if that could be managed, there is still the Butterfly Effect.
Feigenbaum’s technique, although it owes something to the spirit of simulation, is quite different. He does not use computers so much to solve problems as to explore them. He wants to develop intuition, the way an outfielder has developed intuition about fly balls by watching thousands of them sail skyward.
“It’s a way to go in and probe a problem,” Feigenbaum says. “You look at something, you see what comes out and you figure out a little better what was going on. It isn’t a way to get a careful set of data—the data, in fact, aren’t even interesting. The idea is to home in and see what are the principles. Each one of these problems that you start playing with, each one is slightly more complicated than the next. There are surprises in each of them, and one has to learn the repertoire of what these things do.”
It can be dauntingly esoteric, yet surprisingly basic. If you tune in for a moment to Feigenbaum describing his work to what he thinks of as a general audience, you might hear something like this: “… and all it is is a Duffing’s equation—this is a damped, driven, anharmonic oscillator. It’s a quartic potential, so just an x-cubed force law, I haven’t even put in a linear term there, it’s explicitly dissipative….”
But he might just as well be talking about a child getting pushed on a playground swing. It is easy for a mathematician to describe what happens, with just a few traditional equations. The swing gets a regular push, it accelerates down, decelerates up, loses a bit to friction, gets another push. The motion can quickly settle down to a pattern, with the swing coming to the same height each time. But, odd as it seems, the motion can also turn out to be utterly unpredictable.
The unpredictability comes from two essential features of this simple system—each producing a little mathematical twist, and each shared by the more interesting systems of nature, like the weather. The swing is damped, and it is driven: damped because friction is trying to bring it to a halt, driven because it is getting a periodic push. The weather is damped and driven in the same way, always losing energy through friction and dissipation of heat to outer space, and always getting a push from the sun.
Traditional mathematics can’t answer the easiest questions about the future of a child’s swing—how fast the swing will travel on average, how many times an hour the swing will get at least four feet off the ground. A computer can look at the problem by simulating it, rapidly calculating each cycle. But the tiny imprecision built into each calculation builds up, because this is a system with, yes, sensitive dependence on initial conditions. Before long, a simulation of even this trivial example starts to require immense computer power.
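That sensitivity can be demonstrated with a toy simulation. The sketch below assumes the form of Duffing's equation Feigenbaum describes (damped, driven, a pure x-cubed force law, often called the Ueda oscillator); the parameter values are a standard chaotic choice, not ones given in the article. Two runs starting a hundred-millionth apart end up in entirely different places.

```python
import math

def ueda_step(t, x, v, dt, k=0.05, B=7.5):
    # one Runge-Kutta step of  x'' + k x' + x^3 = B cos t
    def acc(t, x, v):
        return B * math.cos(t) - k * v - x ** 3
    k1x, k1v = v, acc(t, x, v)
    k2x, k2v = v + 0.5*dt*k1v, acc(t + 0.5*dt, x + 0.5*dt*k1x, v + 0.5*dt*k1v)
    k3x, k3v = v + 0.5*dt*k2v, acc(t + 0.5*dt, x + 0.5*dt*k2x, v + 0.5*dt*k2v)
    k4x, k4v = v + dt*k3v, acc(t + dt, x + dt*k3x, v + dt*k3v)
    x += dt * (k1x + 2*k2x + 2*k3x + k4x) / 6
    v += dt * (k1v + 2*k2v + 2*k3v + k4v) / 6
    return x, v

def run(x0, v0, steps=20000, dt=0.01):
    t, x, v = 0.0, x0, v0
    for _ in range(steps):
        x, v = ueda_step(t, x, v, dt)
        t += dt
    return x, v

xa, va = run(2.5, 0.0)
xb, vb = run(2.5 + 1e-8, 0.0)   # nudge the start by one part in a hundred million
print(abs(xa - xb))             # the nudge has grown to macroscopic size
```

The motion stays bounded, the equations are completely deterministic, and still the two swings part company: sensitive dependence on initial conditions in five lines of arithmetic.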
Feigenbaum uses his computers to poke around in systems like this, changing the friction or the force or the timing to see what happens. A screen that turns the data into color pictures is his way of allowing quick, intuitive recognition of patterns. At Los Alamos, despite the presence of the world’s finest supercomputers, it took him months to get the kind of graphics equipment he needed. The only appropriate terminals were in high-security areas—“behind the fence,” in the local parlance—and finally the Theoretical Division had to buy him one, disguising it budgetarily as a $20,000 “desktop calculator.”
Now, having left after an unhappy divorce ended a two-and-a-half-year marriage and made Los Alamos too small for comfort, he has one small computer in his apartment in Ithaca, N.Y., and a $100,000 model, paid for by the Army, in his Cornell office. He has customized them to run a version of Basic—a well-known language that encourages an interactive, trial-and-error style—jazzed up with some fast machine-language routines of his own. He is using them to explore systems with more complicated possibilities than the relatively simple ones he started with.
Watching Feigenbaum work can be mesmerizing. It is, first of all, a process of continual abstraction, winnowing out unnecessary detail to isolate the essentials. There is no need to chart the whole history of a system like a child’s swing, for example—one need only look at the top of each cycle. So Feigenbaum might begin by placing a point on a blank screen, its east-west position representing one variable, its north-south another. The computer picks up the dot and runs with it, tracing, perhaps, a closed curve that loops back on itself. Start somewhere else and the computer traces a different curve—the curves representing a regularity in the system that might have been hard to perceive directly. Somewhere else, and the dot remains stationary, representing a perfectly regular cycle that doesn’t change from swing to swing. When Feigenbaum is done, he has a multicolor map of the system’s behavior, with areas of recognizable pattern and areas of disorder, dots sprinkled randomly.

For some systems, a very special sort of map can appear. It is known as a strange attractor, and it has become a kind of emblem for the whole discipline. A strange attractor isn’t drawn smoothly. It begins with dots jumping about erratically, until a pattern starts to take shape gradually, as if appearing out of the mist. It is a kind of curve folded in upon itself over and over again.
The strange attractor has two odd qualities. Because it is made up of points scattered in an unpredictable order, two points close together on the curve may have begun unpredictably far apart—another example of sensitivity to initial conditions. But it is the second quality, a scaling quality, that has captured many scientists’ imaginations. When a piece of a strange attractor is magnified, it shows detail that was at first invisible. When the detail is magnified, even more detail appears, in patterns of infinite depth. The closer one looks, the more the curve seems to be repeating itself at smaller scales, like an unending sequence of Russian dolls one inside another. Or like the puffs upon puffs upon puffs in a cumulus cloud, or the whorls within whorls in a turbulent stream.
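A strange attractor of just this kind can be generated in a few lines. This sketch uses the Hénon map (a standard two-variable example, not one the article names): each iteration plants one dot, and the dots accumulate into a bounded, folded band.

```python
def henon(n, a=1.4, b=0.3):
    # iterate the Henon map at its classic parameter values
    x, y = 0.0, 0.0
    pts = []
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x
        if i >= 100:          # discard the transient before the orbit settles
            pts.append((x, y))
    return pts

pts = henon(20000)
xs = [p[0] for p in pts]
ys = [p[1] for p in pts]
# the dots never escape, yet never repeat
print(min(xs), max(xs), min(ys), max(ys))
```

Plotted on a screen, the band resolves under magnification into parallel strands, and those strands into finer strands still, exactly the scaling structure described above.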
“What Mitchell has done,” said Doyne Farmer of Los Alamos, “is build a mathematical analogue of a microscope. It allows you to look at iterated functions, to zoom in on certain pieces of them and blow them up, and to do that over and over again. You can use this microscope to see what the fine structure of these equations looks like.”
Feigenbaum’s first papers from Los Alamos, in 1976 and 1977, were returned unpublished. It didn’t matter. “In truth, the dissemination of information no longer goes through journals,” Feigenbaum says. “It all goes through a well-supported preprint system.” Indeed, in the age of photocopying, scientists have mailing lists for their papers, and staying current is very much a matter of making sure you are on the right mailing lists. Even now, Feigenbaum’s complete published works would barely fill a small book. Still, he has managed to save every rejection letter.
A 1977 conference in Como, Italy, organized by Joseph Ford of the Georgia Institute of Technology, was a turning point. “It became clear that somewhere a time had become right for such a subject,” Feigenbaum says. “People weren’t necessarily there to understand about fluids, or about particle physics, but about all sorts of curiosities that were arising in nonlinear problems.”
Ford himself remembers it as a kind of epiphany. “In disciplines from astronomy to zoology, people were just publishing in their narrow disciplinary journals, totally unaware that the other people were around. In their own areas, they were regarded as a bit eccentric. But if you just changed the words, they were all doing the same thing. These people were just weepingly grateful to find out everybody else was there. Mitch’s talk was the highlight of the meeting. It was one of those things whose time had come.”
Dr. Richard J. Cohen at M.I.T. is working on what he believes is probably the main health problem for Western society. Robert Shaw at the Institute for Advanced Study is on the verge of finishing a long-awaited paper on the physics of a dripping faucet.
Cohen, who holds a joint appointment in the Harvard-M.I.T. Division of Health Sciences and Technology and in the M.I.T. physics department, has developed a novel approach to the chaos that develops in the human heart in hundreds of thousands of cardiac deaths each year in this country. About half the time, chaos in the heart—ventricular fibrillation—comes after a blocked artery causes the death of tissue. But half the time, the cause appears to be only some kind of electrical instability.
With animal experiments, and with a computer model, Cohen and several colleagues are exploring the circumstances in which the heartbeat begins to disintegrate until the heart is no longer pumping blood. Their hope is to develop ways of identifying people who are at risk before, as Cohen says, “the catastrophic event occurs.” They have already observed period doubling and some of the other classic patterns in the onset of chaos.
“We’re trying to approach this problem from a very different perspective than has been done traditionally,” he says. “It is a clear instance of the Feigenbaum phenomenon, a regular phenomenon which, under certain circumstances, becomes chaotic, and it turns out that the electrical activity in the heart has many parallels with other systems that develop chaotic behavior.”
The dripping faucet, for example. Shaw, a physicist who is another product of the Santa Cruz Collective, has been studying it for several years as a theoretical paradigm of chaos. He, too, has both a computer model and an experimental setup—in his case a brass nozzle with a plastic tub of water above it. The drops of water interrupt a beam of light, to measure the intervals precisely, and the whole thing is controlled by a little computer.
A slow drip can be quite regular, each drop a little bag of surface tension that breaks off when it reaches a certain size. But the size of the drop changes slightly depending on the speed of the flow and depending on whether there is a little rebound from the drop before. And that is enough to make the system nonlinear.
“If you turn it up you can see a regime where the drops are still separate but the pitter-patter becomes irregular,” Shaw says. “As it turns out, it’s not a predictable pattern beyond a short period of time.”
It also shows sensitivity to initial conditions, so vibration has been a serious issue. Shaw found himself doing much of the work at night, when people would be less likely to be tramping by in the corridor. “Unfortunately,” he says, “it turns out that the thing is an excellent seismometer.”
For some scientists, there is reason to pause when they explore systems as simple as a faucet and find that they are, as Shaw says, eternally creative.
Practically speaking, it means that scientists have to think differently about the problems of nature. It changes their intuitions about what the answers can look like, and that changes the questions they ask. Chaos becomes a technique for doing science—but it also becomes a conceptual framework on which theoreticians can hang some of their most treasured suspicions about the workings of the universe.
To some physicists, chaos seems like a kind of answer to the problem of free will. The realization that the simplest, most deterministic equations can look just like random noise suggests—philosophically, at least—that the Calvinists’ deterministic view of the world can be reconciled with the appearance of free will. To people like Ford, chaos is also something like a death knell for the probabilistic ideas of quantum mechanics. “Chaos makes it absolutely clear what the limits are,” he says. Last year he gave a talk titled, after Einstein, “Does God Play Dice with the Universe?”
“The answer is yes, of course,” he says. “But they’re loaded dice. And the main objective of physics now is to find out by what rules were they loaded and how can we use them to our own ends.”
To some artificial intelligence specialists at the Institute for Advanced Study and at Los Alamos, chaos suggests a means of linking the simple behavior of neurons to the unpredictable behavior of brains. And to Feigenbaum himself, it is at least a glimpse of a way to link the analytic achievements of his profession to his intuitions about the world.
In the last few years, he has begun going to museums to look at how artists handle complicated subjects, especially subjects with interesting texture, like Turner’s water, painted with small swirls atop large swirls, and then even smaller swirls atop those. “It’s abundantly obvious that one doesn’t know the world around us in detail,” he says. “What artists have accomplished is realizing that there’s only a small amount of stuff that’s important, and then seeing what it was. So they can do some of my research for me.

“I truly do want to know how to describe clouds. But to say there’s a piece over here with that much density, and next to it a piece with this much density—to accumulate that much detailed information, I think is wrong. It’s certainly not how a human being perceives those things, and it’s not how an artist perceives them. Somewhere the business of writing down partial differential equations is not to have done the work on the problem.”
He is sitting at a small table in his bare Ithaca apartment. There is also a bed, his computer, a formidable record collection, two massive speakers and virtually no other furniture. He stubs out one cigarette and lights another. “One has to look for different ways,” he says. “One has to look for scaling structures—how do big details relate to little details. You look at fluid disturbances, complicated structures in which the complexity has come about by a persistent process. At some level they don’t care very much what the size of the process is—it could be the size of a pea or the size of a basketball. The process doesn’t care where it is, and moreover it doesn’t care how long it’s been going. The only things that can ever be universal, in a sense, are scaling things.
“In a way, art is a theory about the way the world looks to human beings. Somehow the wondrous promise of the earth is that there are things beautiful in it, things wondrous and alluring, and by virtue of your trade you want to understand them.” He puts the cigarette down. The smoke rises from the ashtray, first in a thin column and then, in a small gesture to universality, in broken eddies and tendrils that continue swirling upward to the ceiling.
June 10, 1984
Newton never said it, nor Galileo, nor Freud. The historian of science I. Bernard Cohen scoured the annals of discovery for scientists who announced that their own work was revolutionary, but he could produce only this short list: Symmer, Marat, Lavoisier, von Liebig, Hamilton, Darwin, Virchow, Cantor, Einstein, Minkowski, von Laue, Wegener, Compton, Just, Watson and Benoit Mandelbrot. Sixteen visionaries and cranks—and on a bright windless day it is the latest and last-named who appears in Woods Hole, Mass., ready to propagate his particular revolution at a scientific lecture and clambake.
He tours the Woods Hole Oceanographic Institution, mildly annoyed that his lecture has been left off the printed calendar of events. He steps into the auditorium, briefly intercepted by a young woman pressing her father’s business card into his hand. The calendar notwithstanding, every seat is taken. In the front row, three researchers puzzle over Scientific American’s computer pictures of the “Mandelbrot set,” a numerical construct billed as the most complex object in mathematics. He listens to his introduction (“… taught economics at Harvard, engineering at Yale, physiology at the Einstein College of Medicine….”). He clips a microphone onto the front of his short-sleeved shirt and begins confidently: “Very often when I listen to the list of my previous jobs, I wonder if I exist. The intersection of such sets is surely empty.” Indeed, for three decades, Benoit Mandelbrot failed to exist in a long list of different fields. He was always an outsider, taking an unorthodox approach to an unfashionable corner of mathematics, exploring disciplines in which he was rarely welcomed, hiding his grandest ideas to get his papers published—surviving mainly on the confidence of his colleagues at I.B.M.’s Thomas J. Watson Research Center in Yorktown Heights, N.Y.
But now, at the age of 61, he can watch his revolution advance. Mandelbrot has invented a new way of describing, calculating and thinking about shapes that are irregular and fragmented, jagged and broken up—shapes like the crystalline curves of snowflakes or the discontinuous dusts of galaxies. He had an insight into an organizing structure that lies hidden among the hideous complication of such shapes—an insight that is now transforming the way scientists look at the natural world, and what they find when they look.
Within a decade, Mandelbrot’s discovery has evolved from an oddity into a full-fledged science, providing a new language for talking about a particularly rich part of the universe. His tools have been taken up by physicists, chemists, seismologists, metallurgists, probability theorists, physiologists and economists. They are discovering applications in areas ranging from the search for underground oil deposits to the creation of polymers and the strengthening of steel. A new geometry has emerged, and it turns out to be nature’s own.
The shapes of classical geometry are lines and planes, circles and spheres, triangles and cones. They represent a powerful abstraction of reality, and they inspired a powerful philosophy of Platonic harmony. But for some kinds of complexity, they turn out to be the wrong kind of abstraction.
Clouds are not spheres, Mandelbrot is fond of saying. Mountains are not cones. Lightning does not travel in a straight line. The new geometry mirrors a universe that is rough, not rounded, scabrous, not smooth. It is a geometry of the pitted, pocked and broken up, the twisted, tangled and intertwined.
The understanding of nature’s complexity awaited a suspicion that the complexity was not just random, not just accident. It required a faith that the interesting feature of a lightning bolt’s path, for example, was not the straight line denoting its direction, but rather the distribution of its zigs and zags. Mandelbrot, beginning three decades ago, has dusted off some obscure mathematics that now illuminate these special kinds of complexity. He makes a claim about the world, and the claim is that such odd shapes carry meaning. The pits and tangles may be more than blemishes distorting the classic shapes of Euclidean geometry. They may be the keys to the essence of a thing.
To name the structures he was isolating, Mandelbrot rummaged through his son’s Latin dictionary one winter’s afternoon in 1975 and coined the word “fractal,” to connote both fractured and fractional.
Suddenly, fractals are everywhere. In physics and other fields, fractals conferences have become the rage. Scientists in some fields find the concept simply a convenience. Others, with a more hard-core attitude, have formed a cadre that one physicist calls a “fractals mafia.” The Department of Defense and private industry are putting more and more money into fractals research.
Mandelbrot himself, an I.B.M. Fellow and now also professor of the practice of mathematics at Harvard, has lately gathered many honors, including, this year, the Barnard Medal for Meritorious Service to Science. It is awarded every five years by the National Academy of Sciences under the administration of Columbia University for the most worthy “discovery in physical or astronomical science” or “novel application of science to purposes beneficial to the human race.”
Yet he remains a peculiar sort of genius. He is reviled by some colleagues, who think he is unnaturally obsessed with his place in history. They complain that he hectors them bitterly about giving due credit. Unquestionably, in his years as a perpetual heretic he honed an appreciation for the tactics, as well as the substance, of scientific achievement. As a Polish émigré in France, and then as a French émigré in the United States, he missed chunks of basic schooling, and he has remained on the periphery of the traditional dogmas—at home nowhere. In field after field, his insights were of a kind that might have been possible only for an outsider.
“From the very beginning, his angle has been to show people that they were making faulty assumptions simply because they didn’t have the tools to look beyond them,” says David B. Mumford, professor of mathematics at Harvard. “They had missed a whole range of things. Mandelbrot’s idea was that you could draw a thing, and that by drawing it you suddenly became aware of what was really going on. And the things he drew, I don’t think that even he anticipated that they would turn out to be so strikingly beautiful.”
Above all, he is a geometer. Where the main channels of mathematics have favored analysis—the manipulation of functions and the solving of equations—Mandelbrot’s way of thinking has always been visual, spatial, turning abstract problems into vivid, recognizable shapes. His work almost depends on its esthetic quality. “Geometry is sensual, one touches things,” Mandelbrot says. “I see things before I formulate them.”
In a compartmentalized academic world, a common tide can remain unfelt, and the recognition of a new kind of complexity has been developing slowly in many disciplines for nearly a generation. Mandelbrot’s work on fractals is a part of the revolution in understanding chaos, the study of turbulence and disorder in a whole range of phenomena. The unifying ideas have brought together scientists who thought their own observations were idiosyncratic, and who had no systematic way of understanding them.
Now they see universality. The insights of fractal geometry help scientists who study the way things meld together, the way they branch apart, or the way they shatter. It is a way of looking at materials—the microscopically jagged surfaces of metals, the tiny holes and channels of porous oil-bearing rock, the fragmented landscapes of an earthquake zone. A new kind of symmetry has emerged, not of left-to-right or front-to-back but of small-scale patterns to patterns on larger and larger scales—the self-similarity of a broccoli floret whose tiny bifurcations echo the branching of the stalk as a whole.
Fractals also describe the way things cluster in space and time—the distribution of galaxies, of Saturn’s rings, of (as one Japanese physics paper contends) cars on a crowded highway. Mandelbrot made his most important early leap of imagination when, at I.B.M., he was asked to examine the problem of “noise,” unexplained errors, in electronic transmission lines. It was a nasty and expensive problem for the resident experts. The errors were not completely random—they tended to come in bunches, and Mandelbrot realized that the degree of bunching remained constant whether he plotted them by the second or by the hour. Oddly, the patterns that arose, and the mathematical description of them, seemed to apply just as well to very different problems, from fluctuating cotton prices back through the 19th century to the rising and falling of the River Nile through two millenniums.
Things wear their irregularity in an unexpectedly orderly fashion. They have self-similarity on different scales. When you zoom in, looking closer and closer, the irregularities don’t smooth out. Rather, they tend to look exactly as irregular as before.
Mandelbrot himself believes that the notion of self-similarity strikes ancient chords in our culture. Indeed, an old strain in Western thought honors the idea, going back to Aristotle. Leibniz imagined that a drop of water contained a whole teeming universe, containing, in turn, water drops with new universes within. “To see a world in a grain of sand,” Blake wrote, and often that was just what scientists were disposed to see. When sperm were first discovered, each was thought to be a homunculus, a tiny but fully formed human.
But self-similarity withered as a scientific principle, and for a good reason. It just didn’t fit the facts. “The complete failure of this great old myth of self-similarity led to the opposite view,” Mandelbrot says, “that as you look closer, each time the world changes. I don’t pretend that atoms are like small solar systems—they’re not. But what I bring back in is a return to the old myth where it is applicable.”
The peculiar kinds of structures that display self-similarity often seem to have been repeatedly folded in upon themselves, so that the more one magnifies them the more detail they show. Mandelbrot likes to quote Jonathan Swift: “So, naturalists observe, a flea/Hath smaller fleas that on him prey;/And these have smaller still to bite ’em;/And so proceed ad infinitum.”
Self-similarity can lead to some seeming paradoxes. Mandelbrot put one forward when he asked, in the title of an early technical article, “How Long Is the Coast of Britain?”
“People usually give two answers,” he says. “Either ‘I don’t know, it’s not my field,’ or ‘I don’t know, but I’ll look it up in the encyclopedia.’”
In fact, it depends on the length of your ruler. As the scale becomes finer and finer, bays and peninsulas reveal new subbays and subpeninsulas, and the length—truly—increases without limit, at least down to atomic scales.
“You’re used to mathematics that characterize objects that are fundamentally smooth,” says Ralph E. Gomory, head of research for I.B.M., who encouraged Mandelbrot’s work on fractals when it was least fashionable. “Yet these things model a tremendous class of physical phenomena. It really enables us to realize the unity that is out there.”
It is hard to break the habit of thinking of things in terms of how big they usually are, how long they usually last. A hurricane, for example, is a storm of a certain size by definition. But the definition is imposed on nature. In reality, atmospheric scientists are realizing that tumult in the air forms a continuum, from the gusty swirling of litter on a city street corner to the vast cyclonic systems visible from outer space. The message of the new geometry is that such categories mislead—that the ends of a continuum are of a piece with the middle, where one finds storms like the violent rain cell last August, small and short-lived enough to elude the detectors at Dallas–Fort Worth Airport, but intense enough to bring down Delta Flight 191.
Blood vessels, from aorta to capillaries, form a continuum. They branch and divide and branch again until they become so narrow that blood cells are forced to slide through single file. They fill the three-dimensional space of the human body so efficiently that in most tissue no cell is ever more than three or four cells away from a blood vessel. It’s the Merchant of Venice syndrome, as Mandelbrot points out—not only can’t you take a pound of flesh without spilling blood, you can’t take a milligram. Yet the vessels and blood themselves take up little space, no more than about 5 percent of the body’s volume. As a design feature, that is important, because blood is physiologically costly. The value of fractal geometry to physiologists is that it provides a way of measuring the branching, a way of calculating a number that predicts how rapidly vessels will divide on smaller and smaller scales.
The key is that, mathematically speaking, fractals are shapes with fractional dimensions—a notion that turns out to have useful practical results, even though it is impossible to visualize. For shapes with the infinite convolutions of fractals, the number of dimensions expresses the ability to fill space, in a sense. A curve with 1.3 dimensions, for example, lies somewhere between a one-dimensional line and a two-dimensional plane. The greater the dimension, in effect, the greater the degree of complexity and roughness.
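For shapes that are strictly self-similar, this fractional dimension can be computed directly: a figure assembled from N copies of itself, each scaled down by a factor r, has similarity dimension log N / log r. A minimal sketch of that arithmetic (the function name is illustrative, not drawn from any source):

```python
import math

def similarity_dimension(copies, scale_factor):
    """Dimension of a self-similar shape built from `copies` pieces,
    each `scale_factor` times smaller than the whole."""
    return math.log(copies) / math.log(scale_factor)

# A curve whose every segment is replaced by four copies at one-third
# scale has dimension log 4 / log 3, about 1.26: rougher than a line,
# but far from filling a plane.
print(similarity_dimension(4, 3))
```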
“I started looking in the trash cans of science for such phenomena, because I suspected that what I was observing was not an exception but perhaps very widespread,” Mandelbrot says.
“I attended lectures and looked in unfashionable periodicals, most of them of little or no yield, but once in a while finding some interesting things. In a way it was a naturalist’s approach, not a theoretician’s approach. But my gamble paid off.”
Around the turn of the century, some mathematicians realized that it was possible to construct shapes with peculiar properties like infinite self-embeddedness.
Imagine a simple triangle, each side one foot long. Now imagine a transformation in which a new triangle, identical but smaller, is fitted onto the middle third of each side. The result is a Star of David. The shape’s border now has 12 segments totaling four feet in length—the transformation has lengthened the border by one-third.
If the transformation is repeated on each of the 12 sides, the shape—known as a Koch curve, after Helge von Koch, its inventor—starts to look like a snowflake. And if the transformation is repeated indefinitely, the shape develops some strange characteristics. For one thing, even though it contains a finite area—an area not much bigger than the original triangle—its outline has become infinitely long.
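The arithmetic behind that paradox is easy to check: each transformation multiplies the number of segments by four while dividing their length by three, so the perimeter grows by a factor of 4/3 at every step, while the enclosed area converges to a finite limit (8/5 of the starting triangle, by the standard Koch-snowflake formulas assumed here):

```python
def koch_perimeter(n, side=1.0):
    """Perimeter of the Koch snowflake after n transformations:
    every segment becomes four segments one-third as long."""
    return 3 * side * (4 / 3) ** n

def koch_area(side=1.0):
    """Limiting area of the snowflake, 8/5 of the original triangle."""
    return (2 * 3 ** 0.5 / 5) * side ** 2

# Perimeter: 3 ft, then 4 ft, then 5.33 ft, ... growing without bound.
for n in range(3):
    print(round(koch_perimeter(n), 2))
print(round(koch_area(), 3))  # yet the area stays below 0.7 square feet
```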
Such shapes seemed intriguing but perverse, clearly figments of the imagination and ostentatiously unlike anything found in nature. The mathematicians who enjoyed constructing such oddities became one side of a gap that emerged between mathematics and the physical sciences.
“They went around patting each other on the back for developing strange constructs outside of nature—they thought they were outsmarting Mother Nature,” says Richard F. Voss, an I.B.M. colleague of Mandelbrot’s. “What Benoit has done is let some of these monsters out of the closet. They’ve become the foundation for a new way of looking at the complicated parts of nature.”
The mind cannot fully visualize the whole complexity of infinite self-embeddedness. But to someone with a geometer’s way of thinking about form, this kind of repetition of structure on finer and finer scales can open a whole world.
“When I came in this game,” Mandelbrot says, “there was a total absence of intuition. One had to create an intuition from scratch. Intuition as it was trained by the usual tools—the hand, the pencil and the ruler—found these shapes quite monstrous and pathological.”
“Intuition is not something that is given,” Mandelbrot says. “I’ve trained my intuition to accept as obvious shapes which were initially rejected as absurd, and I find everyone else can do the same. These shapes provide a handle to representing nature, and intuition can be changed and refined and modified to include them.”
Scientists, no less than builders, work with the tools at hand—and the tools of classical analysis had led mathematicians away from the fanciful shapes that Mandelbrot’s imagination ultimately revived.
“The analyst’s bag of tricks consists of basically constructing functions,” says Mumford at Harvard, “typically solving differential equations that enable you to explore a situation. You would never think of drawing a thing. But with some things, by drawing them you could suddenly get to the heart of the situation.”
The drawing tool that makes this exploration possible, of course, is the computer, ideally suited to a kind of picture making that requires millions of repetitions of simple calculations.
When Mandelbrot began his work, he was not alone in examining the powerful notion of self-similarity. The physicists Kenneth G. Wilson, Michael Fisher and Leo Kadanoff did so in developing renormalization group analysis, a rich and highly productive advance that won Wilson the Nobel Prize in 1982. Renormalization is a way of slicing through the mathematical thickets that surround problems involving millions of pieces of matter interacting on scales from the very small to the very large.
“It was an analytical thing for Ken; it enabled him to compute things,” Mumford says. “The unique thing in Mandelbrot’s point of view was the realization that there was beautiful geometry in it.”
Now that Mandelbrot has become a fixture of the scientific lecture circuit, with his indispensable trays of color slides, with his wispy white hair straggling back from his heavy brow—now that he seems to be everywhere, some mathematicians are starting to wonder whether enough isn’t enough.
In fact, some of them call him a megalomaniac. “Of course,” says Mumford, “he is a bit of a megalomaniac, he has this incredible ego, but it’s beautiful stuff he does, so most people let him get away with it.”
The business of taking and giving credit can become obsessive in science. Mandelbrot does plenty of both. His book, The Fractal Geometry of Nature—his one major work, having gone through several incarnations in several languages—rings with the first person: “I claim ….” “I conceived and developed … and implemented ….” “I have confirmed ….” “I show ….” “I coined ….” “In my travels through newly opened or newly settled territory,” he writes, “I was often moved to exert the right of naming its landmarks.”
Some scientists don’t appreciate this kind of thing. Nor are they mollified that Mandelbrot is equally copious with his references to predecessors, some thoroughly obscure. They think it’s just his way of trying to position himself squarely in the center, setting himself up as a pope, casting his benedictions from one side of the field to the other.
They also say they resent the way he pops in and out of different disciplines, making his claims and conjectures and leaving the real work of proving them to others. And, for that matter, they don’t like the way he calls to complain about their articles when he feels they haven’t given him enough credit.
“Matters of credit are everywhere,” Mandelbrot says. “And there is a basic unlikelihood of what I have done having been done by one person, which makes people think it’s unlikely so it’s probably untrue. It’s something which is very much part of everyday life in academia, where tempers are strong.”
It hardly matters. The face of genius need not always wear an Einstein’s saintlike mien. But some scientists believe the reaction to Mandelbrot comes from his persistent crossing of disciplinary lines, his refusal to abide by the usual dogmas.
“His kind of work offends a lot of pure mathematicians, because it doesn’t involve the kind of solid proofs that they like to see,” says J. Doyne Farmer, a physicist at Los Alamos National Laboratory. “They get bothered when some guy comes along and does something that is clearly mathematics but not the way that mathematics is normally done—theorem, proof, theorem, proof, theorem, proof.
“There were a lot of mathematicians going around looking carefully at certain trees that they didn’t think even really existed in the physical world. Mandelbrot said there was a forest there. And he gives a very nice and compelling description of what that forest is and why it’s important.”
For decades, Mandelbrot feels, he had to play games with his work. He had to couch original ideas in terms that would not give offense. He had to delete his visionary-sounding prefaces to get his articles published. When he wrote the first version of his book, published in French in 1975, he felt he was forced to pretend it contained nothing too startling. That, he says, is why he wrote the latest version explicitly as “a manifesto and a casebook.” He was coping with the politics of science.
“The politics affected the style in a sense which I later came to regret,” he says. “I was saying, ‘It’s natural to ….’ ‘It’s an interesting observation that ….’ Now, in fact, it was anything but natural, and the interesting observation was in fact the result of very long investigations and search for proof and self-criticism. It had a philosophical and removed attitude which I felt was necessary to get it accepted. The politics was that, if I said I was proposing a radical departure, that would have been the end of the readers’ interest.
“Later on, I got back some such statements, people saying, ‘It is natural to observe ….’ That was not what I had bargained for.”
Mandelbrot’s attitude toward the scientific establishment, along with his attitude toward established science, has roots in his odd early history. He was born in Warsaw in 1924 to a Lithuanian Jewish family, his father a clothing wholesaler, his mother a dentist. The family moved to Paris in 1936. The war pushed him to Tulle, in south central France. His schooling was, at best, irregular and discontinuous. Even now, Mandelbrot claims not to know the alphabet. Having to use a telephone book, he says, reduces him to near helplessness.
Afterward, at the renowned École Polytechnique, he found himself mentally sidestepping mathematics that required a training he never had. He would find roundabout solutions, take geometric shortcuts, transforming the problems into shapes and working the shapes over in his mind. “I could never have done them the straight way,” he says. “The system molds people in a certain fashion. I was not affected.” When he came to the United States, first to study at California Institute of Technology in 1947, he remained an outsider, and that feeling stayed with him when he first began to bring his ideas about self-similarity to the attention of different fields.
“I don’t belong to any of the groups,” he says, “and my work flies against the natural tendency of everything to divide itself into pieces.” Looking back, Mandelbrot sees his approaches to people in various disciplines as having fallen into sadly predictable stages.
The first stage, as he recalls it, was always the same: Who are you and why are you interested in our field?
Second: How does it relate to what we have been doing, and why don’t you explain it on the basis of what we know?
Third: Are you sure it’s standard mathematics? (Yes, I’m sure.) Then why don’t we know it? (Because it’s standard but very obscure.) “Most of mathematics has never been read,” Mandelbrot says. “Mathematics is like the delta of a river. It has many branches, some broad and moving rapidly, some narrow and moving little or not at all. In physics, everything which has not worked out has disappeared. In mathematics, people can actually say, ‘A problem raised by Eisenstein in 1842 deserves further attention.’ Many of the things I have done belong to slow-moving branches.”
Fourth: What do people in these branches think about your work? (They don’t care, because it doesn’t add to the mathematics. In fact, they are surprised that their ideas represent nature.) To reach stages of acceptance and participation, Mandelbrot exploited the uncanny tendency of fractals to turn up in different fields. To economists, some of Mandelbrot’s fractal patterns looked indistinguishable from records of stock-market prices. To hydrologists, they looked like historical records of the rise and fall of rivers. By using simple mathematics based on the principle of self-similarity at different scales, and by varying the fractal dimension to mimic more or less violent tendencies to fluctuation, he produced convincing fakes of natural phenomena. Similar techniques, perfected at I.B.M., allow computers to produce astonishingly realistic landscapes—fractal coastlines, fractal mountains, fractal clouds—and they have been adopted wholesale by Hollywood’s special-effects industry.
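One classic way to produce such fakes is midpoint displacement: repeatedly subdivide a line, nudging each new midpoint by a random amount that shrinks at finer scales. A roughness parameter (the Hurst exponent, here `hurst`) plays the role of the dimension Mandelbrot varied; the sketch below is a generic illustration of the idea, not his actual procedure:

```python
import random

def midpoint_trace(levels=8, hurst=0.5, seed=0):
    """Build a self-similar 'price record' by midpoint displacement.
    Lower hurst -> rougher trace, i.e. more violent fluctuation."""
    random.seed(seed)
    pts = [0.0, 0.0]            # a flat trace: just two endpoints
    scale = 1.0
    for _ in range(levels):
        scale *= 0.5 ** hurst   # displacements shrink at finer scales
        finer = []
        for a, b in zip(pts, pts[1:]):
            finer.append(a)
            finer.append((a + b) / 2 + random.gauss(0, scale))
        finer.append(pts[-1])
        pts = finer
    return pts

trace = midpoint_trace()
print(len(trace))  # 257 points: 2**8 + 1
```

Plotted, such a trace is hard to tell from a genuine market chart or river record; only the roughness parameter distinguishes calm series from violent ones.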
Mandelbrot’s mimicking of stock market and river charts does not help in predicting particular rallies and floods. But it has brought some important realizations. Economists needed to understand the heretical idea that prices don’t change in a smooth, continuous flow. They can change abruptly, in instantaneous jumps. And dam builders, reservoir builders and risk insurers of all kinds needed to understand that traditional notions of probability were leading them to underestimate the likelihood of the rarest, most catastrophic events.
When a scientific revolution takes place—a thoroughgoing conceptual turnabout with its own lexicon and its own special tools—it may nevertheless leave the culture absolutely cold. But the ideas of fractal geometry, the ideas of chaos, the ideas of what specialists like to call nonlinear dynamics—these notions seem to work as metaphors, too, for the way nonscientists think about the world. Pure mathematicians treat metaphors with justifiable suspicion. But the recognition of structure amid complexity and randomness, the search for similar patterns at different scales, the willingness to think in terms of discontinuities and leaps, has spread to places where strict mathematics may or may not apply.
These changing ideas arise from a sense of the world that nonscientists share with scientists. No one has to imagine what the universe might be like on microscopic or telescopic scales—microscopes and telescopes make those images part of everyday experience. “Photographs in magazines have brought to the ordinary person things which Leibniz was thinking about in the 17th century,” Mandelbrot says. “Leibniz had no illustrations, he had no tools, the questions he was asking remained abstract.” At the same time, with computers everywhere, it is easier to develop an instinct for the possibilities of simple operations repeated over and over again. Programs to produce simple fractals on home screens have already become a popular staple of computer bulletin-board networks.
To Mandelbrot, with his obstinate fondness for the antique and the obscure, parallels to his geometry are everywhere, in art as much as science. His fractal ideal in architecture would be something like the Paris Opera, brimming with beaux arts complexity, a mixture of orderliness and wildness and—most important—something to hold the eye at any scale, from five feet away or 500 yards. The antifractal ideal, of course, comes from the Bauhaus, all Euclidean simplicity.
“As humans we live in a world with trees and mountains, which have this extreme wealth of structure,” he says. “Structures which are very spare seem unsatisfying.”
In some areas, the appreciation of scaling patterns matters most as a unifying principle. “There is a very subtle question which is quite important to my viewpoint of the world,” Mandelbrot says. “Is the climate different from the ordinary changes in the weather? Are there some mechanisms which rule daily fluctuations, and different mechanisms for yearly fluctuations, droughts and so on?”
The same question applies to economics. Daily fluctuations are treated one way, while the great changes that bring prosperity or depression are thought to belong to a different order of things.
“In each case,” Mandelbrot says, “my attitude is, let’s see what’s different from the point of view of geometry. What comes out all seems to fall on a continuum—the mechanisms don’t seem to be different.”
Surely droughts and depressions have real causes, physical and political causes. But the claim of fractal geometry is that those causes, whatever they are, belong to the same universe that controls events at much finer scales.
Graphic computer representations of some simple fractals look so real—so alive—that they sometimes seem like parlor tricks. At I.B.M., Richard Voss creates moss, seaweed, roots. “How can something with such simple rules produce structures that look so alive?” he asks. There is no answer, except a hopeful suspicion that this new mathematics has hit upon a fundamental organizing principle of nature. Perhaps laws of adaptation and economies of genetic coding tend to stay in certain paths. These are notions just beginning to be explored.
“The questions the field attacks are questions people ask themselves,” Mandelbrot says. “They are questions children ask. What shape is a mountain? Why is a cloud the way it is? People want to describe this universe, this reality, which is so very fugitive.”
December 8, 1985
Snowflake’s Riddle Yields to Probing of Science
Science has conquered the snowflake problem.
In resolving two of nature’s most poetic and maddening riddles—why are snowflakes symmetrical, and why are they all different—theoretical physicists have created a new body of mathematics for the laws that control the delicate branching growth of an unstable solidifying crystal.
Where traditionally snowflakes were left to weather experts and atmospheric scientists, now they have become part of a growing science of pattern formation that is drawing together theorists, computer modelers and engineers with practical problems ranging from metallurgy to flame propagation to oil recovery. The principles that govern ice crystals apply as well to metallic crystals, whose microscopic structure helps determine the strength of cast alloys.
Generations of snowflake-watchers sketched and catalogued the variegated patterns formed by airborne ice crystals: plates and columns, crystals and polycrystals, needles and dendrites. But snowflakes obey mathematical laws of surprising subtlety, and it has been impossible to predict precisely how fast a tip will grow, how narrow it will be or how often it will branch.
“In the last two years, those problems have been solved,” said Herbert Levine of the Schlumberger-Doll Research Center in Connecticut, one of the pioneers of pattern-formation research. The most recent breakthrough has provided a working mathematics for tying together the forces that stabilize the growing patterns and the forces that destabilize them.
By understanding the interplay of randomness and determinism, large-scale processes and microscopic processes, this new wing of physics has related a variety of subjects that were formerly treated separately.
“We’ve reached a very interesting point scientifically where we’re starting to look at a whole bunch of older problems of pattern formation in nature, how complex formations emerge out of a generally featureless soup,” said James S. Langer of the Institute for Theoretical Physics in Santa Barbara, Calif. “We finally seem to have a good idea of what controls these things.”
A key to the new approach has been the availability of computers with which scientists could propose models, test them, make pictures of the results and then improve their models. Only recently, though, after more than five years of research by several groups, have computer simulations succeeded in realistically capturing the physics of crystal growth.
One problem is that such growth entails, as Dr. Langer says, “a highly nonlinear unstable free boundary problem,” meaning that models need to track a complex, wiggly boundary that changes dynamically. “That’s tough, trying to understand where this boundary is moving,” he said. “If you guess wrong, the computer program just blows up on you.”
Another problem has been deciding which of the many physical forces involved are important and which can safely be ignored. Most important, as scientists have long realized, is the diffusion of the heat released when water freezes.
When solidification proceeds from outside to inside, as in an ice tray, the boundary between solid and liquid generally remains stable and smooth, at a speed controlled by the ability of the walls to draw away the heat. But when a crystal solidifies outward from an initial seed—as a snowflake does, grabbing water molecules while it falls through the moist air—the process becomes unstable.
Any bit of boundary that gets out ahead of its neighbors gains an advantage in picking up new water molecules and therefore grows that much faster—the “lightning-rod effect.” Tips, or “dendrites,” form, moving rapidly outward and tending to give birth to subbranches.
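The lightning-rod effect is captured by a famous toy model, diffusion-limited aggregation (the Witten–Sander model): particles wander at random until they touch a growing cluster and stick, and protruding tips, being easier to reach, grow fastest. A minimal on-lattice sketch, offered as a generic illustration rather than the researchers’ own simulation:

```python
import math
import random

def grow_dla(n_particles=150, seed=42):
    """Grow a cluster by diffusion-limited aggregation on a square grid."""
    random.seed(seed)
    cluster = {(0, 0)}             # the initial seed particle
    r_max = 1                      # current cluster radius
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        # Launch each walker on a circle just outside the cluster.
        launch = r_max + 5
        angle = random.uniform(0, 2 * math.pi)
        x = round(launch * math.cos(angle))
        y = round(launch * math.sin(angle))
        while True:
            dx, dy = random.choice(steps)          # one random-walk step
            x, y = x + dx, y + dy
            if x * x + y * y > (launch + 20) ** 2:
                # Wandered too far: relaunch near the cluster.
                angle = random.uniform(0, 2 * math.pi)
                x = round(launch * math.cos(angle))
                y = round(launch * math.sin(angle))
                continue
            if any((x + ex, y + ey) in cluster for ex, ey in steps):
                cluster.add((x, y))                # touched the cluster: stick
                r_max = max(r_max, round(math.hypot(x, y)) + 1)
                break
    return cluster

print(len(grow_dla()))  # 151 occupied sites, in a straggly branched blob
```

The clusters this produces are themselves fractal, all spindly arms and almost no filled interior, because a walker rarely penetrates a fjord before sticking to a tip.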
This much has been known for years. But the physics of heat diffusion and unstable growth cannot completely explain the patterns scientists observe when they look at snowflakes under microscopes or grow them in the laboratory. Recently Dr. Langer’s group in California and Dr. Levine’s in Connecticut separately worked out a way to incorporate another process: surface tension.
Where diffusion creates instability, surface tension creates stability, preferring smooth boundaries like the wall of a soap bubble. It costs energy to make surfaces that are rough. And where diffusion is mainly a large-scale, macroscopic process, surface tension is strongest at the microscopic scales.
The competition between these forces makes for tricky mathematics, particularly since the equations must relate scales of millimeters to scales of molecules. Traditionally, physicists assumed that for practical purposes they could disregard the tiny surface-tension effects.
“That turned out to be just wrong,” Dr. Levine said. “The breakthrough was showing that by throwing away this particular physical effect one was throwing away the right solution to the problem.”
The reason is that the surface effects, unlike large-scale diffusion, prove acutely sensitive to the molecular crystal structure of a solidifying substance—in the case of ice, a natural hexagonal configuration. That gives ice a built-in preference for six directions of growth.
To their surprise, the physicists found that the delicate balancing act of stability and instability amplifies this microscopic preference, creating the magnificent lacework characteristic of snowflakes.
In effect, a snowflake records the history of all the changing weather conditions it has experienced. As a growing flake falls to earth, typically floating in the wind for an hour or more, the choices made by the branching tips at any instant depend sensitively on such things as the temperature, the humidity and the presence of impurities in the atmosphere.
The nature of turbulent air is such that any pair of snowflakes will follow very different paths, and enough combinations of patterns are possible to more than justify the folklore that all snowflakes are different. But why are all six arms of a snowflake alike?
“Lots of people have thought that there has to be some mechanical equivalent of somebody sitting at the center of the snowflake and telling all of them to do the same thing,” Dr. Langer said.
But first of all, careful examination shows that snowflakes are not exactly symmetrical. And second, the six arms of one snowflake, less than a millimeter across, will have experienced nearly identical growing conditions—much closer than any two snowflakes experience, and close enough to explain their similarity.
In metallurgy, specialists seek a precise understanding of what controls the speed of crystal growth and the degree of irregularity because these, in turn, often control the tensile strength of an alloy after it solidifies. These scientists are also benefiting from the new style of using computer models to study such problems.
“There’s a brand new interaction between technology and science, connected largely by the computer,” Dr. Langer said. “People in industry say, ‘We’re dealing with more and more complex systems, and we’re not going to do it by hunt-and-find any more—it’s too tough.’”
Meanwhile, physics groups at École Normale Supérieure in Paris and California Institute of Technology in Pasadena are pursuing the new approach to pattern formation, and a physicist at Emory University in Atlanta, Fereydoon Family, has used the mathematics to create startlingly lifelike computer pictures of snowflakes.
One computer snowflake, an aggregation of 10,000 or more particles, requires about eight hours of high-speed calculation, and very slight changes in temperature or humidity produce vivid changes in the resulting patterns, Dr. Family said. He will present the results at the March meeting of the American Physical Society in New York.
Experimentalists, too, are pushing the science of pattern formation forward. Jerry P. Gollub, a physicist at Haverford College and the University of Pennsylvania, has conducted a series of experiments designed to shed light on the precise shape of the convoluted structures that appear behind the growing tip of a dendrite, a problem that continues to elude theorists.
Using a microscope and crystals of ammonium bromide, he has been able to resolve details as small as 500 angstroms, roughly a thousandth the width of a human hair. Most recently he has been able to characterize the irregularities within rows of these sprouting side branches.
In the back of their minds, many of these physicists nurse a belief that their work on pattern formation may apply to developmental biology as well. Some types of algae, for example, closely resemble patterns under investigation by physicists.
“There is a clear connection between this problem of stability and the early differentiation of certain organisms when they start from an egg and gradually acquire structure,” Dr. Gollub said. “What we’re really doing is pushing science in a new direction through a simultaneous development in mathematics and experiment.
“On the one hand snowflakes are important because there are lots of crystals in nature,” he said, “but in the long run I think the most important aspect will be this general development of tools and ways of thinking. It is those things that are most likely to carry over into other areas of investigation.”
January 6, 1987
Tales of Chaos: Tumbling Moons and Unstable Asteroids
Bad news for those who thought they could at least rely on the stable clockwork predictability of heavenly bodies: the solar system has some surprises in store.
A new picture of celestial motion suggests that the earth’s neighbors include erratically tumbling moons and unstably orbiting asteroids, a chaotic menagerie that seems to provide, among other things, a solution to the longtime mystery of where meteorites come from.
“We have a strong prejudice that the solar system is dull—perhaps pretty, but not very interesting dynamically,” Jack Wisdom of the Massachusetts Institute of Technology told an audience of physicists, mathematicians, chemists and biologists here this week. In fact, he said, the solar system’s spins and orbits are neither so simple nor so regular as Newton’s descendants might expect.
New computer calculations, based on observations from the Voyager satellites and ground-based telescopes, show that just about every moon—except the earth’s—has experienced millions of years of chaotic tumbling, Dr. Wisdom reported. As he showed with a computer-generated movie, “tumbling” is a kind of motion utterly unlike the familiar spin of a body around an axis.
A tumbling moon falls end over end, twists sidewise, speeds up, slows down, all the time obeying Newton’s completely deterministic laws of motion, yet defying prediction in a way that scientists used to consider impossible. One of Saturn’s moons, a 250-mile-long football-shaped body called Hyperion, appears to be tumbling now, and its unpredictability is such that even if Voyager I had measured Hyperion’s position and speed to 10-digit accuracy, God’s own computer would not have been able to calculate where it would be when Voyager II came by a year later.
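The underlying phenomenon, sensitive dependence on initial conditions, shows up even in the simplest chaotic systems. The logistic map below is a standard textbook example, not Dr. Wisdom’s model: two trajectories starting ten decimal places apart agree for a while, then diverge completely.

```python
def logistic_gaps(x0=0.2, delta=1e-10, r=4.0, steps=60):
    """Iterate x -> r*x*(1-x) from two nearly identical starting
    points and record how far apart the trajectories drift."""
    x, y = x0, x0 + delta
    gaps = []
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        gaps.append(abs(x - y))
    return gaps

gaps = logistic_gaps()
print(gaps[0])    # still around 1e-10: the orbits agree
print(max(gaps))  # order 1: ten-digit accuracy did not help
```

Because the gap roughly doubles with each iteration, every extra digit of initial accuracy buys only a few more steps of prediction.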
Hyperion is not alone, according to Dr. Wisdom’s latest work. The moons of Mars, Phobos and Deimos, went through tumbling episodes lasting on the order of tens of millions of years, and the propensity to tumble is the rule, not the exception.
Strangely, tumbling results from precisely the same gravitational laws and tidal forces that cause moons to lock into a stable orbital “resonance” with their planets. The earth’s moon spins once for every orbit, thus always showing the same face. Nor must the ratio be one to one. Mercury, for example, was long thought to keep the same face pointed to the sun; in fact it rotates exactly three times in every two of its “years.”
Such resonances are not permanent, because energy drains away, lost to tidal friction. And when a moon breaks out of a locked mode, tumbling can ensue. The lesson for science—and it has been a hard lesson even for physicists to grasp—is that chaotic behavior is just the flip side of a simple Newtonian coin.
“Here is a departure from everything you’ve expected,” said James Yorke, a mathematician from the University of Maryland. “You have no special perturbation or complexity. It’s just basic physics—yet it’s doing something that nobody thought about.”
This was Chaos ’87, a conference subtitled “The Physics of Chaos and Systems Far from Equilibrium” and attended by theorists and experimenters from 14 countries, including China and India. “We’re the revolutionaries plotting to bring order into chaos,” David Campbell of Los Alamos National Laboratory told his colleagues.
The scientists belong to a rapidly growing interdisciplinary field devoted to complex dynamical behavior in systems from astronomy to biology. They presented new results on the fantastic structures that arise in crystal growth and on the bizarre palm tree–like formations of intense shock waves. They described highly sensitive experiments designed to tackle the problem of tumult in fluid flows. One biologist promised a chaos approach to the problem of sexual reproduction, but ran out of time.
As some saw it, the latest results suggest progress in moving beyond “simple chaos” to more complex systems—for example, in one version of the jargon, beyond “wimpy turbulence” to “macho turbulence.” At any rate, they argued, the era of simple, linear mathematics is over. “Ideal problems are just in the textbooks, for the joy or desperation of students,” said F. Tito Arecchi, a laser physicist from Istituto Nazionale di Ottica in Florence.
A new theoretical approach to fluid turbulence came from Steven A. Orszag of Princeton University, who also showed his audience a slide depicting a newspaper advertisement for the Off Broadway play A Girl’s Guide to Chaos, described as a belly full of laughs and a glorious gigglefest.
“I hope these views of chaos don’t apply here,” Dr. Orszag said. He need not have worried.
As is customary at high-level conferences, speakers did not always pitch their talks to the broadest possible segment of their hundreds of listeners. Mitchell J. Feigenbaum of Rockefeller University, whose theoretical breakthroughs in the 1970s gave strong impetus to the study of chaos, was one of several speakers who presented dazzling exercises in mathematical physics, greeted by a presumably respectful silence.
Did Dr. Feigenbaum feel that his presentation was understood? “Certainly,” he said. “I’m sure Yves Pomeau understood it,” he said, referring to a theorist from École Normale Supérieure in Paris. “And I know Tito Arecchi understood it, because he’s heard part of it before.”
A key to the progress in chaos—both cause and consequence—is the ubiquity of computers. James Glimm, a mathematician from New York University’s Courant Institute, studying a variety of “things that wrap around and get mixed up and entangled with one another,” went so far as to say that there are now three ways of doing science.
“There used to be two: theory and experiment,” he said. “Now we have computing as well.” And it was computing that enabled Dr. Wisdom of M.I.T. to solve the meteorite mystery.
Astronomers have long suspected that meteorites striking the earth were stragglers from the asteroid belt, the vast region between Mars and Jupiter filled with flotsam and jetsam. Other sources were proposed—comets, for example—but the asteroid belt has the most obvious abundance of suitable material.
The problem is, if meteorites are wayward asteroids, how do they get from there to here?
Some scientists proposed asteroid collisions as a mechanism. But to change an asteroid’s orbit enough to send it all the way to earth, a collision would have to cause a change in speed of well over 10,000 miles an hour. Any collision capable of imparting that much energy would smash an asteroid into oblivion.
“It just doesn’t work,” said George Wetherill of the Carnegie Institution in Washington, an authority on the solar system who suggested the problem to Dr. Wisdom. “You need a gentle gravitational mechanism.”
Dr. Wisdom, meanwhile, was thinking about a different problem, the “Kirkwood gaps” in the asteroid belt. Astronomers have known for a century that asteroids with certain orbital periods are strangely sparse.
It happens that the missing asteroids are the ones that would have certain resonances with the orbit of Jupiter, such as 3 to 1, 5 to 2 and 7 to 3. So some theorists speculated that the periodic gravitational kick of Jupiter might account for the missing asteroids, but calculations failed to demonstrate this idea. On the contrary—in the long run, it seemed, asteroid orbits should remain stable.
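The arithmetic behind those resonances follows from Kepler’s third law: an asteroid completing p orbits for every q of Jupiter’s has a period of Jupiter’s period times q/p, and a period of T years implies a semi-major axis of T^(2/3) astronomical units. A rough sketch (the period value and the rounding are ours, not from the article):

```python
# Locating the Kirkwood gaps from Jupiter's resonances via Kepler's
# third law (a = T**(2/3), with T in years and a in astronomical units).
T_JUPITER = 11.86  # Jupiter's orbital period, in years

def gap_semi_major_axis(p, q):
    """Semi-major axis (AU) of an asteroid in a p:q resonance with Jupiter."""
    period = T_JUPITER * q / p  # p asteroid orbits for every q Jovian orbits
    return period ** (2.0 / 3.0)

for p, q in [(3, 1), (5, 2), (7, 3)]:
    print(f"{p}:{q} resonance -> gap near {gap_semi_major_axis(p, q):.2f} AU")
```

The computed positions—roughly 2.5, 2.8 and 3.0 astronomical units—coincide with the observed gaps in the belt.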
But Dr. Wisdom, using new mathematical techniques and a special-purpose celestial mechanics computer, built for him by an M.I.T. colleague, has solved both problems at once. Over very long time scales, when the perturbing influences of both Jupiter and Saturn are taken into account, the seemingly regular orbits of asteroids that stray into the Kirkwood gaps turn chaotic.
For millions of years, he found, such an orbit seems predictable. Then the path grows increasingly eccentric until it begins to cross the orbit of Mars and then Earth. Collisions or close encounters with those planets are inevitable.
To many astronomers, such chaotic “bursts” of eccentricity seemed implausible. “When Wisdom showed you could get this chaotic effect, it was met with considerable skepticism,” Dr. Wetherill said. “There was a tendency among almost everybody to say, ‘Well, he just doesn’t know how to integrate yet, wait till he gets a little older, he’ll understand these things.’”
But most are now convinced. Among other things, his trajectories correspond to a quirky piece of meteorite lore known as the afternoon effect: the tendency of meteorites to fall most heavily in the afternoon. As it happens, afternoon is when any part of the earth is facing the right way to capture misguided asteroids.
“It’s sort of fun to see the true harmonies of the spheres here,” Dr. Wisdom says. His next project is Saturn’s rings, and he is also carrying out intensive calculations to check whether the solar system as a whole is stable or whether the planets will eventually go chaotic themselves.
It is an open question, but so far, he said, the solar system seems stable.
January 20, 1987
Fluid Math Made Simple—Sort Of
Of all the hard problems occupying the world’s supercomputers, one heads the list: calculating the flow of fluids.
The equations that engineers use to study how air flows past the wing of an aircraft or water around a ship’s hull are notoriously hard to solve, straining even the most powerful computers. Yet these kinds of calculations pop up again and again, not only in designing planes, ships, rockets and automobiles but in forecasting the weather, which is one of the most difficult fluid-flow problems of them all.
For the last few decades, computer scientists have tried to extend their grasp of such problems by designing ever more powerful computers, like the Cray-2 recently dedicated by the National Aeronautics and Space Administration at the Ames Research Center in Mountain View, Calif. But even it is inadequate. What is needed, some physicists have come to believe, is a radically new approach to mathematical modeling, the method by which computers are used to simulate and predict the phenomena of the world.
Guided by this vision, a strange new wing of theoretical physics has found a way to sidestep completely the calculations that computers find so fiendish. Instead of using complex equations to mirror fluid flow, physicists are creating fantastically simple mathematical models known as cellular automata.
A cellular automaton is a large array of cells, like the squares of a checkerboard or the hexagons of a honeycomb, which can be projected onto a computer screen. On this lattice, dots hop from cell to cell, colliding and recoiling according to a few simple rules programmed into the computer. From millions of these minute interactions, a picture emerges of such familiar physical phenomena as the pattern water takes when it flows past a rock in a stream.
These models, which are called automata because they run according to their own rules of motion, provide physicists with a miniature simulation of the universe. In the real world, space and time are continuous, or so physicists assume. In the simplified world of cellular automata, space is represented by the grid of cells. In the real world, molecules can move freely in infinitely many directions, at infinitely many speeds. In cellular automata, the dots are allowed to occupy only the cells of the lattice and can travel at only one speed.
By imposing so many constraints, these models seem at first to sacrifice the richness of the real world. But they gain in return an astonishing improvement in the speed of calculation. And to the surprise of some physicists, they seem to work.
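The flavor of these models can be conveyed in a few dozen lines of code. Below is a minimal sketch of an HPP-style lattice gas on a square grid—our simplified stand-in, not the researchers’ hexagonal models: each cell holds at most one particle per direction, every particle moves at the same speed, and the only collision rule is that a lone head-on pair scatters at right angles.

```python
import random

# Minimal HPP-style lattice gas on a square grid (illustrative sketch).
# Each cell is a 4-tuple of booleans: one occupation slot per direction.
N, E, S, W = 0, 1, 2, 3
SIZE = 16

def step(grid):
    # Collision: a lone head-on pair (N+S or E+W) scatters 90 degrees;
    # every other configuration passes through unchanged.
    collided = []
    for row in grid:
        new_row = []
        for cell in row:
            n, e, s, w = cell
            if n and s and not e and not w:
                new_row.append((False, True, False, True))
            elif e and w and not n and not s:
                new_row.append((True, False, True, False))
            else:
                new_row.append(cell)
        collided.append(new_row)
    # Streaming: each particle hops to the neighboring cell it faces
    # (periodic boundaries, so nothing leaves the lattice).
    out = [[[False] * 4 for _ in range(SIZE)] for _ in range(SIZE)]
    for y in range(SIZE):
        for x in range(SIZE):
            n, e, s, w = collided[y][x]
            if n: out[(y - 1) % SIZE][x][N] = True
            if s: out[(y + 1) % SIZE][x][S] = True
            if e: out[y][(x + 1) % SIZE][E] = True
            if w: out[y][(x - 1) % SIZE][W] = True
    return [[tuple(c) for c in row] for row in out]

random.seed(1)
grid = [[tuple(random.random() < 0.3 for _ in range(4))
         for _ in range(SIZE)] for _ in range(SIZE)]
n0 = sum(sum(cell) for row in grid for cell in row)
for _ in range(50):
    grid = step(grid)
# Particle number is exactly conserved from step to step, as in a real fluid.
```

Averaged over many cells and many steps, the particle density and momentum of such a gas behave like a crude fluid—which is the surprise the physicists describe.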
“We’re realizing that we don’t need all these details that we’ve been worrying about all these years,” said Brosl Hasslacher, a physicist at Los Alamos National Laboratory. “The reason people are super excited is that these skeletal micro-worlds, which completely eliminate the details of real fluids, capture everything. I think we’ve just begun to see the power of it.”
American and French researchers say the new models are especially useful for calculating flows around complicated shapes, like the rear-view mirrors on the sides of a car. In a standard supercomputer, such shapes require many numbers to describe, making the calculations extremely complex. In cellular automata, shapes can simply be drawn onto the grid with a computer graphics terminal.
Because the same simple calculations are repeated millions of times, cellular automata are especially well suited to a new breed of computers that perform parallel processing. Even the powerful Cray supercomputers are essentially serial machines: they break a problem into thousands of pieces that must be funneled one at a time through a large central processor. Parallel machines avoid this bottleneck by using many processors to solve many pieces of a problem simultaneously.
The idea of using cellular automata to simulate fluids is new, but the models go back to John von Neumann, who in the 1940s was one of the founders of computer science. Recently several theorists, most notably Stephen Wolfram, director of the new Center for Complex Systems Research at the University of Illinois, and Tommaso Toffoli at the Massachusetts Institute of Technology, have turned to these simplified models as a way of studying a variety of fundamental questions in physics.
“You’re playing God’s game,” said Dr. Toffoli, whose group has built machines designed especially for cellular automata. “What you have essentially is a universe synthesizer: You give the laws, the rules of the game.”
The rules can be adjusted for many sorts of problems. At the Illinois center, Norman H. Packard has designed cellular automata in which the tiny dots interact to generate the kaleidoscopic patterns typical of the formation of snowflakes and other crystals.
For some scientists, this communication across scales, from the small and simple to the large and complex, gives cellular automata their deepest charm.
“I find it fantastic and beautiful that the tiny, trivial world of the lattice gas can give rise to the intricate structures of hydrodynamic flow,” Leo Kadanoff of the University of Chicago wrote in Physics Today. “The physical universe is also wonderfully simple at some levels, but overpoweringly rich in others.”
April 19, 1987
That the stock market embodies turbulence, mayhem and unpredictability, no survivor of October 1987 can doubt. Some economists, borrowing the vocabulary of a new branch of science, now believe that it also represents chaos.
The science of chaos—a fast-growing, interdisciplinary exploration of complex systems from the weather to the human heart—has challenged conventional approaches to random-seeming phenomena, offering innovative techniques for unraveling disorder. Economists are beginning to apply those techniques to the especially intricate and self-conscious brand of disorder displayed by the financial markets. In the aftermath of the explosive movements of the last month, some researchers believe that the methods of chaos theory may be particularly appropriate to the stock market, a system famous for creating trends and patterns and then violently defying them.
As applied to economics, a notoriously fickle science, such ideas are uncertain and untested. Nevertheless, for those who follow the market closely, they offer a new way of looking at familiar problems, from the market’s internal workings—the intricate web of trading that runs from brokerage computers to the floors of the exchanges—to the overarching forces of the world’s economy.
“We now know very clearly that stock market prices cannot be analyzed by the old procedures that we used,” said James Ramsey, a New York University economist who has become a specialist in chaos. “People are asking more cogent questions, and they’re observing behavior that begins to be amenable to the ideas of chaotic dynamics.”
The stock market is the economy’s most visible showplace for the waxing and waning of wealth and confidence—a sensitive hybrid of the facts of corporate finance and the whims of mass psychology—and even the relatively unorthodox economists who have been thinking about chaos disagree about just how their new ideas apply. Nevertheless, they are already engaging in some provocative speculation:
Some contend that the market may be becoming unbalanced as information flows more efficiently and as traders grow more sophisticated in responding to it. As the global network of buying and selling becomes increasingly interconnected and computerized, they suggest, it may be leading to volatility of a kind never before seen.
Others suggest that chaotic leaps in prices undermine some key techniques for hedging against loss, requiring a reassessment of traditional market safeguards. Chaos theory cannot help in predicting stock prices, they say, but it may help guide those who make the rules by which the game is played.
Chaos theory provides a more subtle way of thinking about the effects of global forces like the budget deficit and the balance of trade. Such effects, the researchers say, can interact in unexpected ways, with time lags that sometimes obscure their importance.
Traditional ways of looking at stock market data, from random walk theory to technical analysis, come into serious question in light of chaos, according to some economists.
Extreme events—unexpectedly fast and unexpectedly large—are a hallmark of chaotic systems. Physicists have learned to focus on the way tiny fluctuations are magnified, turning small bits of instability into large-scale booms and busts. Some scientists believe that the stormy oscillations of the last month reflect those tendencies in the financial markets.
“Somebody who’s worked on chaos is in no way surprised that this sort of thing happened,” said David Pines, a University of Illinois physicist. “It’s expected of such systems—they’re so sensitive to small perturbations.” Dr. Pines helped organize a meeting of leading economists and chaos theorists at the Santa Fe Institute in New Mexico to explore such possibilities in September, “pre-Black Monday,” he said.
When they speak of chaos, scientists mean erratic behavior that appears to be random but is not. The essence of their approach is a search for underlying patterns of a kind that have been discovered in a wide variety of seemingly random systems. Scientists studying chemical reactions, wildlife populations and electronic circuits have found that surprisingly simple systems can produce streams of data that rise and fall as erratically as the stock market, indicating that they may be governed by the rules of chaos.
But unlike any physical system, economics exists in a world with politics and history. It has the doubly entangled complexity that comes with human behavior: the same people who are trying to understand the stock market are quite capable of influencing the variables they seek to predict.
“Economic models are filled with agents that are trying to understand what other agents are doing, unlike physical models,” said William A. Brock, a University of Wisconsin economist. Weather forecasters, essentially unsuccessful at predicting their version of chaos, at least know that the laws of physics remain unchanged from day to day and that cyclones and anticyclones will not suddenly develop will and memory.
Still, some economists familiar with the details of modern trading technologies believe that chaos theory gives a telling look at forces of instability that tended to elude older models of economic behavior. Because traditional methods were geared to understanding such concepts as equilibrium, economists tended to focus on the forces of order. “Before nonlinear dynamics started capturing people’s imagination, we basically spent most of our time on evidence of stability,” Dr. Brock said.
Global Instability
There were moments, during the frenzied collapse of Oct. 19, when the stock market produced the financial equivalent of water flowing uphill.
Options to buy or sell stock—calls and puts—were seen moving in precisely the wrong direction, as the usual computer-controlled relationships broke down in the face of the rapid swings. And stock prices leaped across broad gaps, frustrating strategies meant to insure against less extraordinary declines.
The wildness offered a vivid example of how large-scale behavior emerges from the microscopic details of trading. As prices began to plummet, the market turmoil did not seem directly tied to the grand trends of budget deficits or the psychology of world events. Though analysts have tried, none have found a convincing trigger for the collapse in the Capitol or the Persian Gulf.
The real dynamics of the collapse were such contingencies as whether an over-the-counter market maker could or would answer his phone at a given instant; whether a sharp price change would breach a threshold and set off computers managing the intricately timed strategies of program trading and portfolio insurance; whether a simultaneous order to buy or sell the 500 individual stocks that compose a Standard & Poor’s index could be handled by the frantic specialists on the market floor.
Whether the mechanics of such events can be convincingly modeled by the techniques of chaos, or “nonlinear” dynamics, is far from clear. Physics and mathematics do not offer simple equations for human turmoil.
“It is true that people who have been studying the stock market with the tools of nonlinear dynamics have found, before this last episode, evidence of chaotic behavior,” said Kenneth J. Arrow, a Stanford University economist and Nobel laureate. “It’s tempting to jump from that to a statement about recent events, but I don’t think anything can explain a fall of 20 percent in one day.”
Nevertheless, some economists contend that chaos provides a natural way of seeing the crucial connections between the details of trading and the large-scale dynamics of the market.
The economy, like complex systems in nature, combines forces on the microscopic scale and the global scale. Just as the observable behavior of the earth’s atmosphere emerges from the interaction of a nearly infinite number of molecules, large-scale movements in the financial markets arise from millions of individual decisions to buy and sell.
In chaotic systems, surprises occur when the individual components are added together. In the stock market, one surprise that traders and market officials continue to debate fiercely has been the combined effect of powerful new techniques like program trading, the computer-managed arbitrage of index futures. Some chaos theorists believe that such trading is only a piece of a larger, possibly foreboding picture.
Program traders make money from small discrepancies that appear between the prices of index futures and the underlying stocks. One force that creates such discrepancies is portfolio insurance, a method of hedging against loss that led, on Black Monday, to a sudden and rapid selling of futures.
Index futures allow traders to bet on the mass movements of 100 or 500 stocks; when the price of the futures departs from the prices of the underlying stocks, it is possible to profit by selling one and buying the other. The strategy only works because computers can quickly calculate the small fluctuations and issue hundreds of simultaneous orders.
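In stripped-down form—our illustration, which ignores the carrying costs, dividends and transaction fees a real trading desk must include—the comparison driving those computer-issued orders looks like this:

```python
# Stylized index-arbitrage signal: compare the futures price to the
# index level plus a "fair basis" (here simplified to zero) and sell
# whichever leg is rich. Illustrative only; not an actual trading system.
def arbitrage_signal(futures_price, index_level, fair_basis=0.0):
    """Return which leg to sell when futures and stocks diverge."""
    basis = futures_price - (index_level + fair_basis)
    if basis > 0:
        return "sell futures, buy stocks"   # futures rich relative to stocks
    if basis < 0:
        return "buy futures, sell stocks"   # futures cheap relative to stocks
    return "no trade"

print(arbitrage_signal(252.0, 250.0))  # -> sell futures, buy stocks
```

The profit on any one discrepancy is small, which is why the strategy depends on computers issuing hundreds of simultaneous orders.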
The consequences are intensely disputed. Program traders, and many economists, argue that such practices make the market more efficient and therefore more stable, by keeping prices in line and preventing large anomalies from opening up. Others contend that the huge volume of buying and selling contributes to volatility.
Program trading was halted after Black Monday and is now resuming, though not yet on its previous scale. Some chaos-minded economists now argue that whether or not program trading survives, it is only a part of a deeper trend.
They speak in terms of information flow and the global coordination of market strategies. More than ever, they note, separate markets are working in concert: the stock, credit and currency markets; and the New York, London and Tokyo markets.
Inefficiency, they suggest, helped keep markets stable. “When no one knows what’s going on, or rather everyone has a different opinion, it’s like a bunch of atoms bouncing around in a bowl,” Dr. Ramsey said. “You get a whole continuum and prices tend to remain in a steady state. So when information comes in, things tend to move very slowly.”
When opinions become more harmonious, and reactions become similar, he said, market balance becomes more precarious. If the market turns down for some outside reason, he added, many participants react uniformly—a process of which program trading is a part.
“As we get to more and more uniformity of opinion and more and more uniformity of action, in the sense that people are all following the same rules and the same procedures and using the same data base,” Dr. Ramsey said, “then the volatility of the market increases dramatically.”
Jagged Edges
Speed of price changes is one issue. The character of those changes—smooth or jagged, continuous or discontinuous—is another.
A few economists believe that another lesson of chaos was confirmed in the minute-to-minute drama of the Oct. 19 collapse: that prices change discontinuously, in jagged steps, rather than the unbroken trajectories that economists traditionally visualize.
The difference is not just academic, as many traders and portfolio managers discovered. When prices leap from one level to another without passing through the intervening points, they can defeat a set of fail-safe strategies based on the traditional assumptions.
“Most traders do tend to think of these things as being continuous,” said James L. Kaplan, a Boston money manager with a background in the mathematics of chaos. “Moreover, in their minds, they basically try to fit lines or curves to everything. If you believe the chaotic model, then there’s no reason to think that that works.”
Many of the complex strategies for protecting portfolios against loss rely on the ability to buy or sell a stock or a stock option at a certain price. If prices leap discontinuously, those strategies can fail just when they are most needed. Such leaps were witnessed repeatedly on Oct. 19.
“I believe that the most important characteristic of prices is that they can and do change discontinuously fairly often,” said Benoit Mandelbrot of the International Business Machines Corporation, who created the branch of mathematics known as fractal geometry.
The assumption of smooth motion is tempting, by analogy with motion in the physical world. But unlike a falling rock, driven by gravity, a stock price is an abstraction driven by the minds of market participants, and minds can change instantaneously.
Specialists who control the setting of prices for individual stocks are obliged to maintain the appearance of continuity, even when it is little more than appearance. When a price leaps from one level to another, specialists often make low-volume trades at the intermediate levels. When the market collapses as quickly as last month, such practices become impossible.
“The principal characteristic of prices in a competitive market of the kind that Wall Street is—and even the ideal Wall Street would be the same thing—is that price is different from all the physical quantities that vary continuously,” Dr. Mandelbrot said.
“There is no bound to the amount that prices can jump, as we saw last month.”
Fickle Forces
The attention to market dynamics comes amid continuing confusion over the role of national policy and other fundamental forces in influencing the market’s behavior. The budget deficit, the dollar’s plunge, the inflation rate—such forces certainly affect the market’s dynamics, but their effects remain unpredictable. Economists repeatedly find themselves unable to distinguish good news from bad.
Unpredictable, however, does not necessarily mean random. Lessons learned in physics and mathematics over the past two decades suggest a new way of thinking about such forces. Their relationships cannot be graphed with neatly proportional straight lines that correspond to linear equations. They require nonlinear equations, which more accurately reflect the interplay of economic forces, but are much harder to solve.
And because nonlinear relationships produce surprising behavior, both their direction and their timing can run counter to ordinary intuition. One of the few near certainties that have emerged in the minds of market analysts and government budget negotiators in Washington is the harmful influence of the federal deficit. The puzzle, as many economists have noted, is that until last month the unprecedented growth of that same deficit ran hand in hand with an ebullient bull market.
Other fundamental elements of the economy are even more confusing. The falling value of the dollar overseas surely plays a key role, yet on any given day, analysts cannot agree on whether the stock market has rallied because of, or in spite of, a rally in the currency markets. Rising interest rates are known to mean falling stock prices—except when, as during the last month, falling stock prices mean falling interest rates.
These and dozens of other important quantities in the global economy are linked to one another like the ligaments, joints and tendons of some improbably entangled anatomy. A push in one place shows up as a nudge or a flutter in many others.
The relationships are mercurial, however, impossible to pin down precisely. That is one symptom of nonlinearity; another is that the relationships are often circular, letting different trends start a chain of events that eventually leads to a kind of feedback.
So falling interest rates discourage foreign investment in American credit markets; an outbound flow of foreign money weakens the dollar; a cheap dollar encourages foreign investment. Or, a cheap dollar encourages exports; increased exports improve the balance of trade; a stronger balance of trade strengthens the dollar.
Economic theorists—if not the practitioners who actually make forecasts and design trading strategies—have recognized the nonlinearity of such systems for some time. Nevertheless, the preferred models of economics have traditionally been linear, meaning that they tend naturally to seek a balanced equilibrium or follow regular, periodic cycles. Those model behaviors, many now feel, are misleading. Chaos theory offers a radically new way of working with systems of nonlinear equations and discovering the deeper patterns that often underlie them.
For economists, turning to the methods of chaos has required an uneasy shift in style. Dr. Ramsey, for example, deliberated carefully before making the shift, worrying at first that such unorthodox ideas might jeopardize his reputation. He now actively advocates an interdisciplinary approach.
Others, like Dr. Arrow, are cautious but intrigued. Dr. Arrow has long been a leading economic theorist, putting forward ideas that trickle down only slowly to actual market practitioners, and, along with Dr. Pines, he was an important force behind the meeting in Santa Fe.
The meeting brought together experimental physicists and computer modelers, who described their success with chaotic models of natural systems. Many economists who participated said that they had begun to see the limits of more orderly models based on old ideas about how systems come to equilibrium.
“They’ve really recognized that what they’re doing isn’t working, but they haven’t seen a better way to go,” Dr. Pines said. “What’s missing is the notion that things might really be chaotic, not orderly, and that these interactions can lead to genuine instability.”
A powerful tradition in stock-watching has treated price fluctuations as purely random—a “random walk,” as one school has it. The market has no memory, according to extreme forms of the random-walk view; its ups and downs on any given day bear no relationship to previous moves—or to real economic trends.
Not-So-Random Walk
Some of those who favor chaos assert that the scale of last month’s collapse was too vast to be explained as part of a random walk.
“The random-walk approach says there is no underlying dynamic,” said James A. Yorke, a mathematician who is acting director of the Institute for Physical Science and Technology at the University of Maryland. “I think the drop on Black Monday is inconsistent with the random-walk approach. If you look at the size of fluctuations on a typical day, there’s essentially no probability of a drop of 500 points.”
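Dr. Yorke’s point can be put in back-of-the-envelope form. If daily returns really followed a Gaussian random walk with, say, a 1 percent standard deviation (our illustrative figure, not his calculation), then Black Monday’s drop of roughly 22 percent would be a 22-sigma event:

```python
import math

# Lower-tail probability of a standard normal, via the complementary
# error function: P(X <= -z) = erfc(z / sqrt(2)) / 2.
def lower_tail(z):
    return 0.5 * math.erfc(z / math.sqrt(2.0))

p = lower_tail(22.0)  # chance of a one-day 22-sigma drop under the Gaussian
# p is below 1e-100: "essentially no probability," as Dr. Yorke put it.
```

A number that small is, for practical purposes, zero—which is why a Gaussian random walk cannot accommodate a 500-point day.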
Other chaos specialists contend that random-walk theory is equally helpless in explaining longer trends like the strong run-up in the stock market early this year. Such trends show that an effective model must have some memory of past behavior, they said. In any case, the chaos approach prefers not to fall back on unexplained randomness.
Chaos “poses some real dilemmas for economic theory,” said Dr. Arrow of Stanford. “It creates the idea that you have a mechanism which produces cycles, produces amplified fluctuations and doesn’t depend on accidents, the way random walk theory seems to do.”
Thus, real causes and effects do govern market behavior. But in nonlinear models, their workings are not always easy to discern. Population biologists and epidemiologists, for example, know that as an epidemic courses through a real population, the number of victims often fluctuates erratically. To gain insight into such behavior, they devise mathematical models meant to mimic real life. To their surprise, they have found that the simplest of models, as long as they are nonlinear, can produce wild fluctuations.
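The best-known of these simple models—and the one through which Dr. May made the point in his earlier work on populations—is the one-line logistic map. The parameter values below are the standard illustrative ones, not figures from the article:

```python
# The logistic map x -> r * x * (1 - x): a one-line nonlinear model of a
# population with limited resources. For small r the orbit settles to a
# steady value; past a threshold it fluctuates erratically forever.
def logistic_orbit(r, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

calm = logistic_orbit(2.8, 0.2, 100)  # settles toward 1 - 1/r, about 0.643
wild = logistic_orbit(3.9, 0.2, 100)  # never settles; swings across the unit interval
```

The same equation, with only the parameter changed, gives either the smooth equilibrium of the textbooks or the wild fluctuations seen in the field.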
One revealing approach is watch how such models behave when altered by a sudden jolt of some kind. Robert May, an influential Princeton University biologist, has described the surprising behavior of a simple model for the yearly progress of a disease. When the model is “perturbed” by a change responding to a program of inoculation, the level of disease does not coast downward smoothly, as traditional epidemiologists might expect.
Instead, the model embarks on a series of broad oscillations, swinging downward one year but up again the next. Only after several years have passed does a new, healthier equilibrium emerge. In real life, similar patterns have been observed.
A policy maker seeing an upward swing might be tempted to look for a specific cause of the disease’s resurgence, not realizing that such oscillations are a natural, predictable outcome of a nonlinear system. Similarly, chaos theorists suggest that the stock market’s behavior does reflect underlying economic forces—but with time lags and patterns of oscillation that remain inscrutable.
“In all nonlinear processes that I’ve ever looked at, there are always large excursions,” said Henry Abarbanel, a physicist who heads the Institute for Nonlinear Science at the University of California at San Diego. “They don’t represent major instabilities in the system; they simply represent the fact that you can concentrate energy or momentum or whatever is the appropriate dynamical variable. That’s what I think is happening.”
Such methods are most successful in helping to understand the general character of irregular behavior—the degree of volatility or the likelihood of oscillations. They can help in predicting how changes in a system’s rules might increase or decrease stability.
Whether they will ever help economists make more precise predictions remains in question. For now, many economists believe it will be hard enough for their science to become comfortable with extreme and extraordinary occurrences—events that used to be treated as exceptions. Psychologically, that remains difficult, as those studying chaos have discovered.
“The phenomenon of extreme events being viewed as being individual acts of God, as opposed to daily ordinary behavior,” Dr. Mandelbrot said, “is one that is very natural to the human race.”
The Right Way and the Wrong Way
When it comes to choosing the right way to look at stock market data, chaos theorists are nearly as contentious as other economists. When it comes to the wrong way, however, agreement is easier to come by. These are some common statistical tools and tricks that, by the lights of chaos, seem hopelessly naive:
Smoothing out the bumps. A variety of techniques, such as “moving averages,” are meant to eliminate fluctuations in a data series, in hopes that elements of randomness can be separated from the real information. To a scientist who has studied chaos, trying to make smooth curves out of jagged data is a mistake—the volatility is just as real as the average, and just as important in puzzling out the underlying dynamics.
Juxtaposing separate data. Analysts often try to line up a chart of the value of the yen, say, or the monthly trade deficit, with a chart of stock market prices over the same period, hoping to discern a visual similarity. Such artificial parallels are almost always illusions, chaos theorists say. The real connections between such factors are far more complex, usually involving variable time lags.
Forecasting by cycles. Although the human eye is extremely adept at picking regular cycles out of data series, those, too, are optical illusions. Real cycles in chaotic systems are almost never truly periodic. That is why, when analysts purport to see cycles of every 3 years, or every 20 years, or every 9½ months, the cycles are not likely to come around on schedule next time.
Forecasting by “chartism.” Chartists and other technical analysts go beyond cycles in search of more arcane patterns—the “head and shoulders” or the “three peaks followed by a house.” Although chaos theorists do try to find the patterns hidden in price data, the analysis is far more complicated. Mathematicians have firmly established that specific long-term predictions in a chaotic system cannot be made on the basis of past ups and downs.
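The first technique on the list is easy to state precisely: a simple moving average replaces each price with the mean of the last w prices, recovering a trend but discarding exactly the jagged detail that, by the argument above, may carry real dynamical information. A minimal sketch:

```python
# Simple moving average of window w: each output point is the mean of
# the most recent w inputs. Smooths the series, and erases its volatility.
def moving_average(prices, window):
    return [sum(prices[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(prices))]

print(moving_average([1, 2, 3, 4, 5], 3))  # -> [2.0, 3.0, 4.0]
```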
November 22, 1987
New Appreciation of the Complexity in a Flock of Birds
Nothing in the motion of a single bird or a single fish, no matter how graceful, can prepare a scientist for the sight of 10,000 starlings wheeling in formation over a cornfield, or a million minnows, threatened by a predator, snapping into a tight, polarized array.
Yet somehow the actions of individual animals sum together in ways that researchers are only beginning to understand, creating patterns of motion so complex that they seem to have been choreographed from above. Flocks and schools have a distinctive style of behavior—with a fluidity and a seeming intelligence that far transcends the abilities of their members.
Vast congregations of birds, for example, are capable of turning sharply and suddenly en masse, always avoiding collisions within the flock, and zoologists now believe that such movements take place without guidance from a leader. Fish, too—their vision limited in murky seas—manage complex, seemingly instantaneous maneuvers when alarmed by an intruder.
Flocking and schooling create some of life’s most breathtaking spectacles, and they have been among the most difficult to explain. Zoologists, studying bird and fish behavior with the help of miles of high-speed film, have often assumed some high level of coordination.
But now, gaining insight from new computer models, they see synchronized maneuvers as a surprising product of the actions of individual fish following individual rules for fleeing predators and staying clear of their neighbors. Thousands of simulated animals are programmed to fly or swim independently, and flock- and school-like behavior emerges on its own.
Despite recent progress, the science of analyzing the movements of flocks and schools is at an early stage in which scientists have identified many of these individual patterns but have yet to understand precisely how they emerge as movements of the group. Perhaps no conventional answer will ever emerge.
“The complexity is fascinating because there is so much going on, but it isn’t unstructured—it isn’t like dropping a thousand SuperBalls into a tank,” said Craig W. Reynolds of Symbolics Corporation, who has modeled flocking and schooling. “The synchronization speed is pretty astounding. And since birds aren’t mental giants, they can’t be doing deep thinking as they fly along. They must use fairly simple rules.”
Flocks and schools are more than just groups of animals clumped together, and not all species display flocking and schooling behavior. Those that do, often in response to hungry predators, are capable of high-speed motion, flight around obstacles and abrupt course changes.
Herding can be a similar phenomenon, but land-bound animals cannot match the flexibility of birds and fish liberated in the three-dimensional space of air and water. The seemingly effortless rapport of thousands of animals has driven otherwise sober scientists to talk of “thought transference” and “magnetic field perturbation.”
“All kinds of crazy things have been proposed,” said Frank Heppner, a University of Rhode Island zoologist. “I once put forward my own bit of insanity by proposing that there had to be some biological radio.”
Ornithologists have also traditionally believed that birds must be responding to cues from a leader—“because how else could you account for the near-simultaneous movement,” Dr. Heppner said. Now, however, he and other zoologists believe that the motion can be explained along the lines of computer models in which no individual is a leader, or, in another sense, every individual is.
“Until very, very recently, there’s been no conceptual model that permitted this,” he said. “You have a situation where you have some simple rules, but growing out of those simple rules, you have emergent properties that have nothing to do with the original rules.”
Behavior of Human Crowds
Wingless and finless creatures are also capable of coordinated motion with no leaders, as Dr. Heppner demonstrates with groups of hundreds of students. Asked to begin applauding en masse but in synchronization, they start with a random scattering of claps, yet manage to find a coordinated tempo almost instantly, usually within two beats.
Another kind of human behavior, the “wave” that rolls through masses of fans rising and sitting at a baseball stadium, may resemble the dynamics of a turning flock of birds. One zoologist, Wayne K. Potts, filmed the maneuvers of thousands of dunlin in Puget Sound, Washington, in 1983 and found that a turn propagates from one side of a flock to the other like a wave through a fluid.
The wave is fast. In the dunlin flocks, the turn spreads from bird to bird in about a 70th of a second—three times faster than a bird’s reaction time. If a bird waited until its neighbors turned, it would be much too late.
Dr. Potts saw this as a perplexing challenge, in part because he had experience flying military aircraft in formation. Even simple formations, he felt, had a strong tendency to break up because of a time lag that tends to be magnified from pilot to pilot.
“If the first does something abrupt, the second is already behind,” he said. “If anything like that was going on in a flock of a thousand birds, it would just be total chaos immediately. It’s still underappreciated how difficult a problem it is.”
“Chorus-Line Hypothesis”
Dr. Potts decided that birds must sense the approach of the wave from a distance and time their reactions accordingly. To test this idea—the “chorus-line hypothesis”—he provoked sudden maneuvers by shooting arrows near the edge of a flock. As expected, a turn began slowly with one or two birds and then accelerated rapidly through the rest.
Flocks and schools both execute far more complex maneuvers in the presence of a predator, and that is thought to be their evolutionary reason for being. Fish swimming in a more or less random way can instantly polarize—forming a dense school, aligned in parallel formation. And when a predator attacks, they can perform a fast expansion called the “fountain,” quickly splaying outward from the intruder.
Traditionally, though their problems seem increasingly similar, flock experts and school experts have not worked together. Predators act differently in water and air, and the problems of how individual creatures perceive one another tend to be quite different.
Where birds rely on vision, for example, fish have another sense organ, known as the lateral line—“a sense of distant touch,” as Julia Parrish of the University of California at Los Angeles put it. The lateral line allows them to respond sensitively to changes in water pressure caused by nearby motion.
The Problem with Starlings
Experimentally, those studying schools have had an advantage. Fish can be observed in laboratory tanks. “Whereas we’re kind of stuck with a field situation,” said Dr. Heppner, “because nobody has figured out how to get 10,000 starlings in a big cage and get them to do anything.”
Even school experts, though, face daunting levels of ignorance. Researchers need to find out how much switching and shifting places of individuals takes place in a smoothly swimming school, for example. They wonder how different types and sizes of fish sort themselves out within a school.
Technologically, such questions pose a challenge. Watching the overall shape of a school or flock is one thing; picking out the paths of individual members is quite another, only now coming within reach of computer tracking systems.
“I think there’s going to be a renaissance in studies of these groups,” said William M. Hamner, a zoologist at the University of California at Los Angeles. “One needs to model them from a series of individual behaviors. One wants to take those individual behaviors and construct a whole from them.”
Neither zoologists nor computer modelers know exactly what rules guide the motions of individual animals. In their models, they experiment with different possibilities. They do know that birds and fish must try to keep some minimal distance away from their neighbors, and they know that stragglers on the outskirts of a flock put themselves at risk if they stray.
Analyzing Films of Birds
Dr. Potts found from analyzing his films of dunlin that a left turn, say, starts not with the birds on the left side leading away, but rather with one or more birds turning inward from the right flank. That makes sense, he said, for individual birds fearing falcons and other raptors.
“The times when a bird is most vulnerable is when he’s away,” Dr. Potts said, “so the last thing he wants to do is turn outward.”
Dr. Reynolds used a computer model to mimic such maneuvers. Instead of directing the motion of the whole flock, he allowed each bird to fly at certain speeds, moving on the basis of information about a limited number of neighbors.
“You get this fluid-seeming motion,” he said. “The flock becomes a big amorphous thing that changes shape like a jellyfish, but there isn’t a central mind, there isn’t a flock mind.”
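Dr. Reynolds's actual program is not reproduced here; the sketch below is a generic version of the kind of local rules he describes, in which each simulated bird steers to match its flockmates' headings, to stay near the group, and to avoid crowding. Every parameter is invented for illustration. No bird is a leader, yet the headings converge:

```python
import math
import random

random.seed(0)

N, STEPS = 20, 200

# random starting positions and random unit-speed headings
pos = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(N)]
ang = [random.uniform(0, 2 * math.pi) for _ in range(N)]
vel = [[math.cos(a), math.sin(a)] for a in ang]

def polarization(vs):
    """Near 1.0 when every bird flies the same way, near 0 when headings are random."""
    sx = sum(v[0] for v in vs)
    sy = sum(v[1] for v in vs)
    return math.hypot(sx, sy) / len(vs)

for _ in range(STEPS):
    cx = sum(p[0] for p in pos) / N
    cy = sum(p[1] for p in pos) / N
    avx = sum(v[0] for v in vel) / N
    avy = sum(v[1] for v in vel) / N
    new_vel = []
    for i in range(N):
        x, y = pos[i]
        vx, vy = vel[i]
        # alignment: steer toward the flock's average heading
        vx += 0.2 * (avx - vx)
        vy += 0.2 * (avy - vy)
        # cohesion: steer gently toward the center of the flock
        vx += 0.005 * (cx - x)
        vy += 0.005 * (cy - y)
        # separation: steer away from any bird closer than 0.5
        for j in range(N):
            if i == j:
                continue
            dx, dy = x - pos[j][0], y - pos[j][1]
            d = math.hypot(dx, dy)
            if 0 < d < 0.5:
                vx += 0.05 * dx / d
                vy += 0.05 * dy / d
        # constant flight speed: renormalize so only the heading changes
        s = math.hypot(vx, vy)
        new_vel.append([vx / s, vy / s])
    vel = new_vel
    for i in range(N):
        pos[i][0] += 0.1 * vel[i][0]
        pos[i][1] += 0.1 * vel[i][1]

print(round(polarization(vel), 3))
```

Starting from scattered headings, the group ends up highly polarized, even though no line of the program refers to the flock as a whole.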
Patterns Are Deceptive
The human tendency to see the whole as a coherent, willful entity has been misleading, Dr. Hamner said. The coordinated motion of a school or flock does not imply purposeful coordination on the part of individuals, and the feeling of purpose may be deceptive.
Looking down at freeway traffic from atop a skyscraper, Dr. Hamner said, with cars smoothly weaving in and out, one has to fight the illusion that all the cars are cooperating. On a more intimate scale, traffic does not seem quite so well planned.
“Flocks form patterns and the patterns entrain our brain,” Dr. Hamner said. “We like patterns—we like patterns in waves, and we like patterns in a fire, and we see a flock of birds in the sky and we see a pattern in the overall movement. That’s the beauty of the whole system, but it’s also the thing that screws up human investigators.”
November 24, 1987
Indestructible Wave May Hold Key to Superconductors
Scientists struggling with the year’s foremost problem in theoretical physics—how to explain the new high-temperature superconductivity—are focusing more and more on a peculiarly indestructible kind of wave known as a soliton.
Unlike ordinary waves, which tend to spread out and fade as they travel through a substance, solitons are coherent packets or pulses that retain their shape over long distances. Some physicists now believe that such pulses may help explain the ability of the new materials to carry electricity with no loss to resistance.
Whether these theorists, who include an unusual assortment of Nobel laureates, will succeed in explaining superconductivity with solitons is far from clear. “It’s a lot of foment, it’s really exciting, but it’s by no means resolved,” said David Campbell, director of the Center for Nonlinear Studies at Los Alamos National Laboratory.
But in the meantime, the soliton—part wave, part lump, part wrinkle in the fabric of matter and energy—is becoming a pervasive concern of modern science. In recent years, scientists have begun to recognize solitons in nature, taking shapes that range from giant internal waves in the oceans to strange, rolling cloud formations. And solitons in laser light promise to sharpen long-distance communication over optical fibers.
A soliton can be a true, single wave—the name’s origin is “solitary wave”—or it can take the shape of some other stable, coherent structure in a complex system. “We’ve gradually realized that these things are far more common than we expected,” said David Pines of the University of Illinois. “Under suitable circumstances, they can be present in almost all systems we know.”
A vortex whirling in a draining tub of water behaves like a soliton. The red spot of Jupiter, a permanent giant eddy that sits amid the turbulence of the planet, now appears to be a soliton on a vast scale.
In the new superconductors, some theorists believe that solitons replace the ordinary waves of electronic energy that create electric current. Such solitons are particles that interact in a special way, colliding and passing through one another without losing their integrity.
“It brings us into a new conceptual domain of what a particle is,” said Robert Schrieffer of the University of California at Santa Barbara, who shared the Nobel Prize for creating the 30-year-old theory of superconductivity now being superseded. “Usually you think of starting out with bare particles in a vacuum and putting them into a medium. The soliton is a wrinkle of the medium itself. It’s self-focusing, and it just doesn’t dissipate.”
Close to Perpetual Motion
Physicists puzzling over superconductivity have been trying to grasp the workings of a highly organized motion of electrons, flowing with perfect efficiency through a crystalline arrangement of molecules. It is the closest thing in nature to perpetual motion: an electrical current in a loop of a superconducting material will flow forever.
Experiment has run far ahead of theory in recent months. Scientists have discovered a new class of superconductors without understanding them. A successful theory will be necessary, however, for researchers hoping to find still better superconducting materials and to put them to practical use. Several alternatives, similar but incompatible, are now being put forward by competing theorists, including Dr. Schrieffer, Philip W. Anderson of Princeton University and T. D. Lee of Columbia University, each convinced that he is heading toward a solution.
The old theory was a triumph of solid-state physics known as the BCS model, after its creators, John Bardeen, Leon Cooper and Dr. Schrieffer. They found a special kind of bonding that lets electrons travel in coordinated pairs, eluding the obstacles that ordinarily scatter them as they travel through a molecular lattice.
Except at extremely low temperatures, however, the energy of thermal vibrations tends to break up the pairs, leaving the electrons subject to ordinary electrical resistance. Theorists trying to explain the high-temperature superconductors have been looking for a new kind of bonding.
“One of the things which has been confusing people is that they’ve just been expecting to see the same old stuff,” Dr. Anderson said. “You’re used to BCS, you’ve been brought up on BCS; in many cases, it’s the biggest stretch of your imagination that you’ve ever had to undergo.”
Physicists work out the invisible details of electron flow indirectly, by gathering magnetic and electrical data and assembling them into calculations. They find that electrons themselves, as quantum mechanical objects, cannot be treated as points in space; their calculations only work when each electron is treated as a bundle of probabilities distributed over some distance.
An especially odd quality of the electron pairs is that their members, though coordinated, remain relatively far apart in space. On average, between any two paired electrons, there are as many as a million others.
Marvin Cohen, a theorist at the University of California at Berkeley, compares the motion of such electrons to dancers, each bound to a partner far off in a crowded ballroom. “From the outside, it looks completely chaotic, but in fact it’s completely organized,” he said. “If your partner bumps into a pole and bounces, then, even though you don’t have a pole, you have to bounce in the opposite direction. So there’s a stability to the whole system.”
The soliton models offer a different kind of coordination, more closely localized in real space. The solitons, whatever their precise electronic form—“spin bags,” in one model, “holons and spinons” in another—represent tight, particle-like clusters.
Nonlinear Systems
Understanding why solitons arise in nature has been a challenge not only for physics but also for mathematics. They are leading examples of the kind of orderly structures that can arise only in nonlinear systems, systems that cannot be expressed in terms of simple, proportional relationships and are therefore notoriously hard to unravel.
When the influences on a wave are linear, as they are for light traveling through a vacuum, for example, solitons cannot occur. The waves tend to spread out. When a wave travels through a substance that responds nonlinearly, however, feedback can occur that organizes the wave and keeps it coherent.
For mathematicians, treating such phenomena as sets of equations, solitons are a kind of miracle. Ordinarily, nonlinear equations are especially hard to solve; yet in the case of solitons, a solution inexplicably appears.
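The article gives no equations, but the canonical example of such an unexpectedly solvable nonlinear equation is the Korteweg-de Vries equation, u_t + 6*u*u_x + u_xxx = 0, whose sech-squared pulse travels at speed c without changing form. The choice of equation is mine, not the article's; a finite-difference spot check confirms the pulse really solves it:

```python
import math

def u(x, t, c=1.0):
    # one-soliton solution of KdV, u_t + 6*u*u_x + u_xxx = 0:
    # a sech^2 pulse of amplitude c/2 moving right at speed c
    s = 0.5 * math.sqrt(c) * (x - c * t)
    return 0.5 * c / math.cosh(s) ** 2

h = 1e-3  # finite-difference step

def residual(x, t):
    """u_t + 6*u*u_x + u_xxx at (x, t); zero for an exact solution."""
    ut = (u(x, t + h) - u(x, t - h)) / (2 * h)
    ux = (u(x + h, t) - u(x - h, t)) / (2 * h)
    uxxx = (u(x + 2 * h, t) - 2 * u(x + h, t)
            + 2 * u(x - h, t) - u(x - 2 * h, t)) / (2 * h ** 3)
    return ut + 6 * u(x, t) * ux + uxxx

# the residual vanishes (up to finite-difference error) everywhere
for x in (-1.5, 0.0, 0.7, 2.0):
    print(abs(residual(x, 0.5)))
```

Because the dispersive third-derivative term and the steepening nonlinear term exactly balance, the pulse neither spreads nor breaks, which is the "miracle" the mathematicians describe.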
Mathematicians are far from understanding when and why the usual complexities of nonlinearity collapse into such neat solutions. Physicists, though, have begun applying the idea so broadly that Dr. Schrieffer now warns against seeing solitons everywhere. “Everyone likes the word because it puts a little holy water on their ideas,” he said.
A leading example of putting solitons to practical engineering use is in the laser light that carries telephone signals across long-distance networks of optical fibers. Ordinarily, these fibers behave as a linear medium. Over distances of many miles, the pulses degrade.
The standard solution is to install repeaters at regular intervals to reshape the signal. Researchers at A.T.&T. Bell Laboratories have found recently that, by increasing the intensity of the pulses enough, they can make the system nonlinear and produce solitons that retain their shape far longer.
“The Logjam of Data”
Physicists are far from gaining such precise control over the mechanisms of superconductivity. They are trying to match competing theories to the vast quantities of experimental data now pouring in from laboratories around the world, “trying to get the logjam of data to break up,” as Dr. Anderson put it.
Solitons have an essential topological character, like a knot or a twist in a ribbon that can move from place to place but cannot be eliminated without untying the ribbon. Ian Stewart, a mathematician at the University of Warwick in England, offers the example of an air bubble sliding under wet wallpaper, its shape maintained by the tension between forces of the air’s pressure and the paper’s elasticity.
Those studying superconductivity have been encouraged to think in topological terms by the peculiar structure of the newly found materials. In ordinary metals, electrons move freely in three dimensions. Experimentalists have discovered that the new superconductors have a complex crystal structure that reduces the freedom of motion to two dimensions or to one, along sheets and chains of copper atoms. In several theories, this reduced dimensionality is crucial to the formation of coherent motion leading to superconductivity.
The test of such theories, in the end, will lie in the data from experiments. But until a good theory exists, the data will remain hard to sort out and interpret, a classic scientific bind.
Dr. Anderson, for example, said that his own theory had succeeded in predicting several otherwise puzzling new measurements. But he acknowledged that other data do not fit so well—data that he believes will turn out to be irrelevant or wrong.
“Any of us who’s any good at all is going to discard some of those experiments right off the bat,” he said. He is confident that the remaining data will confirm his proposal.
On the other hand, so is Dr. Schrieffer. “Mother Nature will tell us who’s right,” he said.
December 15, 1987
The Quest for True Randomness Finally Appears Successful
One of the strangest quests of modern computer science seems to be reaching its goal: mathematicians believe they have found a process for making perfectly random strings of numbers.
Sequences of truly patternless, truly unpredictable digits have become a perversely valuable commodity, in demand for a wide variety of applications in science and industry. Randomness is a tool for insuring fairness in statistical studies or jury selection, for designing safe cryptographic schemes and for helping scientists simulate complex behavior.
Yet random numbers—as unbiased and disorganized as the result of millions of imaginary coin tosses—have long proved extremely hard to make, either with electronic computers or mechanical devices. Consumers of randomness have had to settle for numbers that fall short, always hiding some subtle pattern.
How to use randomness, how to create it and how to recognize the real thing have become challenging questions in the computer era, touching many distant areas of science and philosophy. The randomness business is riddled with pitfalls; creeping non-randomness has undercut the expectations of many consumers, from state lotteries and tournament bridge players to drug manufacturers and court systems.
Random number generators are sold for every kind of computer. Every generator now in use has some kind of flaw, though often the flaws can be hard to detect. Furthermore, in a way, the idea of using a predictable electronic machine to create true randomness is nonsense. No string of numbers is really random if it can be produced by a simple computer process. But in a more practical sense, a string is random if there is no way to distinguish it from a string of coin flips.
“It’s almost a contradiction,” said Harvey Friedman, a logician at Ohio State University. “Superficially it’s paradoxical, since you can just rerun the generator and get the same number.”
Such paradoxes drew statisticians, computer scientists, probability theorists, physicists, philosophers and psychologists to an interdisciplinary conference on randomness organized by Dr. Friedman last week in Columbus.
“There’s a whole philosophical kind of rethinking going on,” Dr. Friedman said. “There are serious defects in just about everybody’s way of looking at this.”
Several theorists presented details of the apparent breakthrough in random-number generation, a technique that has emerged from a fast-growing area known as computational complexity, devoted to understanding the difficulty of solving different kinds of problems. The technique will now be subjected to batteries of statistical tests, meant to see whether it performs as well as the theorists believe it will. If it passes, computer scientists say it will be easy to put into use, either with software or with specially designed computer chips.
“The idea is very powerful, and it’s clear that it would have a lot of unusual applications,” said Leonid Levin, a Boston University mathematician.
The way people perceive randomness in the world around them differs sharply from the way mathematicians understand it and test for it. Asked to write down a “random” sequence of 100 zeroes and ones, for example, almost everyone chooses a sequence that alternates far too regularly. Although it seems paradoxical, if the zeroes and ones are too evenly mixed, they are not random.
A real sequence of 100 coin flips is likely to contain several strings of five or six consecutive heads or tails. Psychologists find that they have little trouble distinguishing real sequences from sequences that people write down: the sequences created by people taking psychology tests rarely contain streaks longer than four.
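That signature, long streaks in genuine randomness, is easy to confirm by simulation (the sample sizes below are arbitrary):

```python
import random

random.seed(7)

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = run = 1
    for a, b in zip(flips, flips[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

# simulate many sequences of 100 fair coin flips
trials = 2000
runs = [longest_run([random.randint(0, 1) for _ in range(100)])
        for _ in range(trials)]

# the typical longest streak is around six or seven -- well past the
# four-in-a-row ceiling people impose when inventing "random" sequences
print(sum(runs) / trials)
```

A sequence of 100 flips whose longest streak is only four is itself evidence of tampering.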
The need for randomness in human institutions seems to begin at whatever age “eeny meeny miny moe” becomes a practical decision-making procedure: randomness is meant to insure fairness. Like “eeny meeny miny moe,” most such procedures prove far from random. Even the most carefully designed mechanical randomness-makers, from casino roulette wheels to the air-blown Ping-Pong balls used by state lotteries, break down under scrutiny.
One such failure, on a dramatic scale, struck the national draft lottery in 1969, its first year. Military officials wrote all the possible birthdays on 366 pieces of paper and put them into 366 capsules. Then they poured the January capsules into a box and mixed them. Then they added the February capsules and mixed again—and so on.
At a public ceremony, the capsules were drawn from the box by hand. Only later did statisticians establish that the procedure had been far from random; people born toward the end of the year had a far greater chance of being drafted than people born in the early months.
In general, the problem of mixing or stirring or shuffling things to insure randomness is more complicated than most experts assume.
“Eventually things do get well mixed,” said Persi Diaconis, a Harvard University statistician who has tested many mechanical randomizers and developed considerable mathematical theory for such problems. “The problem is, eventually we’re all dead.”
How long to mix can be a serious practical problem. Manufacturers mixing pill contents or foods want to avoid overmixing, because any such process takes time and because substances can break down chemically under too much stirring.
Drawbacks of Shuffling
Shuffling cards is a comparable problem. Dr. Diaconis has explored many methods and found the mathematics to be far from trivial. Using the standard riffle shuffle, for example, about seven shuffles are needed to get a good mix.
Oddly, if a riffle shuffle is carried out perfectly—the deck divided in half, then put back together with alternate cards from each half—it does not mix the structure at all. The order changes, but the structure remains. After eight perfect shuffles, the deck comes back exactly to its starting order.
A good mathematical description of real shuffling has to build in some randomness. It assumes that the deck is divided only approximately in half and then put back together with cards falling randomly from the left or right. It finds that, while seven shuffles are enough, six are not.
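Both claims can be checked by machine. A perfect "out" shuffle of 52 cards restores the deck after exactly eight repetitions, and the sloppy-cut, random-drop model of real shuffling described above is usually formalized as the Gilbert-Shannon-Reeds scheme (a standard model, not one named in the article):

```python
import random

random.seed(3)

def perfect_out_shuffle(deck):
    """Split exactly in half, then interleave, top card staying on top."""
    half = len(deck) // 2
    out = []
    for a, b in zip(deck[:half], deck[half:]):
        out.extend([a, b])
    return out

def gsr_riffle(deck):
    """Gilbert-Shannon-Reeds model: cut about in half (binomially), then
    drop cards from each packet with probability proportional to its size."""
    cut = sum(random.randint(0, 1) for _ in range(len(deck)))
    left, right = deck[:cut], deck[cut:]
    out = []
    while left or right:
        if random.random() < len(left) / (len(left) + len(right)):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return out

deck = list(range(52))

# eight perfect out-shuffles restore the original order exactly
d = deck
for _ in range(8):
    d = perfect_out_shuffle(d)
print(d == deck)
```

The perfect shuffle's failure to mix is a permutation-theory fact: it sends the card at position i to position 2i mod 51, and 2 to the eighth power is 1 mod 51, so eight rounds undo everything.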
So Dr. Diaconis contends that millions of card-players are habitually under-shuffling. “Humans are really a lazy species, and people don’t shuffle cards seven times,” he said. “They shuffle three or four.” The consequences have been, by game-playing standards, dire. When computerized dealing replaced real dealing in contract bridge tournaments, for example, experts noticed a difference: hands with outlandish distributions of cards in different suits seemed more common, and players suspected computer trouble.
In fact, the intuitions of bridge players had been shaped by generations of improperly shuffled cards. Playing a hand of bridge and gathering up the tricks tends to organize the deck into packets of four cards, many of them all the same suit. When the deck is reassembled, insufficiently mixed, and dealt again, the packets are divided evenly among the players, and the result is a distribution of suits that proves too normal—too many hands of 4-3-3-3 or 4-4-3-2.
The computers that replaced human dealers use random-number generators, or, more precisely, pseudorandom-number generators. A number is pseudorandom if it is produced by a deterministic process, so that the same starting point will produce the same result.
Uses in Science
State lotteries, like the draft lottery two decades ago, have held out against electronic versions, mainly for psychological reasons. People prefer to trust a drawing that they can see. But for most other purposes, pseudorandom-number generators are essential.
Physical scientists use them when they need to simulate some behavior that is too complicated to calculate directly. Social scientists and medical researchers often use them to randomize surveys and drug trials, in hopes of eliminating biases. Such biases can show up in insidious ways—even the process of reaching into a box and pulling out laboratory mice can lead to a non-random assortment that will prejudice medical research.
“There’s a lot of need for pseudorandom numbers, and the point is can you do it well and can you do it fast,” said Silvio Micali, a Massachusetts Institute of Technology computer scientist.
Dr. Micali is one of a loose group of mathematicians—also including Andrew Yao, Manuel Blum, Lenore Blum, Shafi Goldwasser and Michael Shub—who have developed the powerful new approach to the random number problem. Their method begins with mathematical processes that are easy to compute, but hard to reverse, known as one-way functions.
A classic example, and the example used in a particularly promising new pseudorandom-number generator, is the multiplication of two large prime numbers. The multiplication is easy; reversing the process is very hard. Given a large product of two primes, no one knows a fast way of factoring the number and finding the primes.
The new technique involves taking some starting number, multiplying it by itself, dividing by the product of two primes, taking the remainder and using it to repeat the process over and over again. The mathematicians have proved that there is a peculiar connection between the randomness of the outcome and the difficulty of factoring large numbers. Specifically, they have proved that if factoring large numbers is truly hard—as most believe likely—the resulting sequence will be indistinguishable from true randomness.
Conversely, if some structure or pattern could be found in the sequence, that pattern would be, in effect, the long-sought solution to the factoring problem.
“It’s very intuitive that randomness is the extreme case of hardness of computation,” Dr. Micali said. “It lies at the heart of many sciences and many philosophical views of the universe. This makes that precise.”
An expert in exposing the flaws in pseudorandom-number generators, George Marsaglia of Florida State University, has begun to test the new technique, along with several others, including one put forward by Stephen Wolfram of the Center for Complex Systems Research.
Dr. Marsaglia judges sequences not just by uniformity—a good distribution of numbers in a sequence—but also by “independence.” No number or string of numbers should change the probability of the number or numbers that follow, any more than flipping a coin and getting 10 straight tails changes the likelihood of getting heads on the 11th flip.
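Those two criteria can be sketched as quick numerical checks: a chi-square-style tally that each digit appears about equally often, and a lag-one correlation showing that one draw says nothing about the next. The thresholds below are loose and purely illustrative; Dr. Marsaglia's actual test batteries are far more demanding.

```python
import random

random.seed(42)

draws = [random.randint(0, 9) for _ in range(10000)]

# uniformity: each digit should appear roughly 1,000 times;
# chi2 measures how far the counts stray from that ideal
counts = [draws.count(d) for d in range(10)]
chi2 = sum((c - 1000) ** 2 / 1000 for c in counts)

# independence: the correlation between each draw and the next
# should be near zero for a good generator
mean = sum(draws) / len(draws)
num = sum((a - mean) * (b - mean) for a, b in zip(draws, draws[1:]))
den = sum((x - mean) ** 2 for x in draws)
lag1 = num / den

print(round(chi2, 2), round(lag1, 4))
```

A generator can pass both of these simple screens and still fail subtler ones, which is exactly why flawed generators stay in use for years.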
He believes there will be a ready market for any improved pseudorandom-number generator. Tables of random digits have long been available; Dr. Marsaglia is planning to adapt that idea by producing a computer-era equivalent. His dream is a compact disk holding nothing but numbers—random sequences by the billions.
April 19, 1988
Coin-Tossing Computers Found to Show Subtle Bias
When scientists use computers to try to predict complex trends and events, they often apply a type of calculation that requires long series of random numbers. But instructing a computer to produce acceptably random strings of digits is proving maddeningly difficult.
In deciding which team kicks off a football game, the toss of a real coin is random enough to satisfy all concerned. But the cost of even a slightly nonrandom string of electronic coin tosses can be devastating to both practical problem-solving and pure theory, and a new investigation has revealed that non-random computer tosses are much more common than many scientists had assumed.
Mathematical “models” designed to predict stock prices, atmospheric warming, airplane skin friction, chemical reactions, epidemics, population growth, the outcome of battles, the locations of oil deposits and hundreds of other complex matters increasingly depend on a statistical technique called Monte Carlo Simulation, which in turn depends on reliable and inexhaustible sources of random numbers.
Monte Carlo Simulation, named for Monaco’s famous gambling casino, can help to represent very complex interactions in physics, chemistry, engineering, economics and environmental dynamics mathematically. Mathematicians call such a representation a “model,” and if a model is accurate enough, it produces the same responses to manipulations that the real thing would. But Monte Carlo modeling contains a dangerous flaw: if the supposedly random numbers that must be pumped into a simulation actually form some subtle, non-random pattern, the entire simulation (and its predictions) may be wrong.
The danger was highlighted in a recent report from Dr. Alan M. Ferrenberg and Dr. D. P. Landau of the University of Georgia at Athens and Dr. Y. Joanna Wong of the I.B.M. Corporation’s Supercomputing Center at Kingston, N.Y. In a Dec. 7 article in the journal Physical Review Letters, the scientists showed that five of the most popular computer programs for generating streams of random numbers produced errors when they were used in a simple mathematical model of the behavior of atoms in a magnetic crystal.
The reason for the errors, the scientists found, was that the numbers produced by all five programs were not random at all, despite the fact that they passed several statistical tests for randomness. Beneath their apparent randomness, the sequences actually concealed correlations and patterns, revealed only when the subtle non-randomness skewed the known properties of the crystal model.
All five of these systems for producing random numbers have long been used by scientists and statisticians with generally satisfactory results, Dr. Ferrenberg said. Major flaws turned up only recently when Dr. Ferrenberg and his colleagues were testing an ultra-powerful network of computers operating in parallel. Unlike ordinary sequential computer operations that execute programs one step at a time, parallel computing systems break up tasks into separate parts, which can be attacked simultaneously, with enormous savings in time. Parallel computers are so much faster and more powerful than the conventional kind that subtle problems in the programs they are executing come to light relatively quickly.
“We knew what the results of this test should be, and when we got errors for each simulation we tried, I thought there was something wrong with my program,” Dr. Ferrenberg said. “But Dr. Wong suggested that the problem was with the random-number generators we were using, and she was right—a reminder that no known random-number system is foolproof for all applications.”
Many relationships and interactions in science and engineering can be described by relatively simple mathematical equations that have definite solutions. But other systems are so complex that the equations describing their behavior, often containing many variables, cannot be solved by ordinary mathematical manipulations.
Disorder Hides Order
During the 1940s, Dr. Stanislaw M. Ulam, a mathematician recruited by Los Alamos National Laboratory to work on the atomic and hydrogen bombs, faced problems in the physics of nuclear explosions that proved too complicated to solve by conventional means. He attacked the problem by inventing Monte Carlo Simulation, in which, by filling in the blanks in his equations with successions of random numbers, he obtained probabilistic approximations of the answers the project needed.
This statistical system created an immediate demand for random numbers used in Monte Carlo equations, and the Rand Corporation of Santa Monica, Calif., published an entire book containing nothing but random numbers. But in today’s complex simulations, a program typically devours as many as 100 million random digits (binary zeros and ones) per second, Dr. Ferrenberg said, and to satisfy this prodigious appetite, a separate computer program must spit out digits at lightning speed. The need for speed means that any useful random number generator must contain only a few steps and be easy for a computer to execute.
Dr. Ron Graham, a mathematician at A.T.& T. Bell Laboratories in Murray Hill, N.J., explained that typical random-number generators may add a long series of arbitrary digits together, divide the result by some large number, and use the remainder as the basis for computing the next digit in the sequence.
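The recipe Dr. Graham describes, multiplying the previous value, adding a constant, dividing by a large number and keeping the remainder, is essentially what is known as a linear congruential generator. Here is a minimal sketch; the constants are illustrative ones used by many old C libraries, not a choice endorsed by anyone quoted in the article.

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """A minimal linear congruential generator: each value is
    (a * previous + c) mod m. The constants are the ones used by
    many old C-library rand() implementations, shown only as an
    example; they are not a recommendation."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state

gen = lcg(seed=1)
print([next(gen) for _ in range(3)])
```

The generator is fast precisely because it is only a multiply, an add and a remainder, which is also why its output can hide exactly the kind of arithmetic regularities the Georgia team stumbled over.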
But neither this nor any other machine-based system can produce truly random numbers. If computer-generated number sequences are long enough, repeating patterns inevitably emerge, even though they may remain hidden from their human creators.
Such patterns are sometimes discovered by accident. One system was believed to produce very good “pseudorandom” numbers, until someone plotted the numbers it produced on a three-dimensional graph. When that was done, the numbers proved to lie on a small number of planes rather than being uniformly distributed; in other words, the sequences were anything but random.
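The article does not name the offending system, but the classic case fitting this description is IBM’s RANDU generator from the 1960s, whose consecutive triples all lie on just 15 planes in the unit cube. The hidden pattern can be verified directly, because every RANDU triple satisfies a simple linear relation:

```python
def randu(seed):
    """IBM's infamous RANDU generator: x[n+1] = 65539 * x[n] mod 2**31.
    Because 65539 = 2**16 + 3, every triple obeys
    x[n+2] = 6 * x[n+1] - 9 * x[n] (mod 2**31), which confines the
    triples to 15 planes instead of filling the cube."""
    state = seed
    while True:
        state = (65539 * state) % 2**31
        yield state

gen = randu(seed=1)
xs = [next(gen) for _ in range(1000)]
# Check the hidden linear relation on every consecutive triple.
for a, b, c in zip(xs, xs[1:], xs[2:]):
    assert (c - 6 * b + 9 * a) % 2**31 == 0
print("every RANDU triple satisfies the plane equation")
```

The output passes simple frequency tests for randomness, yet the relation above holds for every single triple, which is the kind of concealed correlation the Georgia experiment exposed.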
Dr. James Reeds, a computer expert at Bell, said the flaws exposed by Dr. Ferrenberg had caused problems for many users of Monte Carlo Simulations, including himself. “We all must take to heart the advice by the great John von Neumann, who wrote in 1951 that anyone who believed a computer could produce truly random sequences of numbers was living in a state of sin.”
Catch-22 of Randomness
Random numbers are used not only by scientists but in primitive ciphers. Modern cryptographic systems, however, encipher secret messages between government officials and banks using techniques that do not depend on random numbers alone. Most modern high-level cipher systems are based on the products of two very large prime numbers, products that are enormously difficult to factor into the original primes.
Advanced cryptographic systems, besides hiding the content of secret messages, can produce sequences of random numbers that meet tests for randomness much more successfully than sequences from ordinary random number generators. But they are far too slow for use in ordinary mathematical modeling.
There have been other ideas for improving the randomness of number sequences. Theorists have proposed that true random numbers rather than pseudorandom ones could be produced by machines responding to the radioactive decay of atoms. The instant at which a radioactive atom breaks up is completely unpredictable, even in theory, because it obeys the statistical rules of quantum mechanics, rather than any kind of causal determinism.
But Dr. Persi Diaconis, a Harvard University mathematician famous for his deep analyses of card shuffling and other approaches to randomness, believes that even atomic decay could pose problems. “Suppose such a decay were used to actuate a rotary switch, which would pick a number for the sequence,” he said. “The mechanical switch could have some tiny imperfection that would bias the numbers in some way, so you would be back where you started.”
Another problem that would arise with any system producing genuinely random numbers, Dr. Diaconis said, is that no truly random sequence could be reproduced by the process that created it. In many cases, scientists need to be able to reproduce exact random sequences used in simulations, so that their work can be checked by others.
“We face a basic dilemma,” Dr. Ferrenberg said. “We can no longer trust various random number systems that we used to think were pretty good. Now we must test every pseudorandom number generator we use, to see if it works for the particular application we have in mind. But to test it, we must know what the right answer should be, and that’s what we’re trying to find in the first place.
“I’m getting a lot of gray hair.”
January 12, 1993
Science Squints at a Future Fogged by Chaotic Uncertainty
Back during China’s Shang dynasty, around 3,500 years ago, sages foretold the future by casting oracle bones—the clairvoyant equivalent of crap shooting.
Things have changed somewhat since then, even though predictions are as much in demand as ever, and even though we still send plenty of business to astrologers, tarot-card readers, numerologists, phrenologists, necromancers and psychics of all stripes. Aside from the traditional soothsayers, we sometimes also heed forecasts based on observations, scientific synthesis and reasoning.
Scientific sages depend not on tea leaves or the positions of planets but on tools like mathematical modeling, statistical analysis, complexity theory, celestial mechanics, geology, economics and epidemiology. Needless to say, scientists discount the naive paradigms of fortune tellers, ancient and modern.
Alas, rational approaches to prediction also fail all too often, and it may be that there are some phenomena for which predictions will remain forever out of reach.
As the stock market lurches between the tugs of bulls and bears, financial analysts armed with statistics and the latest marketing theories throw up their hands in despair when asked to guess what will happen from one day to the next.
Picnic planners know better than to rely on short-term weather forecasts, and meteorologists offer little hope that truly accurate weather predictions for specific places and times will ever be possible.
Relief agencies get little useful help from experts as they try to brace for disasters that might, with equal likelihood, occur tomorrow or a thousand years from now. Understanding the mechanisms of events like earthquakes and volcanoes offers little help in predicting precisely when they will occur.
Geologists have a pretty good idea that earthquakes are caused by the movement of tectonic plates, yet the art of earthquake forecasting remains notoriously imprecise.
The largest earthquake in four years—an 8.2-magnitude tremor centered under the ocean between Australia and Antarctica—caught seismologists completely off guard when it shook the sea floor last March 25, and geologists are still puzzled by the quake; it did not occur at the junction of tectonic plates where most quakes occur, but struck within a single plate.
Future seismic surprises, including those causing catastrophic loss of life, seem inevitable.
The tsunami that roared over Papua New Guinea on July 17, killing more than 2,000 people, caught everyone by surprise. Experts excused their failure to warn the population on the ground that a giant tsunami wave materializes only when a scarcely noticeable wave spawned by an earthquake or some other event far out at sea reaches shallow coastal water. By then it is almost on top of some hapless coastline, leaving no time to prepare.
There have been recent reminders that even in principle, some things are impossible to predict.
Among the puzzles that have perplexed mathematicians and physicists at least since the time of Isaac Newton in the 17th century is the “N-body problem” (sometimes called the “Many-body problem”).
One of Newton’s monumental discoveries was that any two objects attract each other with a force proportional to their masses and inversely proportional to the square of their distance apart. But when three or more objects—the sun, the earth and the moon, for example—are interacting gravitationally, exact solutions of the equations describing their motions generally remain beyond reach.
Fortunately, since Newton’s day, some very good approximate solutions of N-body equations have been devised, and such solutions have allowed space scientists to send vehicles to the distant reaches of the solar system with astounding precision.
This month, Dr. Gregory R. Buck of Saint Anselm College, in Manchester, N.H., disclosed a new class of approximate N-body solutions based on the analogy of a closed loop of beads, in which the beads, evenly spaced, chase each other around the curves and tangles of the loop. The system may help to work out the interactions of particles within a plasma, Dr. Buck suggested.
But none of the approximations now known exactly solve Newton’s equations, and this means that the motions of asteroids and comets can be predicted only up to a certain point in the distant future. Beyond that, forecasting an impact by one of these objects on the earth may be intrinsically impossible. A hit or miss may depend so sensitively on the minuscule “initial conditions” of all the objects involved that precise calculation becomes impossible. This is a property of a large class of systems scientists describe as chaotic.
Long-term foreknowledge of the hit or miss—a question of life or death for millions—is ruled out by the chaos inherent in the N-body system.
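What “depends so sensitively on the minuscule initial conditions” means can be shown with a toy example. The logistic map below is a standard textbook demonstration of chaos (my illustration, not anything discussed in the article): two starting points that differ by one part in a billion soon produce orbits that bear no resemblance to each other.

```python
def logistic_orbit(x0, r=4.0, steps=40):
    """Iterate the logistic map x -> r * x * (1 - x), a standard
    toy model of chaos, and return the whole orbit."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.400000000)
b = logistic_orbit(0.400000001)  # initial conditions differ by 1e-9
# After a few dozen iterations the two orbits have diverged completely,
# so no finite-precision measurement of the start predicts the end.
print(abs(a[-1] - b[-1]))
```

An error in the ninth decimal place of the starting value, far below anything measurable for a real asteroid, is enough to make the fortieth step unpredictable; that is the sense in which long-range forecasts of chaotic systems are ruled out in principle.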
Scientists have yet to come to terms with chaos in all its manifestations. Dr. Steven Weinberg, winner of a Nobel Prize in physics, once said he considered an understanding of chaotic turbulence in fluids as the single most intractable problem in physics. Until scientists reach a deeper understanding of turbulence, many physicists believe, the dynamics of climate change, the behavior of galaxies and many other phenomena can never be fully penetrated.
Chaos can also prevent reliable predictions of group decisions, including those that investors make. Such collective decisions, it seems, can cause chaotic, intrinsically unpredictable fluctuations in commodity prices.
In a report published Aug. 24 in the journal Physical Review Letters, Dr. David A. Meyer of the University of California at San Diego and Dr. Thad A. Brown of the University of Missouri presented formal proof that collective decisions can be chaotic, even when the views of all participants are known and a standard voting rule is strictly applied.
When a group of decision-makers must choose between three or more options by comparing two of them at a time, the collective outcome often depends on the order in which the choices are presented. The outcome can cycle chaotically, the mathematicians found. Even nonhuman decision-makers—the computers that buy and sell commodities according to programmed rules, for example—are subject to chaotic uncertainty, a situation in which prediction becomes impossible.
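The order-dependence is easy to exhibit with the classic three-voter cycle, a textbook construction that I use here for illustration; it is not the specific example from the Meyer-Brown paper.

```python
from itertools import permutations

# Three voters with cyclic preferences: a majority prefers A to B,
# B to C, and yet C to A (the classic Condorcet cycle).
voters = [("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]

def majority_prefers(x, y):
    """True if a majority of voters rank option x above option y."""
    wins = sum(1 for pref in voters if pref.index(x) < pref.index(y))
    return wins > len(voters) / 2

def run_agenda(order):
    """Pairwise elimination: the winner of the first comparison
    meets the next option, and so on down the agenda."""
    winner = order[0]
    for challenger in order[1:]:
        if majority_prefers(challenger, winner):
            winner = challenger
    return winner

for agenda in permutations("ABC"):
    print(agenda, "->", run_agenda(list(agenda)))
```

Running it shows that whichever option is introduced last wins, so the collective outcome is determined not by the voters’ fixed preferences but by the order of comparisons, exactly the instability the mathematicians formalized.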
Frustrated by the complex and chaotic behavior of the real world, some theorists have invented forecasting techniques based on little more than pure mathematics.
In the 1970s, René Thom, a French mathematician, developed an approach called Catastrophe Theory, which, for a time, enjoyed a considerable vogue among physicists, biologists and even sociologists. The theory is an application of topology, a field of mathematics that deals with the shapes of surfaces.
Dr. Thom’s theory holds that many sequences of events can be represented as smooth trajectories along a saddle-shaped surface, at one point of which an abrupt discontinuity, or “catastrophe cusp,” shunts them off to one side of the saddle or the other.
Dr. Thom and his followers proposed that mathematical models based on catastrophe cusps could be used to predict the reproduction of bacteria, the behavior of the stock market, heart attacks, biological evolution and the outbreak of war, among many other things. To some, Dr. Thom’s theory seemed to offer an explanation of almost everything, but many others condemned it as useless. Today, Catastrophe Theory is all but forgotten.
Predicting life spans in the absence of detailed knowledge has long interested scientists as well as insurance statisticians. A forecast of life expectancy based on the average age at death of a person’s four grandparents is a simple example of statistical forecasting. But a much more daring approach was devised a few years ago by Dr. J. Richard Gott 3d, a professor of astrophysics at Princeton University.
Dr. Gott’s scheme is based on the “Copernican principle,” which assumes that the odds are overwhelmingly against any particular place or time being “special.” From this, Dr. Gott reasoned that the mere knowledge of how long something (or someone) has been around is sufficient to estimate how much longer it could last. Based on this system, and the assumption that Homo sapiens appeared on earth about 200,000 years ago, Dr. Gott calculated that intelligent human beings are 95 percent certain to survive a minimum of 5,128 years more, and a maximum of 7.8 million years more.
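Dr. Gott’s numbers follow from a simple assumption: if our moment of observation falls at a random point within humanity’s total lifetime, then with 95 percent confidence the future duration lies between 1/39 and 39 times the past duration. A sketch of the arithmetic, assuming only that formulation:

```python
def gott_interval(age, confidence=0.95):
    """Gott's 'delta t' argument: if something is observed at a
    random moment of its lifetime, then with the given confidence
    its remaining lifetime lies between age * (1 - c) / (1 + c)
    and age * (1 + c) / (1 - c)."""
    c = confidence
    lower = age * (1 - c) / (1 + c)
    upper = age * (1 + c) / (1 - c)
    return lower, upper

lo, hi = gott_interval(200_000)
print(round(lo), round(hi))  # roughly 5,128 and 7.8 million years
```

At 95 percent confidence the factors are 1/39 and 39, which applied to a 200,000-year past reproduces the article’s bounds of about 5,128 and 7.8 million years.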
There are those who contend that predictions like these are so vague that they are scarcely more useful than the prophecies of the Delphic oracle in ancient Greece, which was consulted by Socrates, Oedipus and other luminaries of the day. The oracle (operated by a concealed priest or priestess) was so ambiguous it could nearly never be proved wrong.
Scientists will never be able to answer all our questions about future events or to satisfy a deep-seated human yearning to foresee what’s coming at us. Some scientific efforts at prediction will always be defeated by the nature of Nature.
Mystic oracles have never shed light on future events either, but nearly 2,400 years after Socrates’s death, legions of people continue to visit palmists, astrologers and psychics. It’s human to prefer something to nothing at all.
September 22, 1998
Probing Disease Clusters: Easier to Spot Than Prove
The trouble began when Bobbie Gallagher noticed that her two-year-old daughter was behaving strangely, obsessively spinning and scrupulously setting her toys in rows. Alanna Gallagher turned out to have autism, a rare neurological disorder of unknown cause.
So did Alanna’s little brother. So did about 40 other children who lived in the Gallaghers’ town of Brick, N.J., near the seashore.
The parents in Brick were alarmed. On average, 1 child in 500 is autistic; in the town, the figure is about three times that.
But what does it mean? Does Brick have toxic chemicals in the water, pollutants in the air?
The problem, scientists say, may be impossible to resolve. It was yet another instance of a phenomenon that makes many statisticians shudder. It was a disease cluster—the Boy Who Cried Wolf of epidemiology.
Every time a disease cluster turns up, communities worry, scientists scramble for a cause and, as in the new movie based on Jonathan Harr’s 1995 book, A Civil Action (Random House), about a leukemia cluster in Woburn, Mass., lawyers start suing. Yet over and over again, despite years—sometimes decades—of efforts to link the disease with a cause, scientists usually come up empty-handed.
It can sound paradoxical. Here are unusual numbers of people with a disease. Toxic chemicals are everywhere, and many of them cause cancers and other diseases in laboratory animals. Why should it be so hard to find a cause?
Some disease clusters have been successfully linked to toxins: Coal miners got black lung disease; asbestos workers got mesothelioma. Workers cleaning containers where polyvinyl chloride was synthesized, breathing in fumes, got cancer of the blood vessels of the liver until machines replaced them.
But these examples of proven cause and effect are the rare exceptions, statisticians say. And they have two things in common: The chemical exposure was enormous, and the disease was extraordinarily rare.
Most disease clusters are very different. Autism, breast cancer and leukemia are fairly common. And even when there does seem to be an unusually high incidence of a disease, the search for a chemical basis usually turns up minute amounts of toxic substances that also are found in other places where there are no clusters. In other words, linking the suspect chemicals to the disease can be very hard. It can also be difficult to know if a cluster is anything more than a chance occurrence. And chance is hard to ignore.
Clusters will naturally appear even when events occur at random, said Dr. Persi Diaconis, a statistician at Stanford University. “There was a famous example of this when bombs were hitting London during World War II,” he said. “People were sure they were targeting individual places and they made up the most elaborate scenarios” to explain how the bomb targets were selected. But in the end, when the pattern was analyzed, the bombing turned out to be random.
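The point is easy to demonstrate by simulation. The sketch below scatters hits uniformly at random over a grid of squares (the numbers loosely echo the classic wartime statistical analysis of V-1 strikes on south London, a detail assumed here rather than given in the article) and shows that pure chance leaves some squares struck several times while many go untouched.

```python
import random
from collections import Counter

def random_hit_counts(n_hits=537, n_squares=576, seed=7):
    """Scatter hits uniformly at random over a grid of squares and
    tally how many squares received 1 hit, 2 hits, and so on.
    The defaults loosely echo the classic study of V-1 strikes
    on London (537 hits over 576 half-kilometer squares)."""
    rng = random.Random(seed)
    squares = [rng.randrange(n_squares) for _ in range(n_hits)]
    hits_per_square = Counter(squares)
    return Counter(hits_per_square.values())

counts = random_hit_counts()
# Even with purely random placement, a few squares collect several
# hits while most collect one or none: clusters with no cause at all.
print(dict(counts))
```

Anyone drawing a circle around a multiply-hit square would see a compelling “target”; the simulation knows there was none.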
Another problem is how to draw the boundaries of a cluster.
Dr. James Robins, a statistician at Harvard University’s School of Public Health, said it is a natural tendency to draw boundaries around groups of events to make clusters happen. If there are three children with cancer on a single block, you may draw your circle around the block—making that a cluster—rather than around the town as a whole, which may show no cluster.
Say you do find a cluster. Unless you identify, say, black lung or mesothelioma, statisticians say, the next question is: How can you decide if the cluster was caused by blind random clumpings of cases, with no environmental cause, or by a toxin in the environment?
Why would only one town have a disease cluster, some experts ask, while other places with the same pollutants in the air or water do not? One possibility might be an unidentified chemical in a mix of pollutants that is unique to the town. But that, of course, raises questions of how to find it.
Finally, there is the indirect exposure problem. If there is no direct link between chemicals and a disease, the tendency is to look for other exposures. Could the fathers, for example, have had their sperm affected when they were growing up? Or could the mothers have been exposed to chemicals during pregnancy? Some statisticians say that if people look hard enough and slice the data enough ways an association will emerge. What it means is another question.
Others are optimistic. Suzanne Condon, the director of the bureau of environmental health assessment at the Massachusetts Department of Public Health, said that in an unpublished study her department found that in the Woburn case, women who drank water from certain wells when they were pregnant were more likely to have children who developed leukemia. “We believe this sheds a lot of light on what happened in Woburn,” she said. W. R. Grace, which was accused along with Beatrice Foods of dumping chemicals in a way that allowed them to reach the water supply, paid $8 million into a settlement fund. Both companies agreed to finance an expensive cleanup plan.
The Massachusetts health department, however, warned on its Web page, “Findings should be interpreted with caution due to the limitations of conducting statistical analyses on small populations.”
That may not be what people want to hear, statisticians concede. “People—and I, too—find it hard to accept that it is just random chance that brought this horrible consequence,” said Dr. David Freedman, a statistician at the University of California at Berkeley.
Some statisticians ask whether it is worthwhile to keep pouring money and effort into searches for clusters and searches to explain them.
“The question is, at what point do you say we’ve seen too many like this?” Dr. Robins asked. “Huge amounts of money” have gone to study disease clusters where the suspected cause was tiny amounts of chemicals, he added, and so far, “nothing has come of it.”
January 31, 1999
When the Miami police first found Benito Que, he was slumped on a desolate side street, near the empty spot where he had habitually parked his Ford Explorer. At about the same time, Don C. Wiley mysteriously disappeared. His car, a white rented Mitsubishi Galant, was abandoned on a bridge outside of Memphis, where he had just had a jovial dinner with friends. The following week, Vladimir Pasechnik collapsed in London, apparently of a stroke.
The list would grow to nearly a dozen in the space of four nerve-jangling months. Stabbed in Leesburg, Va. Suffocated in an air-locked lab in Geelong, Australia. Found wedged under a chair, naked from the waist down, in a blood-splattered apartment in Norwich, England. Hit by a car while jogging. Killed in a private plane crash. Shot dead while a pizza delivery man served as a decoy.
What joined these men was their proximity to the world of bioterror and germ warfare. Que, the one who was car-jacked, was a researcher at the University of Miami School of Medicine. Wiley, the most famous, knew as much as anyone about how the immune system responds to attacks from viruses like Ebola. Pasechnik was Russian, and before he defected, he helped the Soviets transform cruise missiles into biological weapons. The chain of deaths—these three men and eight others like them—began last fall, back when emergency teams in moonsuits were scouring the Capitol, when postal workers were dying, when news agencies were on high alert and the entire nation was afraid to open its mail.
In more ordinary times, this cluster of deaths might not have been noticed, but these are not ordinary times. Neighbors report neighbors to the F.B.I.; passengers are escorted off planes because they make other passengers nervous; medical journals debate what to publish, for fear the articles will be read by evil eyes. Now we are spooked and startled by stories like these—all these scientists dying within months of one another, at the precise moment when tiny organisms loom as a gargantuan threat. The stories of these dozen or so deaths started out as a curiosity and were transformed rumor by rumor into the specter of conspiracy as they circulated first on the Internet and then in the mainstream media. What are the odds, after all?
What are the odds, indeed?
For this is not about conspiracy but about coincidence—unexpected connections that are both riveting and rattling. Much religious faith is based on the idea that almost nothing is coincidence; science is an exercise in eliminating the taint of coincidence; police work is often a feint and parry between those trying to prove coincidence and those trying to prove complicity. Without coincidence, there would be few movies worth watching (“Of all the gin joints in all the towns in all the world, she walks into mine”), and literary plots would come grinding to a disappointing halt. (What if Oedipus had not happened to marry his mother? If Javert had not happened to arrive in the town where Valjean was mayor?)
The true meaning of the word is “a surprising concurrence of events, perceived as meaningfully related, with no apparent causal connection.” In other words, pure happenstance. Yet by merely noticing a coincidence, we elevate it to something that transcends its definition as pure chance. We are discomforted by the idea of a random universe. Like Mel Gibson’s character Graham Hess in M. Night Shyamalan’s new movie Signs, we want to feel that our lives are governed by a grand plan.
The need is especially strong in an age when paranoia runs rampant. “Coincidence feels like a loss of control perhaps,” says John Allen Paulos, a professor of mathematics at Temple University and the author of Innumeracy, the improbable best-seller about how Americans don’t understand numbers. Finding a reason or a pattern where none actually exists “makes it less frightening,” he says, because events get placed in the realm of the logical. “Believing in fate, or even conspiracy, can sometimes be more comforting than facing the fact that sometimes things just happen.”
In the past year there has been plenty of conspiracy, of course, but also a lot of things have “just happened.” And while our leaders are out there warning us to be vigilant, the statisticians are out there warning that patterns are not always what they seem. We need to be reminded, Paulos and others say, that most of the time patterns that seem stunning to us aren’t even there. For instance, although the numbers 9/11 (9 plus 1 plus 1) equal 11, and American Airlines Flight 11 was the first to hit the twin towers, and there were 92 people on board (9 plus 2), and Sept. 11 is the 254th day of the year (2 plus 5 plus 4), and there are 11 letters each in “Afghanistan,” “New York City” and “the Pentagon” (and while we’re counting, in George W. Bush), and the World Trade towers themselves took the form of the number 11, this seeming numerical message is not actually a pattern that exists but merely a pattern we have found. (After all, the second flight to hit the towers was United Airlines Flight 175, and the one that hit the Pentagon was American Airlines Flight 77, and the one that crashed in a Pennsylvania field was United Flight 93, and the Pentagon is shaped, well, like a pentagon.)
The same goes for the way we think of miraculous intervention. We need to be told that those lucky last-minute stops for an Egg McMuffin at McDonald’s or to pick up a watch at the repair shop or to vote in the mayoral primary—stops that saved lives of people who would otherwise have been in the towers when the first plane hit—certainly looked like miracles but could have been predicted by statistics. So, too, can the most breathtaking of happenings—like the sparrow that happened to appear at one memorial service just as a teenage boy, at the lectern eulogizing his mom, said the word “mother.” The tiny bird lighted on the boy’s head; then he took it in his hand and set it free.
Something like that has to be more than coincidence, we protest. What are the odds? The mathematician will answer that even in the most unbelievable situations, the odds are actually very good. The law of large numbers says that with a large enough denominator—in other words, in a big wide world—stuff will happen, even very weird stuff. “The really unusual day would be one where nothing unusual happens,” explains Persi Diaconis, a Stanford statistician who has spent his career collecting and studying examples of coincidence. Given that there are 280 million people in the United States, he says, “280 times a day, a one-in-a-million shot is going to occur.”
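Diaconis’s arithmetic is worth spelling out. With 280 million people, an event with a one-in-a-million chance per person per day should happen about 280 times a day, and the chance that a given day passes with no one at all experiencing it is vanishingly small:

```python
population = 280_000_000   # U.S. population figure cited by Diaconis
p = 1 / 1_000_000          # a "one-in-a-million shot," per person per day

expected_per_day = population * p      # expected number of occurrences
p_nobody = (1 - p) ** population       # chance no one has it on a given day

print(expected_per_day)  # about 280
print(p_nobody)          # astronomically small
```

The individual odds stay at one in a million; it is the enormous denominator that guarantees a steady daily supply of miracles.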
Throw your best story at him—the one about running into your childhood playmate on a street corner in Azerbaijan or marrying a woman who has a birthmark shaped like a shooting star that is a perfect match for your own or dreaming that your great-aunt Lucy would break her collarbone hours before she actually does—and he will nod politely and answer that such things happen all the time. In fact, he and his colleagues also warn me that although I pulled all examples in the prior sentence from thin air, I will probably get letters from readers saying one of those things actually happened to them.
And what of the deaths of nearly a dozen scientists? Is it really possible that they all just happened to die, most in such peculiar, jarring ways, within so short a time? “We can never say for a fact that something isn’t a conspiracy,” says Bradley Efron, a professor of statistics at Stanford. “We can just point out the odds that it isn’t.”
I first found myself wondering about coincidence last spring when I read a small news item out of the tiny Finnish town of Raahe, which is 370 miles north of Helsinki. On the morning of March 5, two elderly twin brothers were riding their bicycles, as was their habit, completing their separate errands. At 9:30, one brother was struck by a truck along coastal Highway 8 and killed instantly. About two hours later and one mile down the same highway, the other brother was struck by a second truck and killed.
“It was hard to believe this could happen just by chance,” says Marko Salo, the senior constable who investigated both deaths for the Raahe police department. Instead, the department looked for a cause, thinking initially that the second death was really a suicide.
“Almost all Raahe thought he did it knowing that his brother was dead,” Salo says of the second brother’s death. “They thought he tried on purpose. That would have explained things.” But the investigation showed that the older brother was off cheerfully getting his hair cut just before his own death.
The family could not immediately accept that this was random coincidence, either. “It was their destiny,” offers their nephew, who spoke with me on behalf of the family. It is his opinion that his uncles shared a psychic bond throughout their lives. When one brother became ill, the other one fell ill shortly thereafter. When one reached to scratch his nose, the other would often do the same. Several years ago, one brother was hit and injured by a car (also while biking), and the other one developed pain in the same leg.
The men’s sister had still another theory entirely. “She worried that it was a plot to kill both of them,” the nephew says, describing his aunt’s concerns that terrorists might have made their way to Raahe. “She was angry. She wanted to blame someone. So she said the chances of this happening by accident are impossible.”
Not true, the statisticians say. But before we can see the likelihood for what it is, we have to eliminate the distracting details. We are far too taken, Efron says, with superfluous facts and findings that have no bearing on the statistics of coincidence. Once the initial surprise passes, he says, the real yardstick for measuring probability is “How surprised should we be?” How surprising is it, to use this example, that two 70-year-old men in the same town should die within two hours of each other? Certainly not common, but not unimaginable. But the fact that they were brothers would seem to make the odds more astronomical. This, however, is a superfluous fact. What is significant in their case is that two older men were riding bicycles along a busy highway in a snowstorm, which greatly increases the probability that they would be hit by trucks.
Statisticians like Efron emphasize that when something striking happens, it only incidentally happens to us. When the numbers are large enough, and the distracting details are removed, the chance of anything is fairly high. Imagine a meadow, he says, and then imagine placing your finger on a blade of grass. The chance of choosing exactly that blade might be one in a million or even smaller, but since it is a certainty that you will choose some blade of grass, the blade you pick is no more or less remarkable than the one to either side.
Robert J. Tibshirani, a statistician at Stanford University who proved that it was probably not coincidence that accident rates increase when people simultaneously drive and talk on a cellphone, leading some states to ban the practice, uses the example of a hand of poker. “The chance of getting a royal flush is very low,” he says, “and if you were to get a royal flush, you would be surprised. But the chance of any hand in poker is low. You just don’t notice when you get all the others; you notice when you get the royal flush.”
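Tibshirani’s poker point is easy to verify with a few lines of arithmetic; here is a minimal sketch in Python (the variable names are mine):

```python
from math import comb

# Number of distinct 5-card poker hands from a 52-card deck
total_hands = comb(52, 5)               # 2,598,960

# There is exactly one royal flush per suit
p_royal = 4 / total_hands               # about 1 in 650,000
p_any_specific_hand = 1 / total_hands   # any single named hand is rarer still

print(f"P(royal flush)       = {p_royal:.8f}")
print(f"P(one specific hand) = {p_any_specific_hand:.8f}")
```

Every specific hand is astronomically unlikely; the royal flush only feels different because it is the one we have agreed to notice.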
When these professors talk, they do so slowly, aware that what they are saying is deeply counterintuitive. No sooner have they finished explaining that the world is huge and that any number of unlikely things are likely to happen than they shift gears and explain that the world is also quite small, which explains an entire other type of coincidence. One relatively simple example of this is “the birthday problem.” There are as many as 366 days in a year (accounting for leap years), and so you would have to assemble 367 people in a room to absolutely guarantee that two of them have the same birthday. But how many people would you need in that room for there to be a 50 percent chance of at least one birthday match?
Intuitively, you assume that the answer should be a relatively large number. And in fact, most people’s first guess is 183, half of 366. But the actual answer is 23. In Paulos’s book, he explains the math this way: “[T]he number of ways in which five dates can be chosen (allowing for repetitions) is (365 × 365 × 365 × 365 × 365). Of all these 365⁵ ways, however, only (365 × 364 × 363 × 362 × 361) are such that no two of the dates are the same; any of the 365 days can be chosen first, any of the remaining 364 can be chosen second and so on. Thus, by dividing this latter product (365 × 364 × 363 × 362 × 361) by 365⁵, we get the probability that five persons chosen at random will have no birthday in common. Now, if we subtract this probability from 1 (or from 100 percent if we’re dealing with percentages), we get the complementary probability that at least two of the five people do have a birthday in common. A similar calculation using 23 rather than 5 yields ½, or 50 percent, as the probability that at least 2 of 23 people will have a common birthday.”
Got that?
Using similar math, you can calculate that if you want even odds of finding two people born within one day of each other, you only need 14 people, and if you are looking for birthdays a week apart, the magic number is seven. (Incidentally, if you are looking for an even chance that someone in the room will have your exact birthday, you will need 253 people.) And yet despite numbers like these, we are constantly surprised when we meet a stranger with whom we share a birth date or a hometown or a middle name. We are amazed by the overlap—and we conveniently ignore the countless things we do not have in common.
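Paulos’s arithmetic, and the exact-match variant, can be checked directly. Here is a minimal sketch in Python (the function names are my own; it ignores leap years and assumes all 365 birthdays are equally likely):

```python
from math import prod

def p_shared_birthday(n: int, days: int = 365) -> float:
    """Chance that at least two of n people share a birthday."""
    # Probability that all n birthdays are distinct, then take the complement
    p_distinct = prod((days - k) / days for k in range(n))
    return 1 - p_distinct

def p_match_mine(n: int, days: int = 365) -> float:
    """Chance that at least one of n other people shares YOUR birthday."""
    return 1 - ((days - 1) / days) ** n

print(round(p_shared_birthday(5), 4))    # about 0.027, matching Paulos
print(round(p_shared_birthday(23), 4))   # about 0.507: just past even odds
print(round(p_match_mine(253), 4))       # about 0.5: the exact-match case
```

The exact-match question needs far more people (253) than the any-pair question (23) because the any-pair version counts every possible pairing in the room, not just the pairings that involve you.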
Which brings us to the death of Benito Que, who was not, despite reports to the contrary, actually a microbiologist. He was a researcher in a lab at the University of Miami Sylvester Cancer Center, where he was testing various agents as potential cancer drugs. He never worked with anthrax or any infectious disease, according to Dr. Bach Ardalan, a professor of medicine at the University of Miami and Que’s boss for the past three years. “There is no truth to the talk that Benito was doing anything related to microbiology,” Ardalan says. “He certainly wasn’t doing any sensitive kind of work that anyone would want to hurt him for.”
But those facts got lost amid the confusion—and the prevalence of very distracting details—in the days after he died. So did the fact that he had hypertension. On the afternoon of Monday, Nov. 19, Que attended a late-afternoon lab meeting, and as it ended, he mentioned that he hadn’t been feeling well. A nurse took Que’s blood pressure, which was 190/110. “I wanted to admit him” to the hospital, Ardalan says, but Que insisted on going home.
Que had the habit of parking his car on Northwest 10th Avenue, a side street that Ardalan describes as being “beyond the area considered to be safe.” His spot that day was in front of a house where a young boy was playing outside. Four youths approached Que as he neared his car, the boy later told the police, and there might have been some baseball bats involved. When the police arrived, they found Que unconscious. His briefcase was at his side, but his wallet was gone. His car was eventually found abandoned several miles from the scene. He was taken to the hospital, the same one at which he worked, where he spent more than a week in a coma before dying without ever regaining consciousness.
The mystery, limited to small items in local Florida papers at first, was, What killed Benito Que? Could it have been the mugging? A CAT scan showed no signs of bone fracture. In fact, there were no scrapes or bruises or other physical signs of assault. Perhaps he died of a stroke? His brain scan did show a “huge intracranial bleed,” Ardalan says, which would have explained his earlier headache, and his high blood pressure would have made a stroke likely.
In other words, this man just happened to be mugged when he was a stroke waiting to be triggered. That is a jarring coincidence, to be sure. But it is not one that the world was likely to have noticed if Don Wiley had not up and disappeared.
Don C. Wiley was a microbiologist. He did some work with anthrax, and a lot of work with H.I.V., and he was also quite familiar with Ebola, smallpox, herpes and influenza. At 57, he was the father of four children and a professor of biochemistry and biophysics in the department of molecular and cellular biology at Harvard.
On Nov. 15, four days before the attack on Benito Que, Wiley was in Memphis to visit his father and to attend the annual meeting of the scientific advisory board of St. Jude Children’s Research Hospital, of which he was a member. At midnight, he was seen leaving a banquet at the Peabody Hotel in downtown Memphis. Friends and colleagues say he had a little to drink but did not appear impaired, and they remember him as being in a fine mood, looking forward to seeing his wife and children, who were about to join him for a short vacation.
Wiley’s father lives in a Memphis suburb, and that is where Wiley should have been headed after the banquet. Instead, his car was found facing in the opposite direction on the Hernando DeSoto Bridge, which spans the Mississippi River at the border of Tennessee and Arkansas. When the police found the car at 4 a.m., it was unlocked, the keys were in the ignition and the gas tank was full. There was a scrape of yellow paint on the driver’s side, which appeared to come from a construction sign on the bridge, and a right hubcap was missing on the passenger side, where the wheel rims were also scraped. There was no sign, however, of Don Wiley.
The police trawled the muddy Mississippi, but they didn’t really expect to find him. Currents run fast at that part of the river, and a body would be quickly swept away. At the start of the search, they thought he might have committed suicide; others had jumped from the DeSoto Bridge over the years. Detectives searched Wiley’s financial records, his family relationships, his scientific research—anything for a hint that the man might have had cause to take his own life.
Finding nothing, the investigation turned medical. Wiley, they learned, had a seizure disorder that he had hidden from all but family and close friends. He had a history of two or three major episodes a year, his wife told investigators, and the condition was made worse when he was under stress or the influence of alcohol. Had Wiley, who could well have been tired, disoriented by bridge construction and under the influence of a few drinks, had a seizure that sent him over the side of the bridge?
That was the theory the police spoke of in public, but they were also considering something else. The week that Wiley disappeared coincided with the peak of anthrax fear throughout the country. Tainted letters appeared the month before at the Senate and the House of Representatives. Two weeks earlier, a New York City hospital worker died of inhaled anthrax. Memphis was not untouched by the scare; a federal judge and two area congressmen each received hoax letters. Could it be mere chance that this particular scientist, who had profound knowledge of these microbes, had disappeared at this time?
“The circumstances were peculiar,” says George Bolds, a spokesman for the Memphis bureau of the F.B.I., which was called in to assist. “There were questions that had to be asked. Could he have been kidnapped because his scientific abilities would have made him capable of creating anthrax? Or maybe he’d had some involvement in the mailing of the anthrax, and he’d disappeared to cover his tracks? Did his co-conspirators grab him and kill him?
“We were in new territory,” Bolds continued. “Just because something is conceivable doesn’t mean it’s actually happened, but at the same time, just because it’s never happened before doesn’t mean it can’t happen. People’s ideas of what is possible definitely changed on Sept. 11. People feel less secure and less safe. I’m not sure that they’re at greater risk than they were before. Maybe they’re just more aware of the risk they are actually at.”
As a species, we appear to be biologically programmed to see patterns and conspiracies, and this tendency increases when we sense that we’re in danger. “We are hard-wired to overreact to coincidences,” says Persi Diaconis. “It goes back to primitive man. You look in the bush, it looks like stripes, you’d better get out of there before you determine the odds that you’re looking at a tiger. The cost of being flattened by the tiger is high. Right now, people are noticing any kind of odd behavior and being nervous about it.”
Adds John Allen Paulos: “Human beings are pattern-seeking animals. It might just be part of our biology that conspires to make coincidences more meaningful than they really are. Look at the natural world of rocks and plants and rivers: it doesn’t offer much evidence for superfluous coincidences, but primitive man had to be alert to all anomalies and respond to them as if they were real.”
For decades, all academic talk of coincidence has been in the context of the mathematical. New work by scientists like Joshua B. Tenenbaum, an assistant professor in the department of brain and cognitive sciences at M.I.T., is bringing coincidence into the realm of human cognition. Finding connections is not only the way we react to the extraordinary, Tenenbaum postulates, but also the way we make sense of our ordinary world. “Coincidences are a window into how we learn about things,” he says. “They show us how minds derive richly textured knowledge from limited situations.”
To put it another way, our reaction to coincidence shows how our brains fill in the factual blanks. In an optical illusion, he explains, our brain fills the gaps, and although people take it for granted that seeing is believing, optical illusions prove that’s not true. “Illusions also prove that our brain is capable of imposing structure on the world,” he says. “One of the things our brain is designed to do is infer the causal structure of the world from limited information.”
If not for this ability, he says, a child could not learn to speak. A child sees a conspiracy, he says, in that others around him are obviously communicating and it is up to the child to decode the method. But these same mechanisms can misfire, he warns. They were well suited to a time of cavemen and tigers and can be overloaded in our highly complex world. “It’s why we have the urge to work everything into one big grand scheme,” he says. “We do like to weave things together.
“But have we evolved into fundamentally rational or fundamentally irrational creatures? That is one of the central questions.”
We pride ourselves on being independent and original, and yet our reactions to nearly everything can be plotted along a predictable spectrum. When the subject is coincidence, one end of the scale belongs to those who believe that coincidences are entertaining events with no meaning; at the other end are those who believe that coincidence is never an accident.
The view of coincidence as fate has lately become something of a minitrend in the New Age section of bookstores. Among the more popular authors is SQuire Rushnell (who, in the interest of marketing, spells his first name with a capital Q). Rushnell spent 20 years producing such television programs as Good Morning America and Schoolhouse Rock. His fascination with coincidence began when he learned that both John Adams and Thomas Jefferson died on the same July 4, 50 years after the signing of the Declaration of Independence.
“That stuck in my craw,” Rushnell says, “and I couldn’t stop wondering what that means.” And so Rushnell wrote When God Winks: How the Power of Coincidence Guides Your Life. The book was published by a small press shortly before Sept. 11 and sold well without much publicity. It will be rereleased with great fanfare by Simon & Schuster next month. Its message, Rushnell says, is that “coincidences are signposts along your universal pathway. They are hints that you are going in the right direction or that you should change course. It’s like your grandmother sitting across the Thanksgiving table from you and giving you a wink. What does that wink mean? ‘I’m here, I love you, stay the course.’”
During my interview with Rushnell, I told him the following story: On a frigid December night many years ago, a friend dragged me out of my warm apartment, where I planned to spend the evening in my bathrobe nursing a cold. I had to come with her to the movies, she said, because she had made plans with a pal from her office, and he was bringing a friend for me to meet. Translation: I was expected to show up for a last-minute blind date. For some reason, I agreed to go, knocking back a decongestant as I left home. We arrived at the theater to find that the friend who was supposed to be my “date” had canceled, but not to worry, another friend had been corralled as a replacement. The replacement and I both fell asleep in the movie (I was sedated by cold medicine; he was a medical resident who had been awake for 36 hours), but four months later we were engaged, and we have been married for nearly 15 years.
Rushnell was enthralled by this tale, particularly by the mystical force that seemed to have nudged me out the door when I really wanted to stay home and watch The Golden Girls. I know that those on the other end of the spectrum—the scientists and mathematicians—would have offered several overlapping explanations of why it was unremarkable.
There are, of course, the laws of big numbers and small numbers—the fact that the world is simultaneously so large that anything can happen and so small that weird things seem to happen all the time. Add to that the work of the late Amos Tversky, a giant in the field of coincidence theory, who once described his role in this world as “debugging human intuition.” Among other things, Tversky disproved the “hot hand” theory of basketball, the belief that a player who has made his last few baskets will more likely than not make his next. After examining thousands of shots by the Philadelphia 76ers, he proved that the odds of a successful shot cannot be predicted by the shots that came before.
Tversky similarly proved that arthritis sufferers cannot actually predict the weather and are not in more pain when there’s a storm brewing, a belief that began with the ancient Greeks. He followed 18 patients for 15 months, keeping detailed records of their reports of pain and joint swelling and matching them with constantly updated weather reports. There was no pattern, he concluded, though he also conceded that his data would not change many people’s beliefs.
We believe in such things as hot hands and arthritic forecasting and predestined blind dates because we notice only the winning streaks, only the chance meetings that lead to romance, only the days that Grandma’s hands ache before it rains. “We forget all the times that nothing happens,” says Ruma Falk, a professor emeritus of psychology at Hebrew University in Jerusalem, who studied years ago with Tversky. “Dreams are another example,” Falk says. “We dream a lot. Every night and every morning. But it sometimes happens that the next day something reminds you of that dream. Then you think it was a premonition.”
Falk’s work is focused on the question of why we are so entranced by coincidence in the first place. Her research itself began with a coincidence. She was on sabbatical in New York from her native Israel, and on the night before Rosh Hashana she happened to meet a friend from Jerusalem on a Manhattan street corner. She and the friend stood on that corner and marveled at the coincidence. What is the probability of this happening? she remembers wondering. What did this mean?
“How stupid we were,” Falk says now, “to be so surprised. We related to all the details that had converged to create that moment. But the real question was what was the probability that at some time in some place I would meet one of my circle of friends? And when I told this story to others at work, they encoded the events as two Israelis meeting in New York, something that happens all the time.”
Why was her experience so resonant for her, Falk asked herself, but not for those around her? One of the many experiments she has conducted since then proceeded as follows: she visited several large university classes, with a total of 200 students, and asked each student to write his or her birth date on a card. She then quietly sorted the cards and found the handful of birthdays that students had in common. Falk wrote those dates on the blackboard. April 10, for instance, Nov. 8, Dec. 16. She then handed out a second card and asked all the students to use a scale to rate how surprised they were by these coincidences.
The cards were numbered, so Falk could determine which answers came from respondents who found their own birth date written on the board. Those in that subgroup were consistently more surprised by the coincidence than the rest of the students. “It shows the stupid power of personal involvement,” Falk says.
The more personal the event, the more meaning we give it, which is why I am quite taken with my story of meeting my husband (because it is a pivotal moment in my life), and why SQuire Rushnell is also taken with it (because it fits into the theme of his book), but also why Falk is not impressed at all. She likes her own story of the chance meeting on a corner better than my story, while I think her story is a yawn.
The fact that personal attachment adds significance to an event is the reason we tend to react so strongly to the coincidences surrounding Sept. 11. In a deep and lasting way, that tragedy feels as if it happened to us all.
Falk’s findings also shed light on the countless times that pockets of the general public find themselves at odds with authorities and statisticians. Her results might explain, for instance, why lupus patients are certain their breast implants are the reason for their illness, despite the fact that epidemiologists conclude there is no link, or why parents of autistic children are resolute in their belief that childhood immunizations or environmental toxins or a host of other suspected pathogens are the cause, even though experts are skeptical. They might also explain the outrage of all the patients who are certain they live in a cancer cluster, but who have been told otherwise by researchers.
Let’s be clear: this does not mean that conspiracies do not sometimes exist or that the environment never causes clusters of death. And just as statistics are often used to show us that we should not be surprised, they can also prove what we suspect, that something is wrong out there.
“The fact that so many suspected cancer clusters have turned out to be statistically insupportable does not mean the energy we spent looking for them has been wasted,” says Dr. James M. Robins, a professor of epidemiology and biostatistics at Harvard and an expert on cancer clusters. “You’re never going to find the real ones if you don’t look at all the ones that don’t turn out to be real ones.”
Most often, though, coincidence is a sort of Rorschach test. We look into it and find what we already believe. “It’s like an archer shooting an arrow and then drawing a circle around it,” Falk says. “We give it meaning because it does mean something—to us.”
Vladimir Pasechnik was 64 when he died. His early career was spent in the Soviet Union working at Biopreparat, the site of that country’s biological weapons program. He defected in 1989 and spilled what he knew to the British, revealing for the first time the immense scale of Soviet work with anthrax, plague, tularemia and smallpox.
For the next 10 years, he worked at the Center for Applied Microbiology and Research, part of Britain’s Department of Health. Two years ago, he left to form Regma Biotechnologies, whose goal was to develop treatment for tuberculosis and other infectious disease. In the weeks before he died, Pasechnik had reportedly consulted with authorities about the growing anthrax scare. Despite all these intriguing details, there is nothing to suggest that his death was caused by anything other than a stroke.
Robert Schwartz’s death, while far more dramatic and bizarre, also appears to have nothing to do with the fact that he was an expert on DNA sequencing and analysis. On Dec. 10 he was found dead on the kitchen floor of his isolated log-and-fieldstone farmhouse near Leesburg, Va., where he had lived alone since losing his wife to cancer four years ago and his children to college. Schwartz had been stabbed to death with a two-foot-long sword, and his killer had carved an X on the back of his neck.
Three friends of Schwartz’s college-age daughter were soon arrested for what the prosecutor called a “planned assassination”; two of the trials for first-degree murder are scheduled for this month. A few weeks later, police arrested the daughter as well. One suspect has a history of mental illness, and their written statements to police talk of devil worship and revenge. There is no talk, however, of microbiology.
On the same day that Schwartz died, Set Van Nguyen, 44, was found dead in an air-locked storage chamber at the Australian Commonwealth’s Scientific and Industrial Research Organisation’s animal diseases facility in Geelong. A months-long internal investigation concluded that a string of equipment failures had allowed nitrogen to build up in the room, causing Nguyen to suffocate. Although the center itself dealt with microbes like mousepox, which is similar to smallpox, Nguyen himself did not. “Nguyen was in no way involved in research into mousepox,” says Stephen Prowse, who was the acting director of the Australian lab during the investigation. “He was a valued member of the laboratory’s technical support staff and not a research scientist.”
Word of all these deaths (though not the specific details) found its way to Ian Gurney, a British writer. Gurney is the author of The Cassandra Prophecy: Armageddon Approaches, a book that uses clues from the Bible to calculate that Judgment Day will occur in or about the year 2023. He is currently researching his second book, which is in part about the threat of nuclear and biological weapons, and after Sept. 11 he entered a news alert request into Yahoo, asking to be notified whenever there was news with the key word “microbiologist.”
First Que, then Wiley, then Pasechnik, Schwartz and Nguyen popped up on Gurney’s computer. “I’m not a conspiracy theorist,” says the man who has predicted the end of the world, “but it certainly did look suspicious.” Gurney compiled what he had learned from these scattered accounts into an article that he sent to a number of Web sites, including Rense.com, which tracks U.F.O. sightings worldwide. “Over the past few weeks,” Gurney wrote, “several world-acclaimed scientific researchers specializing in infectious diseases and biological agents such as anthrax, as well as DNA sequencing, have been found dead or have gone missing.”
The article went on to call Benito Que, the cancer lab technician, “a cell biologist working on infectious diseases like H.I.V.,” and said that he had been attacked by four men with a baseball bat but did not mention that he suffered from high blood pressure. It then described the disappearance of Wiley without mentioning his seizure disorder and the death of Pasechnik without saying that he had suffered a stroke. It gave the grisly details of Schwartz’s murder, but said nothing of the arrests of his daughter’s friends. Nguyen, in turn, was described as “a skilled microbiologist,” and it was noted that he shared a last name with Kathy Nguyen, the 61-year-old hospital worker who just happened to be the one New Yorker to die of anthrax.
Of course, there have always been rumors based on skewed historical fact. Recall, for example, the list of coincidences that supposedly linked the deaths of Presidents Lincoln and Kennedy. It goes, in part, like this: The two men were elected 100 years apart; their assassins were born 100 years apart (in fact, 101 years apart); they were both succeeded by men named Johnson; and the two Johnsons were born 100 years apart. Their names each contain seven letters; their successors’ names each contain 13 letters; and their assassins’ names each contain 15 letters. Lincoln was shot in a theater and his assassin ran to a warehouse, while Kennedy was shot from a warehouse and his assassin ran to a theater. Lincoln, or so the story goes, had a secretary named Kennedy who warned him not to go to the theater the night he was killed (for the record, Lincoln’s White House secretaries were named John Nicolay and John Hay, and Lincoln regularly rejected warnings not to attend public events out of fear for his safety, including his own inauguration); Kennedy, in turn, had a secretary named Lincoln (true, Evelyn Lincoln) who warned him not to go to Dallas (he, too, was regularly warned not to go places, including San Antonio the day before his trip to Dallas).
I first read about these connections five years after the Kennedy assassination, when I was 8, which says something about how conspiracy theory speaks to the child in all of us. But it also says something about the technology of the time. The numerological coincidences from the World Trade Center that I mentioned at the start of this article made their way onto my computer screen by Sept. 15, from a friend of a friend of a friend of an acquaintance, ad infinitum and ad nauseam.
Professor Robins of Harvard points out that “the Web has changed the scale of these things.” Had there been a string of dead scientists back in 1992 rather than 2002, he says, it is possible that no one would have ever known. “Back then, you would not have had the technical ability to gather all these bits and pieces of information, while today you’d be able to pull it off. It’s well known that if you take a lot of random noise, you can find chance patterns in it, and the Net makes it easier to collect random noise.”
The Gurney article traveled from one Web site to the next and caught the attention of Paul Sieveking, a co-editor of Fortean Times, a magazine that describes itself as “the Journal of Strange Phenomena.”
“People send me stuff all the time,” Sieveking says. “This was really interesting.” Wearing his second hat as a columnist for The Sunday Telegraph in London, he wrote a column on the subject for that paper titled “Strange but True—The Deadly Curse of the Bioresearchers.” His version began with the link between the two Nguyens and concluded, “It is possible that nothing connects this string of events, but … it offers ample fodder for the conspiracy theorist or thriller writer.”
Commenting on the story months later, Sieveking says: “It’s probably just a random clumping, but it just happens to look significant. We’re all natural storytellers, and conspiracy theorists are just frustrated novelists. We like to make up a good story out of random facts.”
Over the months, Gurney added names to his list and continued to send it to virtual and actual publications around the U.S. Mainstream newspapers started taking up the story, including an alternative weekly in Memphis, where interest in the Wiley case was particularly strong, and most recently the Toronto Globe and Mail. The tally of “microbiologists” is now at 11, give or take, depending on the story you read. In addition to the men already discussed, the names that appear most often are these: Victor Korshunov, a Russian expert in intestinal bacteria, who was bashed over the head near his home in Moscow; Ian Langford, a British expert in environmental risk and disease, who was found dead in his home near Norwich, England, naked from the waist down and wedged under a chair; Tanya Holzmayer, who worked as a microbiologist near San Jose and was shot seven times by a former colleague when she opened the door to a pizza delivery man; David Wynn-Williams, who studied microbes in the Antarctic and was hit by a car while jogging near his home in Cambridge, England; and Steven Mostow, an expert in influenza, who died when the plane he was piloting crashed near Denver.
The stories have also made their way into the e-mail in-boxes of countless microbiologists. Janet Shoemaker, director of public and scientific affairs for the American Society for Microbiology, heard the tales and points out that her organization alone has 41,000 members, meaning that 11 deaths worldwide, most of the dead not technically microbiologists at all, are not statistically surprising. “We’re saddened by anyone’s death,” she says. “But this is just a coincidence. In another political climate I don’t think anyone would have noticed.”
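Shoemaker’s back-of-the-envelope point can be made concrete. The sketch below assumes a crude annual death rate of roughly 8 per 1,000 adults (my assumption, not a figure from the article) applied to the society’s 41,000 members:

```python
# How many member deaths per year would chance alone predict?
# (The death rate here is an assumed order-of-magnitude figure.)
members = 41_000
crude_death_rate = 8 / 1_000   # assumed annual deaths per person

expected_deaths = members * crude_death_rate
print(expected_deaths)         # roughly 328 a year, dwarfing a tally of 11
```

Even if the true rate were several times lower, the expected count would still comfortably exceed the 11 deaths on Gurney’s list.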
Ken Alibek heard them, too, and dismissed them. Alibek is one of the country’s best-known microbiologists. He was the No. 2 man at Biopreparat (where Vladimir Pasechnik also worked) before he defected and now works with the U.S. government seeking antidotes for the very weapons he developed. Those who have died, he says, did not really know anything about biological weapons, and if there were a conspiracy to kill scientists with such knowledge, he would be dead. “I considered all this a little artificial, because a number of them couldn’t have been considered B.W. experts,” he says with a hint of disdain. “I got an e-mail from Pasechnik before he died, and he was working on a field completely different from this. People say to me, ‘Ken, you could be a target,’ but if you start thinking about this, then your life is over. I’m not saying I’m not worried, but I’m not paying much attention. I’m opening my mail as usual. If I see something suspicious, I know what to do.”
Others are not quite as sanguine. Phyllis Della-Latta is the director of clinical microbiology services at New York’s Columbia Presbyterian Medical Center. She found an article on the deaths circulating in the most erudite place—an Internet discussion group of directors of clinical microbiology labs around the world. These are the people who, when a patient develops suspicious symptoms, are brought in to rule out things like anthrax.
Della-Latta, whom I know from past medical reporting, forwarded the article to me with a note: “See attached. FYI. Should I be concerned??? I’m off on a business trip to Italy tomorrow & next week. If I don’t return, write my obituary.”
She now says she doesn’t really believe there is any connection between the deaths. “It’s probably only coincidence,” she says, then adds: “But if we traced back a lot of things that we once dismissed as coincidence—foreigners taking flying lessons—we would have found they weren’t coincidence at all. You become paranoid. You have to be.”
Don Wiley’s body was finally found on Dec. 20, near Vidalia, La., about 300 miles south of where he disappeared.
The Memphis medical examiner, O.C. Smith, concluded that yellow paint marks on Wiley’s car suggest that he hit a construction sign on the Hernando DeSoto Bridge, as does the fact that a hubcap was missing from the right front wheel. Smith’s theory is that heavy truck traffic on the bridge can set off wind gusts and create “roadway bounce,” which might have been enough to cause Wiley to lose his balance after getting out of the car to inspect the scrapes. He was 6-foot-3, and the bridge railing would have only come up to mid-thigh.
“If Dr. Wiley were on the curb trying to assess damage to his car, all of these factors may have played a role in his going over the rail,” Smith said when he issued his report. Bone fractures found on the body support this theory. Wiley suffered fractures to his neck and spine, and his chest was crushed, injuries that are consistent with Wiley’s hitting a support beam before he landed in the water.
The Wiley family considers this case closed. “These kinds of theories are something that’s always there,” says Wiley’s wife, Katrin Valgeirsdottir, who has heard all the rumors. “People who want to believe it will believe it, and there’s nothing anyone can say.”
The Memphis police also consider the case closed, and the local office of the F.B.I. has turned its attention to other odd happenings. The talk of Memphis at the moment is the bizarre ambush of the city’s coroner last month. He was wrapped in barbed wire and left lying in a stairwell of the medical examiner’s building with a live bomb strapped to his chest.
Coincidentally, that coroner, O.C. Smith, is the same man who did the much-awaited, somewhat controversial autopsy on Don Wiley.
What are the odds of that?
August 11, 2002
Here is a mathematician’s nightmare I heard in the 1980s when that irritating, nonconforming, self-regarding provocateur Benoît Mandelbrot was suddenly famous—fractals, fractals everywhere. The mathematician dreamed that Mandelbrot died, and God spoke: “You know, there really was something to that Mandelbrot.”
Sure enough.
Mandelbrot created nothing less than a new geometry, to stand side by side with Euclid’s—a geometry to mirror not the ideal forms of thought but the real complexity of nature. He was a mathematician who was never welcomed into the fraternity (“Fortress Mathematics,” he said, where “the highest ambition is to wall off the windows and preserve only one door”), and he pretended that was fine with him. When Yale first hired him to teach, it was in engineering and applied science; for most of his career he was supported at I.B.M.’s Westchester research lab. He called himself a “nomad by choice.” He considered himself an experienced refugee: born to a Jewish family in Warsaw in 1924, he immigrated to Paris ahead of the Nazis, then fled farther and farther into the French countryside.
In various incarnations he taught physiology and economics. He was a nonphysicist who won the Wolf Prize in physics. The labels didn’t matter. He turns out to have belonged to the select handful of 20th-century scientists who upended, as if by flipping a switch, the way we see the world we live in.
He was the one who let us appreciate chaos in all its glory, the noisy, the wayward and the freakish, from the very small to the very large. He gave the new field of study he invented a fittingly recondite name: “fractal geometry.” But he wanted me to understand it as ordinary.
“The questions the field attacks are questions people ask themselves,” he told me. “They are questions children ask: What shape is a mountain? Why is a cloud the way it is?” Only his answers were not ordinary.
Clouds are not spheres—the most famous sentence he ever wrote—mountains are not cones, coastlines are not circles and bark is not smooth, nor does lightning travel in a straight line.
If you closely examine the florets of a cauliflower (or the bronchioles of a lung; or the fractures in oil-bearing shale), zooming in with your magnifying glass or microscope, you see the same fundamental patterns, repeating. It is no accident. They are all fractal. Clouds, mountains, coastlines, bark and lightning are all jagged and discontinuous, but self-similar when viewed at different scales, thus concealing order within their irregularity. They are shapes that branch or fold in upon themselves recursively.
I was following him from place to place, reporting a book on chaos, while he evangelized his newly popular ideas to scientists of all sorts. Wisps of white hair atop his outsize brow, he lectured at Woods Hole to a crowd of oceanographers, who had heard that fractals were relevant to cyclone tracks and eddy cascades. Mandelbrot told them he had seen the same channels, flows and back flows in dry statistics of rising and falling cotton prices. At Lamont-Doherty Geological Observatory, as it was then known, the geologists already spoke fractally about earthquakes. Mandelbrot laid out a mathematical framework for such phenomena: they exist in fractional dimensions, lying in between the familiar one-dimensional lines, two-dimensional planes and three-dimensional spaces. He revived some old and freakish ideas—“monsters,” as he said, “mathematical pathologies” that had been relegated to the fringes.
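The idea of a fractional dimension can be made concrete with the Koch curve, a textbook self-similar shape (an illustration of the general principle, not one of the specific cases above). Each refinement replaces every segment with four copies at one-third scale, so the measured length grows without bound as the ruler shrinks, and the similarity dimension works out to log 4 / log 3:

```python
import math

def koch_length(levels: int) -> float:
    """Total length of the Koch curve after `levels` refinements,
    starting from a unit segment: each step multiplies length by 4/3."""
    return (4 / 3) ** levels

for n in range(5):
    print(f"level {n}: length = {koch_length(n):.4f}")

# The curve's similarity dimension: 4 self-similar copies at scale 1/3.
dimension = math.log(4) / math.log(3)
print(f"similarity dimension = {dimension:.4f}")
```

The dimension comes out to about 1.26, strictly between the one-dimensional line and the two-dimensional plane, which is exactly the in-between territory Mandelbrot was describing.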
“I started looking in the trash cans of science for such phenomena,” he said, and he meant this literally: one scrap he grabbed from a Paris mathematician’s wastebasket inspired an important 1965 paper combining two more fields to which he did not belong, “Information Theory and Psycholinguistics.” Information theory connected to fractals when he focused on the problem of noise—static, errors—in phone lines. It was always there; on average it seemed manageable, but analysis revealed that normal bell-curve averages didn’t apply. There were too many surprises—outliers. Clusters and quirks always defied expectations.
It’s the same with brainwaves, fluid turbulence, seismic tremors and—oh, yes—finance.
From his first paper, a 1962 study of fluctuations in the rise and fall of cotton prices, until the end of his life, he maintained a simple and constant message about extraordinary economic events. The professionals plan for “mild randomness” and misunderstand “wild randomness.” They learn from the averages and overlook the outliers. Thus they consistently, predictably, underestimate catastrophic risk. “The financiers and investors of the world are, at the moment, like mariners who heed no weather warnings,” he wrote near the peak of the bubble, in 2004, in The (Mis)behavior of Markets, his last book.
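The distinction between mild and wild randomness can be sketched numerically. In the toy comparison below, the Gaussian stands in for the mild world and a Pareto distribution for the wild one; the tail index of 1.1 is an illustrative choice, not a value from Mandelbrot's cotton-price work:

```python
import random

random.seed(42)  # fixed seed so the demonstration is reproducible
N = 100_000

# "Mild" randomness: absolute values of standard Gaussian draws.
gauss = [abs(random.gauss(0, 1)) for _ in range(N)]
# "Wild" randomness: heavy-tailed Pareto draws (tail index 1.1, illustrative).
wild = [random.paretovariate(1.1) for _ in range(N)]

def top_share(xs):
    """Fraction of the total contributed by the single largest value."""
    return max(xs) / sum(xs)

print(f"Gaussian: largest draw is {top_share(gauss):.6%} of the total")
print(f"Pareto:   largest draw is {top_share(wild):.6%} of the total")
```

In the Gaussian sample no single draw matters; in the heavy-tailed sample one outlier can account for a sizable slice of the whole sum. That is Mandelbrot's point: plan from the averages and the outlier ruins you.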
Fractals have made their way into the economics mainstream, as into so many fields, though Mandelbrot was not really an economist; nor a physiologist, physicist, engineer….
“Very often when I listen to the list of my previous jobs, I wonder if I exist,” he said once. “The intersection of such sets is surely empty.”
December 21, 2010