It doesn’t make a difference what temperature a room is, it’s always room temperature.
—Steven Wright
There are few things more pleasurable, on Earth or in the heavens, than a good hot bath or shower. Its enveloping warmth is reassuring, replenishing, rejuvenating—and for a good reason: Heat implicitly means life. Life requires heat, though, like Goldilocks’s porridge, not too much heat, not too little, but just the right amount. The heat from a bath or a lover’s embrace is a reminder that life is good and, if only for a moment, all is well.
Heat, at its essence, is motion—the motion of atoms and molecules goaded into movement by electricity, compression, chemical reactions, nuclear forces, or one of many other sources of energy. Energy inevitably turns into motion, like kids on a sugar high, and the motion spreads from atom to atom, molecule to molecule, until the warmth is shared as equally as possible.
Temperature—the measure of heat—is our natural way of gauging how much energy is in something. An ice cube has little energy to offer. A sweltering hot day is buzzing with energy, though the ambient humidity may make you feel as though you’re drained of yours. We are constantly aware of temperature, because every aspect of our life and health depends on it.
And yet we actually recognize only a small sliver of the wide range of temperatures in nature. Our fragile bodies can handle only the smallest variation in heat. An object only 30 degrees hotter than our own internal temperature can cause significant burns, and if our body temperature drops even 10 degrees for any significant length of time, the result is catastrophic. Of course, our warm-blooded metabolism lets us regulate our body heat appropriately, so we perspire to cool ourselves, or generate cell heat as required, even resorting to the wild movements of involuntary shivering if necessary. But if these systems fail, either hyper- or hypothermia can set in, shutting down key chemical reactions in our organs and ultimately leading to death.
Other animals have adapted to deal with heat fluctuations in other ways. The North American wood frog doesn’t even try to get warm when winter sets in. Instead, as the temperature drops, it suffuses its cells and bloodstream with a cocktail of sugars and proteins that allows it to freeze solid without tissue damage. Once frozen, it shows no signs of life whatsoever: no heartbeat, no breathing, no kidney function. It is as dead as a stone . . . until the spring thaw, when some deep unknown signal miraculously tells everything to start up again, and in a matter of hours the frog is hopping about looking for a mate.
We’ve learned to survive in the harshest of both arctic and desert conditions, but even these are temperate compared with some places in space, or deep inside the crust of our planet. Temperature is energy in motion, and energy—as you may guess—runs a very wide gamut.
Measuring Temperature

If you were traveling to Sweden and heard the temperature was 22°, would you want to bring a jacket? Would you worry if, instead, the forecast read 295°? It all depends on the scale you’re using, of course: Celsius, Kelvin, or Fahrenheit.
A bolt of lightning can reach 50,000°F—hotter than the surface of the sun—and packs a punch between 100 million and 1 billion volts.
Scientists as early as the second century BCE discovered that certain substances, such as air, expand when heated, but it was not until the seventeenth century that scientists such as Galileo Galilei used this knowledge to build devices that would measure heat itself, called “thermo-meters.” When Isaac Newton made the radical suggestion to place marks on the thermometer in order to better record specific values, he prescribed that the zero mark should indicate melting ice and that 12 should reflect the temperature of the human body.
The value of 12 may seem odd (why not 10 or some other reasonable number?), but note that this duodecimal system is particularly handy when it comes to splitting evenly into sixths, quarters, thirds, and halves. Thus, the English standardized on 12 inches to a foot, 12 pence to the shilling, 12 units in a dozen, 12 dozens in a gross, and so on.
In 1714, a young glassblower-cum-physicist named Daniel Gabriel Fahrenheit hit upon several genius improvements. Instead of using sticky, imprecise liquids such as alcohol inside the thermometer, he used mercury. In order to encompass a wider range of values, he set the zero mark at the melting point of saltwater, which freezes at a significantly lower temperature than freshwater. And he enabled finer increments by increasing the top value, the body’s temperature, to 96°. (Once again, the choice of 96 makes sense only when you notice that it’s easily divisible by 2, 3, 4, 6, 8, 12, and so on.)
A few years later, as scientists decided the boiling point of water was a more important value than body temperature, Fahrenheit’s scale was fudged slightly. Water, it was declared, should boil at exactly 180 degrees above its freezing point of 32°, at 212°F. The adjustment was convenient (180 is also easily split into smaller fractions and matches the number of degrees in a half circle), but it required stretching Fahrenheit’s scale a little, so that our body temperature now lands at the awkward value of 98.6°F.
▲ Anders Celsius
Given that the melting and boiling points of water are such important measurements (at least here on Earth), why not set those values to 0 and 100? Such was the reasoning of the Swedish astronomer Anders Celsius in 1742. (Actually, to be accurate, he bizarrely set freezing at 100 and boiling at 0, but that was quickly rectified a couple of years later, soon after he died.) Because each division on this thermometer measured exactly one hundredth of the total scale, it was labeled as a “degree centigrade” (Latin for “hundred steps”). That nomenclature stuck for two hundred years, until, in 1948, the term was changed by the International Committee for Weights and Measures to “degrees Celsius.”
Each degree on the Fahrenheit scale equals 5/9 of a degree on the Celsius scale. That has the curious result that both scales converge at –40 degrees: –40°C = –40°F.
To convert Fahrenheit to Celsius, subtract 32, multiply by 5, and then divide by 9. To convert °C to °F, multiply by 9, divide by 5, and then add 32.
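Expressed as code, those two rules are one line each. Here’s a minimal Python sketch (the function names are ours, for illustration):

```python
# Temperature conversions, exactly as described above.

def fahrenheit_to_celsius(f: float) -> float:
    """Subtract 32, multiply by 5, then divide by 9."""
    return (f - 32) * 5 / 9

def celsius_to_fahrenheit(c: float) -> float:
    """Multiply by 9, divide by 5, then add 32."""
    return c * 9 / 5 + 32

def celsius_to_kelvin(c: float) -> float:
    """Same-size degrees; only the zero point shifts, down to absolute zero."""
    return c + 273.15

print(fahrenheit_to_celsius(98.6))  # 37.0 -- body temperature
print(celsius_to_fahrenheit(-40))   # -40.0 -- where the two scales converge
print(celsius_to_kelvin(22))        # 295.15 -- the Swedish forecast above
```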
Now that we had two different ways of measuring heat, why bother with a third? By the mid-eighteenth century, scientists had realized that there was a world far beyond that of the boiling and freezing of water. At first, there didn’t appear to be any limit to how hot or cold a substance could get. After all, if you could heat something to 1,000° Celsius (the temperature in a typical stovetop flame), then why not cool it to –1,000°C?
Unfortunately for this theory, as people were finding clever ways to cool nitrogen and other gases toward their freezing points, they discovered a curiosity: For each degree Celsius you cool a gas, its volume shrinks by a tiny, fixed amount—about 1/273 of its volume at 0°C. That led them to an intriguing conclusion: At that rate, the gas would disappear entirely, or at least take up zero space, when it reached –273°C. While their understanding of the science was still immature, they did correctly surmise that this must signify the coldest temperature possible.
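A minimal sketch of that nineteenth-century extrapolation, assuming (as they did) that the shrinkage continues linearly all the way down:

```python
# Charles's law reasoning: a gas loses about 1/273 of its
# 0-degree-Celsius volume for each degree of cooling.
def gas_volume(v_at_zero: float, t_celsius: float) -> float:
    return v_at_zero * (1 + t_celsius / 273)

print(gas_volume(1.0, -100))  # ~0.63 of the original volume
print(gas_volume(1.0, -273))  # 0.0 -- the "coldest temperature possible"
```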
▲ Sir William Thomson, later Lord Kelvin
Scientists attending the 1967 General Conference on Weights and Measures gave Lord Kelvin one of the greatest honors of all when they agreed to drop the word degree and just call the measurements “kelvins.” Thus, he joined a select group of inventors whose names have become uncapitalized measurements, such as watt, volt, ampere, and joule.
The data was compelling enough that the Scottish physicist William Thomson—who had gained fame and a knighthood for his work on the telegraph—suggested that this lowest low should be the new zero, an absolute zero. Sir William’s idea stuck, but you never hear about “degrees Thomson.” Rather, he was later raised to the House of Lords, assuming the heady title Lord Kelvin of Largs. (He lived in Largs, and his office was on the river Kelvin, which flows through Glasgow.) So scientists began to talk in terms of “degrees Kelvin”—a system in which each degree is the same “size” as a degree Celsius, but in which zero starts much lower.
Because heat is motion, and motion is energy, and there are many different forms of energy and reasons to discuss them, science has developed a number of other ways to describe and discuss heat in a system. We talk about calories, for example, and when discussing the heat from a burning gas we might talk about joules or numbers of BTU (British thermal units), where a single BTU describes the energy given off by a wooden kitchen match.
Fortunately, we don’t need this alphabet soup of heat descriptors to talk about how hot a proper cup of tea should be, or the weather on Venus. For common, daily usage, degrees Fahrenheit or Celsius do fine, and even when exploring the very cold or very hot, all but the geekiest discussions can use kelvins.
The Chaos Meter

Here’s an amazing magic trick you can try at home: Place a chunk of ice on a plate and leave it out on the counter. By mumbling numerous incantations (and just waiting awhile), that solid object transforms into—gasp!—a liquid. Wait long enough and that clear puddle—double gasp!—appears to vanish completely. Like any good magic trick, it seems miraculous until you know how it’s done, at which point it becomes ordinary. But allow yourself a moment to see the magic through fresh eyes: a solid turning to liquid turning to gas, all at the bidding of something as seemingly insignificant as heat.
For centuries, scientists assumed that heat was literally an element—an invisible fluid called “caloric” that traveled from object to object. The seventeenth-century English philosopher and father of liberalism John Locke suggested that heat was a form of kinetic energy—the motion of the tiny “insensible parts” of a substance.
If heat is motion (which we now know it is), then there is technically no such thing as “cold.” Obviously, one object may feel colder than another, but what you’re really talking about is heat—there is only heat, which can be added and removed, motion sped up or slowed down. In other words, one object isn’t really colder than the other; it’s just less warm.
The heat required to raise the temperature of 1 gram of water by 1°C is called a calorie. Don’t confuse that with the calories listed on a nutrition label; when you “burn calories” in your daily workout, you’re actually talking about kilocalories (or kcal, each one equaling 1,000 calories). One joule equals about one quarter of a single calorie.
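The relationships are easy to tangle, so here’s a quick sketch keeping them straight (4.184 is the standard joules-per-calorie conversion):

```python
JOULES_PER_CALORIE = 4.184  # so one joule is about a quarter of a calorie
CALORIES_PER_KCAL = 1000    # a food-label "Calorie" is a kilocalorie

def food_calories_to_joules(kcal: float) -> float:
    return kcal * CALORIES_PER_KCAL * JOULES_PER_CALORIE

# A 500-Calorie workout, expressed in joules:
print(f"{food_calories_to_joules(500):,.0f} J")  # about 2,092,000 J
```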
And here’s the clever part: The measure of heat is also a measure of how much chaos or order there is. Cooling a substance removes energy, allowing molecules to settle into rigid structures. Adding heat smashes up that architecture, leaving the somewhat dense broth we call a liquid; adding even more heat lets the molecules break free from one another entirely, a rapidly expanding gas of entropy, like a flock of birds rushing into the sky. Even the word gas itself stems from a Dutch pronunciation of the Greek word chaos.
It’s important to remember that at the atomic level, nothing ever stops moving. On a pleasant spring day, air molecules are flying at 1,150 mph (1,850 km/h), manically bouncing off one another’s electromagnetic fields, gently buzzing your skin, generating pressure and transporting heat. Even in a solid, like a crystal in which each atom is held tightly in place, the atoms never stop dancing, vibrations wriggling from atom to atom along the material’s internal degrees of freedom. And, in turn, each atom vibrates with activity, each electron a never-ending now-it’s-there-now-it’s-not blur of enterprise.
One result of this constant motion is that solids aren’t always as solid as we think they are, and molecules that you thought were pretty stable sometimes unexpectedly change from one phase to another. For example, some of the molecules in a block of ice will, even if kept below freezing, change into a liquid phase, then usually freeze again. Even stranger, if left alone, the ice will eventually evaporate, as frozen molecules escape directly into vapor in a phase-skipping process called sublimation. Similarly, in a gas, a few cooler molecules will spontaneously condense into a liquid, then return to gas again, in a continuous state of transmutation.
The most you can say about any material is that it tends toward a particular phase (solid, liquid, or gas) at a particular temperature and a particular pressure—for pressure, too, has a huge effect on phase. Compressing a gas raises its temperature—that’s why a bicycle tire gets warmer as you pump it full of air—and pressure also changes the melting or boiling point of a substance. Water boils at a lower temperature on top of a mountain than in a valley, as there is less air pressure to hold it in liquid form. Take a liquid far higher, into the low-pressure vacuum of space, and it instantaneously vaporizes, expands, cools, and then desublimates into tiny shards of crystal.
One thing is for certain, though: However matter changes, heat is never actually gained or lost; it is simply moved from place to place, or transformed from one kind of energy to another. This is at the heart of the laws of thermodynamics (a fancy way of saying “how heat moves”). Just as a gas always expands to fill its container, or people spread out to fill the space in an elevator, heat always spreads into cooler areas. Thus, when you use a thermometer to take your temperature, your body heat cools down a tiny bit as energy moves into the probe, until the two (body and device) are the same.
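That equalizing point is easy to compute. A minimal sketch, assuming the standard mixing formula and typical specific heats (the masses and temperatures here are illustrative):

```python
# Final shared temperature when two bodies exchange heat until equal:
# T_final = (m1*c1*T1 + m2*c2*T2) / (m1*c1 + m2*c2)
def equilibrium_temp(m1, c1, t1, m2, c2, t2):
    return (m1 * c1 * t1 + m2 * c2 * t2) / (m1 * c1 + m2 * c2)

# 200 g of 90 C water poured into a 150 g glass cup at 20 C
# (specific heats in J per gram-kelvin: water ~4.18, glass ~0.84)
print(round(equilibrium_temp(200, 4.18, 90, 150, 0.84, 20), 1))  # ~80.8 C
```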
Making Things Cold

Few people understand how a refrigerator—or an air conditioner, for that matter—makes the air cool. You can’t just take air and “add cold” to it; you need to suck the heat out of the air you already have. When you use a spray can, you notice something curious: The longer you spray, the colder the can gets. The reason is simple: The pressure inside the can drops when you press the button, and the lower the pressure, the less the molecules bounce around, and so the colder the gas becomes. Of course, after a moment, the can absorbs heat from the air around it, so the effect fades.
A refrigerator captures this same effect but carefully controls it—and keeps the chemicals in a closed loop in order to use them again. A substance such as liquid carbon dioxide or Freon is released from a compressed tube, through a spray nozzle, into a larger set of tubes, where the gas rapidly expands and becomes very cold, very quickly. Air from inside the refrigerator is blown over these tubes, transferring any heat to the colder-than-ice gas. The air circulating around your food (or room) gets colder, and the now warmer gas is then pumped out, compressed until it becomes very hot, and run through condenser coils, where all the heat is released into the room (or outside). By the time the gas gets to the end of these coils, it has returned to room temperature (and much of it has turned to liquid), but it’s still highly compressed, ready to repeat the process all over again.
▲ The refrigeration process
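Physics puts a hard ceiling on how efficiently any such cycle can pump heat. As a rough sketch, here is the textbook Carnot limit (an idealization, not a model of any real refrigerator):

```python
# Ideal (Carnot) coefficient of performance: units of heat moved per unit
# of work, COP = T_cold / (T_hot - T_cold), with temperatures in kelvins.
def carnot_cop(t_cold_c: float, t_hot_c: float) -> float:
    t_cold = t_cold_c + 273.15
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

# Fridge interior at 4 C, kitchen at 22 C:
print(round(carnot_cop(4.0, 22.0), 1))  # ~15.4; real fridges manage far less
```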
Meanwhile, while you’re cooling your food or home or whatever, you’re also pulling water out of the air—dehumidifying it. The cooler the air, the less moisture it’s able to hold, so water vapor collects around the cold pipes (a process called condensation)—just another phase change driven by heat.
Because cold air contains less moisture, mountain ranges in temperate areas receive far more snow than the north or south poles. The Antarctic is essentially a desert, with extremely dry air and even less precipitation than Phoenix, Arizona!
The funny thing about water (which makes up the vast majority of us, the food we eat, and the surface of our planet) is that it expands slightly as it cools toward freezing, and even more when it freezes. This is unusual, as most other substances grow denser as they cool and freeze. This property of water is one reason life does so well on Earth—after all, the bulkier, less dense ice floats instead of sinking, letting water freeze from the top down, so fish and aquatic plants can live protected from the cold.
Unfortunately, even though ice takes up only 9 percent more space than water, that’s enough to cause massive destruction where you least want it. When water leaks into a tiny crack and freezes, the force of the expansion can split rock or concrete; metal pipes can burst, glass can shatter. Organic materials like meat and vegetables—which you’d expect to be more pliable—often fare the worst, as sharp ice crystals rip open delicate cell membranes, rupturing their contents. The result, when thawed, is the flavorless, mushy mess that disheartens so many hopeful cooks.
In 1923, the inventor Clarence Birdseye found that quickly freezing a thin layer of fish fillets created much smaller and more evenly spaced ice crystals that avoided most of the cell damage. What works for fish works for other food, too, and by 1928, Americans were buying over a million pounds of frozen foods each year. In theory, flash-frozen food will keep indefinitely, but as we’ve learned, even ice cubes evaporate, and moisture slowly wicks away wherever cold air can get to it, causing unsightly (and tissue-damaging) freezer burn.
Denizens of cold climes know that stepping on snow causes little noise at air temperatures near 0°C, when a thin film of water lubricates the rubbing between the ice crystals. At much lower temperatures there is no water film, so the friction produces a relaxation oscillation called a squeak.
Of course, if you want to freeze something really fast, you put it in an environment far colder than normal ice. Take frozen carbon dioxide (CO2), for example, otherwise known as dry ice. The French scientist Charles Thilorier was the first to change the phase of this gas, in 1834, by placing it under incredible pressure (thus radically raising its boiling point), then releasing the pressure, dropping the temperature so quickly that it desublimated directly into a solid. It sounds like a simple process, but at the time, these kinds of experiments were dangerous; one of Thilorier’s assistants lost both legs when the apparatus exploded during testing.
You can buy dry ice at a grocery store, but don’t touch it: The surface is –79°C (–110°F), so cold that it burns, destroying your skin cells. However, it’ll keep a container cold for quite a while as the solid CO2 slowly, dryly sublimates back into a somewhat harmless gas.
▲ Phase change
The Hunt for Zero

Frozen carbon dioxide’s sublimation point of –79°C is pretty nippy, but in 1983, at the Vostok Station in Antarctica, the thermometer outside dropped to –89.2°C (–128.6°F)—currently the coldest naturally occurring temperature ever recorded on Earth. There have certainly been even colder events, but without the benefit of humans to record them.
At –100°C, rubber tires freeze solid—a fact recyclers put to good use, shattering them into tiny shards to be reused in other materials. About 80 degrees colder, the oxygen and nitrogen we breathe liquefy; and only 40 degrees colder than that, at –219°C (–362°F), they turn solid.
Liquid nitrogen and oxygen have a wide array of uses, as they freeze almost anything virtually instantaneously, perfectly preserving it in stasis. Cryobiologists commonly keep sperm cells, stem cells, and many other plant and animal tissues at –196°C (–320°F) indefinitely with little loss when thawed. And if you can freeze a cell, why not an organ, or even a whole body? In the early 1960s, a Japanese researcher froze a number of cat brains for days, weeks, and even months. After they were carefully warmed with a bloodlike substance, there were measurable (though brief) brain signals that were very similar to those from the original live brain.
Following this promising (though disquieting) evidence, cryonics experts have frozen hundreds of human bodies in liquid nitrogen in hopes that someday technology will advance enough to reanimate these people. Some customers choose to preserve only their brains (removed as quickly as possible after the moment of death), assuming that any future civilization capable of thawing a whole body could just as likely transplant the brain into a new body, or perhaps even create a new brain using the original, intricately woven neurons as a model.
Another benefit of compressed liquid air is that, when released, it boils from a compact form to a gas extremely rapidly—even explosively. In 1926, the American physicist Robert Goddard found a way to control this reaction, turning it into a fuel to propel a rocket—the same technique NASA and every international space agency has used to launch satellites, place astronauts on the moon, and boost shuttles to the International Space Station. Clearly there is great power in carefully managing the cold.
Brain freeze (also called an ice-cream headache, or sphenopalatine ganglioneuralgia) happens when cold food against the roof of the mouth makes capillaries contract, then expand rapidly, causing the trigeminal nerve to send a pain message to the brain. The trigeminal is a major facial nerve, which is why the pain is “referred,” felt across the face and head.
Space can get far colder than our temperate little planet, of course. The surface of Pluto is about –223°C (–369°F), and permanently shadowed craters at the moon’s poles have been measured at a few degrees colder than that. While it would seem like there should be no temperature at all in the dead of space, far out between the galaxies where no stars burn, astronomers have discovered that even “emptiness” has an amazingly consistent “background radiation” temperature of about –270°C, or –455°F—more easily notated as 2.7 kelvins. This ubiquitous but mysterious heat has fascinating implications for our understanding of how our universe was born and grew.
Shortly after the big bang (approximately 13.7 billion years ago), the universe contained a lot of material in a still relatively small space and glowed unbelievably brightly with extreme heat. Astrophysicists believe that the faint background radiation, which they can detect no matter where they point radio telescopes, is the remnant of those early days, like the heat on an oven’s walls long after the oven has been turned off. Curiously, on closer inspection, they’ve found that space is a tiny bit warmer in some places and a tiny bit cooler in others—for example, it may be 2.7249 K in one spot and 2.7250 K in another. These infinitesimal differences are likely the result of quantum fluctuations that have been stretched out by the inflation of the space-time continuum. In other words, as the universe has cooled and expanded, the massive differences between scorching and less-scorched spots that must have existed within the explosive fireball have all cooled and almost evened out, creating this pattern.
These temperatures are cold enough to freeze almost anything except helium, which solidifies, and then only under pressure, at 0.95 K (–272.2°C, –458°F)—a frontier scientists long considered final. In 1908, the Dutch physicist Heike Kamerlingh Onnes managed to liquefy helium, in an experiment that took seven years of preparation, followed by thirteen hours of slowly cooling the gas until it finally condensed. Referring to the explorers who were attempting to reach Earth’s poles during that same decade, Kamerlingh Onnes explained his passion for this work: “The arctic regions in physics incite the experimenter as the extreme north and south incite the discoverer.”
▲ The coldest known natural temperature in the universe is the Boomerang Nebula, a bow-tie-shaped expulsion of gas from a dying star about 5,000 light-years away in the constellation Centaurus. You’d expect an explosion like this to generate huge quantities of heat, but in fact the extremely rapid expansion of the gas into space had just the opposite effect, acting like the expansion tube in a refrigerator and leading to a temperature of about 1 K.
Where It All Gets Wacky

Reaching toward absolute zero is like one of those horrible nightmares where the faster you try to run, the slower you go. We know that zero kelvin (–273.15°C) is a hard stop we can never achieve—a physical extremity like the speed of light. At absolute zero, all atomic and subatomic motion would stop and all particles would come to rest with zero energy. As far as our current understanding of science can see, reaching absolute zero would violate one of the fundamental laws of quantum physics, Heisenberg’s uncertainty principle, which states that we cannot simultaneously pinpoint the exact position and momentum of any particle, even an electron.
This kind of boundary—where each step closer is exponentially more difficult—is called an asymptotic limit. In this case, as you get closer to zero, the very machinery you use to remove the last traces of heat starts generating heat of its own. But the challenge is worth the effort, for the world of the ultracold reveals some astonishing phenomena.
To play in this realm, you need cleverer ways of reducing heat than just compression and expansion. For example, common evaporation lowers surface temperature, which we humans take advantage of by sweating on a hot day. You can also suck heat out of a system by running a small electrical current through a junction of two different conductors, creating a temperature difference between two metal plates—the thermoelectric effect used in many portable camping coolers. But when working at extremely low temperatures, below 1 K, nothing compares to cooling atoms with lasers.
Laser cooling techniques, developed in the mid-1980s with names like Doppler or Sisyphus cooling, all work by focusing two or more intense beams of light at a tiny group of atoms. By precisely tuning the electromagnetic wavelengths of the lasers, scientists can nudge atoms in one direction or another by bombarding them with photons. One of the inventors of this method, the Nobel Prize–winning American physicist Carl Wieman, described the process as “like running in a hail storm so that no matter what direction you run the hail is always hitting you in the face . . . So you stop.”
In the 1980s, scientists achieved thousandths of a kelvin. In the ’90s, they slowed down the atoms even more, lowering the temperature to millionths of a kelvin, then hundreds of billionths, offering an unprecedented glimpse at the world of the supersmall. Luis Orozco, a physics professor at the University of Maryland, explained in a NOVA documentary: “It is as if I were to ask you, ‘Could you tell me something about the handles of a car that is passing on a highway at 50 or 60 miles an hour?’ Definitely you won’t be able to say anything. But if the car is moving rather slowly, then you would be able to tell me, ‘Oh yes, the handle is this kind, that color . . .’ At room temperature an atom is moving at roughly five hundred meters per second [about 1,100 mph]. However, if I slow it to a temperature that we can now achieve without much work in the lab, two hundred microKelvin, then the atoms start to move about twenty centimeters per second. Compared to something that’s rushing in front of you, you’d be able to look at a lot of the details, a lot of the internal structure of that atom.”
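Orozco’s numbers follow directly from the standard link between temperature and molecular speed. A sketch using the root-mean-square formula for a sodium atom (the exact speed depends on which average you take, so treat these as ballpark figures):

```python
from math import sqrt

K_B = 1.381e-23           # Boltzmann constant, J/K
M_SODIUM = 23 * 1.66e-27  # mass of a sodium atom, kg

def rms_speed(t_kelvin: float) -> float:
    """Root-mean-square thermal speed: v = sqrt(3 * k_B * T / m)."""
    return sqrt(3 * K_B * t_kelvin / M_SODIUM)

print(f"{rms_speed(293):.0f} m/s")     # room temperature: ~560 m/s
print(f"{rms_speed(200e-6):.2f} m/s")  # 200 microkelvin: ~0.47 m/s
```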
If the sun went out, Earth’s surface would cool to about –220°C, warmed slightly from heat coming from its core.
But it turns out that supercooling atoms encourages them to behave bizarrely—behavior that not only provides insights into the nature of matter but also may allow us to improve technology in extraordinary ways. For example, while some metals are better than others at conducting electricity, some materials, such as lead or buckminsterfullerenes, become superconductors when cooled to extremes. A superconductor doesn’t just allow electrical impulses to pass through it; it does so without offering any resistance—a current running through a loop of superconducting wire will never fade.
It’s unclear how superconductors can pull off this feat of perpetual motion, but it appears that as the temperature drops, the atoms vibrate less and electrons can slip through more easily. The electrons actually seem to group into pairs (called Cooper pairs), each tugging the other forward, when normally they would repel each other.
However it’s accomplished, superconductivity has led to incredibly powerful and precise magnets—magnets that today power MRI scanners and particle accelerators. Magnetically levitating (maglev) trains based on superconductor technology are still being tested, but they have already broken the world record for fastest-moving train, on an experimental track in Japan, at 581 km/h (361 mph). Someday, whole power grids may be based on superconductors, as estimates suggest that, in transferring electricity, 110 kilograms (250 lb) of superconducting wire could replace 8,100 kg (18,000 lb) of copper wire.
A second characteristic of supercooled atoms is that they can become superfluid—that is, at a certain point, the atoms in liquid helium begin to ignore friction. If you swirl a superfluid, it keeps swirling forever; if you spin its container, the superfluid inside remains motionless. A superfluid can escape through extremely small pores that would normally hold any liquid. Weirdest of all, because a superfluid does have surface tension—like all liquids, it has a slight attraction to the sides of a glass container—it gradually creeps up the sides of a cup until it flows out of its own accord, like a translucent creature that somehow knows how to escape confinement.
But these superpowers were just a small taste of the wonders scientists were about to find in the nanokelvin zone. In 1995, when the physicists Carl Wieman and Eric Cornell cooled a small collection of rubidium atoms to these extremes, they encountered a breakthrough moment: The atoms suddenly shifted phase—not to a liquid, or a solid, but to an entirely new state of matter, never before seen.
To understand this new state, we need to look back to 1924, when Albert Einstein and the Indian physicist Satyendra Nath Bose theorized that as individual atoms neared absolute zero, they would change in an extraordinary way. Quantum mechanics states that all atoms can be described as either particles (things) or waves (energy). Near zero, the theory went, atoms should begin to act less like particles and more like waves; then the waves would get longer until they overlapped, suddenly acting like a single wave—that is, as though all the atoms were one “superatom.”
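Those “waves getting longer” can be computed: it’s the thermal de Broglie wavelength, which grows as temperature falls. A sketch for rubidium-87, the atom Wieman and Cornell used (the overlap criterion here is only approximate):

```python
from math import pi, sqrt

H = 6.626e-34           # Planck constant, J*s
K_B = 1.381e-23         # Boltzmann constant, J/K
M_RB87 = 87 * 1.66e-27  # mass of a rubidium-87 atom, kg

def thermal_wavelength(t_kelvin: float) -> float:
    """Thermal de Broglie wavelength: h / sqrt(2*pi*m*k_B*T)."""
    return H / sqrt(2 * pi * M_RB87 * K_B * t_kelvin)

print(f"{thermal_wavelength(293):.1e} m")     # room temp: ~1e-11 m, smaller than the atom itself
print(f"{thermal_wavelength(100e-9):.1e} m")  # 100 nanokelvin: ~6e-7 m, long enough to overlap neighbors
```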
This new state, a “holy grail of cold” called the Bose–Einstein condensate (BEC), is what Wieman and Cornell had created in their lab. Like everything else about quantum physics, BEC is a counterintuitive mystery. The atoms still exist, yet they have expanded their size—their awareness, as it were—in a way that we still don’t understand.
The condensate behaves unlike any other material. The atoms vibrate—barely—in unison, a quantum lockstep that acts like a giant magnifying glass on what is usually far too small to see. In 1998, the Harvard physicist Lene Vestergaard Hau found that she could shine a laser into a BEC made of millions of sodium atoms and slow the light down to 61 km/h (38 mph)—a staggering drop from its normal speed of 300,000 km (186,282 mi) per second. A few years later she found a way to tune the BEC using lasers of specific wavelengths, letting her literally stop the light entirely, then release it again on its way.
Here’s how it works. The light pulse is converted to a hologram inside the condensate, literally creating a copy in matter, which can actually be transferred from one BEC to another nearby, like handing over a packet of information, before transforming back into light again. The implications are staggering and point to future quantum computers that may run on light instead of electricity.
On the other hand, these tiny condensates can also explode in an unexpected, extremely tiny version of a supernova—which scientists, recalling the 1960s Brazilian music boom, call a “Bosenova.”
What happens in environments even colder than the nanokelvin? We’re still finding out. After the Nobel laureate Wolfgang Ketterle trapped a cloud of sodium atoms in place with magnets in 2003, his team at MIT was able to laser-cool the gas to 500 picokelvins—half a billionth of a degree above absolute zero.
The physicist Juha Tuoriniemi at the Helsinki University of Technology’s Low Temperature Laboratory has taken a small piece of rhodium metal as low as 100 pK (1 × 10⁻¹⁰ K), but every supercold researcher today is focused on the next breakthrough: the femtokelvin, millions of times colder than the temperatures required to build a Bose–Einstein condensate. Scientists are hungry to see what surprises await in this often unpredictable realm where our everyday assumptions are superseded by the improbable results of quantum physics.
Some Like It Hot

It’s a common misconception that heat rises. In reality, dense things sink and less dense things get pushed out of the way—which typically means they float up, like bubbles in a drink. That holds true whether it’s air in a room or lava under Earth’s crust. The difference in density is caused, of course, by heat. Add heat and most substances expand, lowering their density as the atoms and molecules dance and twitch. (We’ve already seen that ice offers one exception to this; silicon is another. But they are truly oddities among the vast majority of materials.)
Solids aren’t in any position to move much, but everyone knows that running hot water over the metal lid of a tightly sealed jar makes it easier to open—the heat literally expands the metal, even if just a small amount. Engineers using steel in railroad tracks and bridges have to take this effect into account anywhere the ambient temperature is likely to rise or fall significantly. On a hot day, a beam may expand several millimeters in length, buckling if an expansion joint hasn’t been provided.
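The expansion follows a simple linear rule. A sketch, assuming a typical expansion coefficient for steel (about 12 parts per million per kelvin):

```python
ALPHA_STEEL = 12e-6  # linear expansion coefficient of steel, 1/K

def expansion_mm(length_m: float, delta_t_c: float) -> float:
    """Linear thermal expansion: delta_L = alpha * L * delta_T."""
    return ALPHA_STEEL * length_m * delta_t_c * 1000  # meters -> millimeters

# A 20-meter steel rail warming by 30 degrees C:
print(f"{expansion_mm(20, 30):.1f} mm")  # ~7.2 mm -- "several millimeters"
```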
A liquid or a gas offers plenty of latitude to move about, creating convection currents in which colder areas drop, get warmed, rise, lose some of their heat, and then drop again. This cycle is particularly helpful in cooking, but we can see it everywhere on Earth: weather patterns, the circulation of the ocean, hot soot rising up a chimney flue. Clearly, heat causes a lot of motion at the macro as well as the micro level.
At some point, if you add enough heat, you can force even more changes. A liquid boils, its molecules forcibly torn from one another until they fly off as a gas. Even solids can change dramatically. If you place an intense heat source under a piece of paper, those pressed-flat plant fibers undergo a radical transition: Around 150°C (300°F), the cellulose starts to decompose, breaking down into gases. We typically call this mixture of hydrogen, oxygen, and tiny carbon particles smoke. The more particulates released, the smokier it appears. Some substances in paper don’t burn without far more heat, of course, so some material remains, darkened, called “char.”
If you apply even more heat, an astonishing thing happens: The various molecules in the paper and gas get so excited that they break apart into atoms, which quickly recombine to form carbon dioxide, water vapor, and other molecules. These blindingly fast chemical reactions have an interesting side effect: They generate even more heat, so even if you remove the original heat source, the new gases are so hot that they cause even more molecules to break up. As long as you have fuel to burn, and oxygen for it to react with, the heat keeps the process cycling. Obviously, we know this amazing chain reaction by a simple name: fire.
The science-fiction author Ray Bradbury called his 1953 classic, about a man who burned books for a living, Fahrenheit 451. Of course, he could have named it the metric equivalent: Celsius 233. This is the temperature at which paper made from wood pulp begins to burn.
Fire has been held as magical for millennia, a gift to humankind so extraordinary that it must have been stolen from the gods, and assumed to be so fundamental that the ancients gave it elemental status alongside earth, air, and water. Now we know that fire is simply the transition from one state to another, and the flames we usually see are simply the gases and particles glowing with incandescent heat.
In fact, anything will glow when it gets hotter than 525°C (977°F), a threshold named the Draper point, after the nineteenth-century American chemist John William Draper, who first wrote about this effect. (Coincidentally, Draper was also fascinated by photography and is credited with producing the first clear photographs of a female face and of the moon.) To be accurate, anything cooler than 525°C glows, too, but in infrared light, so we can’t see it. At the Draper point, though, the light waves carry enough energy that we begin to see a faint red glow. Heat something up to 725°C (1,337°F) and the color becomes positively luminous—red hot, as we call it.
It’s easy to remember that red hot is about 1,000 K. At 3,000 K, a material glows bright orange; at 6,000 K, it turns yellow-white. Guess the surface temperature of our sun . . . right, just about 5,800 kelvins. If the sun’s surface were hotter, it would appear bright white, or even—at 10,000 K—blue. This color spectrum is called black-body radiation, and it describes how thermal energy gets partially converted into electromagnetic energy, photons that will travel through space indefinitely. As long as there is movement, there is heat, and where there is heat, there is light.
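That temperature-to-color ladder comes straight from black-body physics. A sketch using Wien’s displacement law (which gives only the peak wavelength; a glowing object emits a whole spectrum around that peak):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, meter-kelvins

def peak_wavelength_nm(t_kelvin: float) -> float:
    """Wien's law: the wavelength a black body emits most strongly."""
    return WIEN_B / t_kelvin * 1e9

print(f"{peak_wavelength_nm(5800):.0f} nm")  # the sun: ~500 nm, mid-visible
print(f"{peak_wavelength_nm(1000):.0f} nm")  # "red hot": ~2900 nm, peak still in
                                             # the infrared; only the spectrum's
                                             # short tail shows as a red glow
```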
Granted, some flames burn hot but we can’t see them—or can barely see them. A pure hydrogen fire burns clear as the gas combines with oxygen in the air and turns into water vapor; a pure ethanol fire burns so hot that the blue is almost indistinguishable in the bright light of day. Plus, different chemicals release different colors when heated, explaining the rich tonal range of a wood fire. And sometimes the color of a flame doesn’t tell the whole story: The blue section near the base of a match flame is technically hotter than the yellow tip, but due to a number of real-world factors (like air flow), it’s generally easier to light a candle using the cooler tip.
Absolute Hot

A candle flame is plenty warm enough for most of us, but it’s only the beginning when it comes to the spectrum of hot. Because heat is just another form of energy, you can raise an object’s temperature in all kinds of ways, from running an electrical current through it to blasting it with microwaves.
“By convention sweet is sweet, by convention bitter is bitter, by convention hot is hot, by convention cold is cold, by convention color is color. But in reality there are atoms and the void. That is, the objects of sense are supposed to be real and it is customary to regard them as such, but in truth they are not. Only the atoms and the void are real.”
—Democritus, Greek philosopher
One way to get a gas hot is by compressing it, which also raises its boiling point so that it may return to a fluid state. If you apply enough pressure and heat, you create something called a supercritical fluid—not different enough to earn status as a new form of matter but nevertheless possessing some very cool properties.
For example, a supercritical fluid can dissolve materials like a liquid and also pass through semiporous solids like a gas. If you infuse a batch of green coffee beans in a high-pressure bath of nontoxic hot carbon dioxide (CO2), the supercritical fluid seeps through the beans, absorbing caffeine and drawing it out. Then release the pressure, and the CO2 suddenly flashes back into a gas, leaving the decaffeinated beans ready for roasting. Supercritical fluids are used in dry cleaning, essential oil extraction, dyeing, and all sorts of other applications.
However, if you warm up a gas even further, you do, in fact, achieve a new, fifth form of matter: plasma. At first, you may not notice any difference between a gas and a plasma, but the latter contains a bunch of atoms that have ionized—they’ve gotten so excited that they’ve broken free from their molecular relationships and stripped off some of their electrons, like partiers throwing caution to the wind.
This high-temperature concoction of positively charged ions and negatively charged electrons displays some interesting characteristics. To start with, you can run an electrical current through it, which is what makes neon signs, plasma televisions, and fluorescent lightbulbs glow. Switch on the power, and the molecules in the gas are turned into a superhot plasma. Fortunately, the gas is under such low pressure (there aren’t that many atoms buzzing about) that the total heat contained in the lamp isn’t enough to melt anything around it.
In these applications, the color we see doesn’t come from the heat that the lamps generate; instead, the plasma causes phosphorescent chemicals painted on the glass to luminesce. But in some other instances, the gaslike substance itself lights up as atoms and electrons reunite, resulting in bright light and intense heat. Plasma cutters, which spray a high-speed stream of electrified gas plasma through a nozzle, can cut through steel up to 6 inches (150 mm) thick.
We all know of another common plasma: the sun. In fact, all stars are made of plasma. And, weirder, most of the free-floating gas sparsely spread between planets and stars is in a plasma state. Astronomers estimate that as much as 99.9 percent of all the visible matter in the universe is plasma.
But just because something is superhot doesn’t mean it’s going to be plasma. Earth’s core is hot—at 6,650°C (12,000°F), it’s hotter down there than the surface of the sun—but gravity tugs on each atom across thousands of kilometers, creating intense pressure in the center, which is very hot but solid. The larger the mass, the more heat is generated, so Jupiter’s core temperature is estimated to be as high as 20,000°C (36,000°F).
Yet that’s a cold shower compared with what happens inside the blast of an atom bomb or nuclear reactor, where temperatures of millions of degrees can be achieved through fission—the splitting of heavy atoms like uranium into smaller ones. And fission, in turn, is somewhat pitiful compared with the universe’s real power: fusion.
Revelation 21:8 implies that Hell has lakes of brimstone (sulfur). At sea level, sulfur boils to gas at 444.6°C (832°F). However, far underground, at extremely high pressure, sulfur can stay liquid as high as 1,040°C (1,904°F).
When a huge quantity of hydrogen and helium gas floating in space finds itself collected by gravity into a small enough ball, the atoms begin to smash into one another more and more rapidly. When the pressure and temperature are great enough—say, about 10 million K (about 18 million °F)—the hydrogen atoms fuse together, their nuclei joining to create new helium atoms. Sounds simple, but in the process a lot of energy is released—where “a lot” can be defined as mind-boggling, flabbergasting quantities that result in a fireball called a star.
The sun’s core registers at about 15 million K (27 million °F), where hydrogen is consumed at about 600 million tons per second. That means the sun, which is only about 4.6 billion years old, is currently in its midlife, about halfway through burning out.
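A back-of-envelope check on those numbers: in hydrogen fusion, about 0.7 percent of the fused mass becomes energy via E = mc² (that fraction is the standard hydrogen-to-helium mass deficit). A sketch:

```python
C = 3.0e8                  # speed of light, m/s
MASS_FUSED_PER_S = 6.0e11  # 600 million metric tons of hydrogen, in kg
MASS_TO_ENERGY = 0.007     # fraction of fused mass converted to energy

power = MASS_FUSED_PER_S * MASS_TO_ENERGY * C**2
print(f"{power:.1e} W")  # ~3.8e26 watts, matching the sun's measured output
```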
Fusion is not impossible on Earth, however. That’s how the H-bomb works: A fission-based atom bomb blows inward on a small amount of prepared hydrogen, causing such intense heat that it fuses. The results, at temperatures reaching over 100 million K, are devastating. That said, if we could control fusion—or even better, create fusion at temperatures that don’t require radioactive explosions to ignite—we would have a never-ending supply of energy. It’s a deeply tempting prospect, one at the forefront of many a scientist’s mind these days.
Just as Jupiter outsizes Earth, many other stars dwarf our sun. A spectrographic analysis of the heavens reveals stars with surface temperatures as high as 50,000 K—nearly ten times the sun’s—and cores as high as 2 billion K. In theory, stars could get even larger and hotter, but these nuclear furnaces create something more than just heat and light, something very tiny that can nevertheless limit the temperature a star may reach: neutrinos.
Neutrinos are so unbelievably small and slippery that they can travel at near the speed of light through virtually anything. Each second, the sun releases about 2 × 10³⁸ of these little guys (that’s 200 trillion trillion trillion neutrinos), and about 65 billion of them end up passing through each and every square centimeter of Earth. That’s trillions of neutrinos passing through you right now. It doesn’t matter if it’s nighttime and the sun is shining on the other side of the planet; neutrinos can literally cruise through Earth, meandering among the atoms, unaffected and unaffecting.
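Those two figures hang together, and checking is a one-liner: spread 2 × 10³⁸ neutrinos per second over a sphere the size of Earth’s orbit. A sketch:

```python
from math import pi

NEUTRINOS_PER_SECOND = 2e38
EARTH_ORBIT_RADIUS_CM = 1.496e13  # one astronomical unit, in centimeters

flux = NEUTRINOS_PER_SECOND / (4 * pi * EARTH_ORBIT_RADIUS_CM**2)
print(f"{flux:.1e} per cm^2 per second")  # ~7e10, in line with the
                                          # 65 billion quoted above
```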
Neutrinos are small, but they do carry a little bit of energy away from a star, and as a plasma reaches about 4 billion K, the atoms become so energetic that the neutrino production actually begins to cool the star significantly. However, if a star becomes massive enough, and hot enough to reach about 6 billion K, the heat triggers such a massive release of neutrinos that the star collapses and then explodes into a colossal supernova. So 6 billion K effectively sets the maximum temperature for a star.
During a supernova, though, things can get crazy and all bets are off on the temperature scale. In 1987, astronomers witnessed a supernova in the Large Magellanic Cloud (one of the handful of galaxies close enough for us to see with our unaided eye). By careful analysis, they determined that the temperature inside the explosion reached about 200 billion K.
So was that the hottest thing in the universe? Far from it. In fact, you can find a hotter spot just an airplane ride away.
Like obsessive psychoanalysts, many physics researchers insist that the only way to truly understand our universe is to look back into its infancy, to see what crazy things happened within the first second after its birth. Back then the heat must have been unimaginably greater than that of a puny supernova—somewhere in the trillions of degrees. At that temperature, atoms should not only shed their electrons, not only break into their constituent protons and neutrons, but literally melt into a plasma of quarks and gluons—a seething, primordial broth of elementary particles.
With that in mind, physicists at the Relativistic Heavy Ion Collider at the Brookhaven National Laboratory on Long Island, New York, have been accelerating heavy gold ions around a huge underground ring, speeding the ions up to 99.99 percent of the speed of light, and smashing them into one another. The result is an enormous quantity of heat in a very small space, and in 2010 they achieved a record-setting 4 trillion K (over 7 trillion °F). The experiment confirmed the science, creating a quark-gluon plasma. However, the scientists were surprised to find that the result was more like a “quark soup” than a gas; subsequent calculations indicated that a million times more heat would likely be required to boil it.
On the outskirts of Geneva, Switzerland, CERN’s Large Hadron Collider is currently attempting to do just that. Scientists have already achieved trillions of degrees by smashing heavy lead atoms together; how long until the quadrillion- or quintillion-degree mark is smashed, too?
▲ Large Hadron Collider
There is, nevertheless, a theoretical upper limit on the thermometer. We know that the hotter particles get, the faster they move. But Einstein also figured out that as particles approach the speed of light, they increase in mass. Keep increasing the temperature, and at some point each particle of matter would become so dense that it would collapse into its own black hole, causing a minor disruption in . . . well, pretty much everything. The German physicist Max Planck calculated that this would happen at about 1.4 × 10³² K. That’s 140 million million million million million degrees. And that, at least in this universe, is absolute hot.
▲ Max Planck
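Planck’s number isn’t arbitrary; it falls out of combining the fundamental constants of nature. A sketch of the calculation, using the standard formula for the Planck temperature:

```python
from math import sqrt

HBAR = 1.055e-34  # reduced Planck constant, J*s
C = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3/(kg*s^2)
K_B = 1.381e-23   # Boltzmann constant, J/K

# Planck temperature: T_P = sqrt(hbar * c**5 / (G * k_B**2))
t_planck = sqrt(HBAR * C**5 / (G * K_B**2))
print(f"{t_planck:.2e} K")  # ~1.42e32 K: absolute hot
```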
The Creator and Destroyer

It is no surprise and no coincidence that virtually every religious tradition describes a divine life-giving warmth that cleanses and sanctifies—but that can also punish or annihilate. For heat is the creator, and every atom in your body and beyond was fused in the fiery depths of a star, often at the moment of its own supernova. And just as surely, heat is the destroyer, rending the elements apart, ending one form and transmuting it into another.
Heat is also the mover and shaker, allowing energy to radiate, infuse, and enable reactions throughout the universe. Without it, molecules could not have bonded together to form the amino acids and other fundamental structures that led to the spark of life, nor could the myriad chemical reactions required to sustain that life—your life—endure.
And yet, watching the ignition of Trinity, the first test of an atomic bomb in 1945—the heat from which melted the New Mexico desert into a crater of radioactive glass 300 meters (1,000 ft) wide—the physicist Robert Oppenheimer recalled lines from the holy Hindu scripture the Bhagavad Gita:
If the radiance of a thousand suns
Were to burst at once into the sky,
That would be like the splendor of the Mighty One . . .
I am Death the destroyer of worlds.
—Bhagavad Gita, chapter 11:12, 32
Nevertheless, heat offers hope, known by any hiker in the wilderness waking to a new sunrise. But removing heat, cooling elements, also offers hope: relief from a scorching daylight, or—in the laboratory—the glimmer of a possibility that we will better understand the building blocks from which we are all made. Cold creates order, though the very cold appears to create new disorders that we’re just beginning to comprehend.
We are all phoenixes, born from the ashes, living in the radiant glow of the sun—“the force,” as Dylan Thomas wrote, “that through the green fuse drives the flower.” Heat is the spectrum of life, however you measure it.
Additional Material
Agni, the Hindu god of fire from the 3,500-year-old Rig Veda scriptures, represents the essential life force in the universe, the creator of the sun and stars and the receiver of burned sacrifices, consuming and purifying so that other things may live. Agni may have also given birth to something else: the Latin word ignis (“fire”), which begat our English words ignite and igneous (“having formed from lava”).
Absolute hot (Planck temperature) 1.42 × 10³² K
Melting point of hadrons into quark-gluon plasma 2 trillion K
Everything 1 second after big bang 10 billion K
Thermonuclear weapon peak 350 million °C
Sun’s core 15 million °C (27 million °F)
Lightning bolt 28,000°C (50,000°F)
Center of Earth 6,650°C (12,000°F)
Surface of sun 5,500°C (10,000°F)
Filament inside light bulb 2,500°C (4,600°F)
Natural gas (methane) flame on a stovetop 1,200°C (2,200°F)
Lava 1,100°C (2,000°F)
Wood fire 900°C (1,650°F)
Draper point (where almost all solid materials begin to visibly glow) 525°C (977°F)
Melting point of lead 328°C (621°F)
Kitchen oven 288°C (550°F)
Cellulose-based (book) paper burns 233°C (451°F)
Water boils 100°C (212°F)
Hottest shade temperature recorded on Earth 58°C (136°F)
Human body temperature 37°C (98.6°F)
Room temperature 20°C (68°F)
Water freezes 0°C (32°F or 273 K)
Mercury in a thermometer freezes –39°C (–38°F)
Coldest temperature recorded on Earth –89°C (–129°F)
Alcohol freezes –114°C (–173°F)
Gasoline freezes –150°C (–238°F)
Boiling temperature of oxygen –183°C (–298°F)
Temperature on Neptune –220°C (–364°F)
Coldest spot on moon –228°C (–378°F or 45 K)
Cosmic microwave background 2.725 K
Coldest natural temperature known (Boomerang Nebula) 1 K
Coldest measured temperature 100 pK
Absolute zero 0 K (–273.15°C or –459.67°F)
Note: Many values here are approximate, as temperatures can vary.