DEVELOPING THERMOMETERS

Taking the Temperature

When Galileo created a device he called a thermoscope in 1593, it was really just an interesting toy. Galileo’s device had no markings to indicate degrees; that task fell to his friends Santorio Santorio and Gianfrancesco Sagredo, who were the first to apply a crude numerical scale to the thermoscope, making it the first true air thermometer. At the same time, similar devices were being developed all over Europe, but each inventor worked independently, so there was no universally agreed-upon scale of measurement.

A few decades later, the Amsterdam instrument maker Daniel Gabriel Fahrenheit began producing more responsive alcohol thermometers. Still, there was no temperature scale everyone could agree on. After Fahrenheit invented an even more sensitive thermometer, this one filled with mercury, he decided a universal temperature-measuring system was needed.

Fahrenheit filled a container with salt and ice water to obtain the lowest temperature he could and called that point 0°F. He then measured the temperature of melting ice without the salt and assigned it a value of 30°F, and he marked his own body temperature at 96°F. When he later added the boiling point of water to his scale, at 212°F, he changed the melting point of water to 32°F so that the scale would span an even 180 degrees between the melting and boiling points of water, the same as the angle of a straight line. On the new scale, body temperature came out to 98.6°F.

HOT COMPETITION

Seems pretty confusing, doesn’t it? Maybe that’s why Swedish astronomer Anders Celsius suggested a simpler system in 1742. His method divided the difference between water’s melting and boiling points into one hundred equal parts. Using the Celsius scale, room temperature is around 25°C, while a hot summer day can reach 30–38°C. Water freezes at 0°C and boils at 100°C. What could be easier?
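Since both scales pin down the same two points, melting ice and boiling water, converting between them is simple arithmetic: a Celsius degree is 180/100, or 9/5, the size of a Fahrenheit degree, and the Fahrenheit scale sits 32 degrees above zero at the ice point. Here is a minimal Python sketch of that conversion (the function names are ours, purely for illustration):

    def celsius_to_fahrenheit(c):
        # Scale by 180/100 = 9/5, then shift up to the 32 degree ice point.
        return c * 9 / 5 + 32

    def fahrenheit_to_celsius(f):
        # Undo the shift, then undo the scaling.
        return (f - 32) * 5 / 9

    print(celsius_to_fahrenheit(0))    # 32.0  -- water freezes
    print(celsius_to_fahrenheit(100))  # 212.0 -- water boils
    print(celsius_to_fahrenheit(37))   # 98.6  -- normal body temperature

Note how an input of 37°C reproduces the 98.6°F body-temperature figure from Fahrenheit's recalibrated scale.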

Since there were now two temperature scales to choose from, there was no need to further complicate matters, right? Sir William Thomson, the mathematician and physicist better known as Lord Kelvin, apparently didn’t think so. Kelvin wanted to eliminate the need for negative numbers when measuring temperature, so in 1848 he devised a temperature scale that started at the lowest possible temperature, absolute zero. That works out to −273.15°C, or −459.67°F. His scale is called (surprise!) the Kelvin, or thermodynamic, scale.
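And since a kelvin is exactly the same size as a Celsius degree, the Kelvin scale needs no rescaling at all, only a shift of the zero point. A quick sketch in the same spirit as the one above:

    def celsius_to_kelvin(c):
        # A kelvin is the same size as a Celsius degree; only the zero moves.
        return c + 273.15

    print(celsius_to_kelvin(-273.15))  # 0.0    -- absolute zero
    print(celsius_to_kelvin(0))        # 273.15 -- water freezes
    print(celsius_to_kelvin(100))      # 373.15 -- water boils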

Absolute Zero


The Third Law of Thermodynamics says that absolute zero can never actually be reached, even in space. And while there doesn’t seem to be any upper limit on temperature, there is a firm floor: deep space never gets below about 2.7 K, thanks to the background radiation left over from the Big Bang. If anything could reach absolute zero, essentially all motion, right down to the atomic level, would stop.


The science of measuring temperature has come a long way since then. Bimetal thermometers depend on the different expansion and contraction rates of two metals to move a pointer that indicates temperature. Maximum-minimum thermometers record the highest and lowest temperatures reached over a given period. Radiometric thermometers measure temperature by reading an object’s radiation emission spectrum. Liquid-crystal thermometers change color with the temperature, like a 1970s mood ring. These days, the thermometer you’re most likely to encounter is electronic. Such a thermometer uses a thermistor, an internal element whose electrical resistance changes predictably with temperature; the readout circuitry measures that resistance and converts it into degrees.
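To give a flavor of how that last conversion works: a common simplified model of thermistor behavior is the beta-parameter equation, 1/T = 1/T0 + (1/B) ln(R/R0), with temperatures in kelvins. The sketch below assumes a hypothetical thermistor with a nominal resistance of 10,000 ohms at 25°C and a beta constant of 3950 K; these are typical catalog values, not figures from this article.

    import math

    def thermistor_temp_c(resistance_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
        # Beta-parameter model: 1/T = 1/T0 + (1/beta) * ln(R/R0), T in kelvins.
        # r0, t0_c, and beta are assumed nominal values for a hypothetical part.
        t0_k = t0_c + 273.15
        inv_t = 1.0 / t0_k + math.log(resistance_ohms / r0) / beta
        return 1.0 / inv_t - 273.15

    print(round(thermistor_temp_c(10_000.0), 1))  # 25.0 at the nominal point
    print(round(thermistor_temp_c(5_000.0), 1))   # about 41.5; resistance falls as it warms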