THERMODYNAMICS
Two important developments in nineteenth-century physics that had immense significance, both practical and cosmological, were thermodynamics and electromagnetism. The Industrial Revolution had brought on the need to analyze heat engines, and the new science of thermodynamics was developed to describe the observations of heat phenomena. Based solely on observations of macroscopic mechanical systems involving the exchanges of heat and work, thermodynamics grew into a remarkably sophisticated mathematical science that applied directly to measurable quantities such as temperature, pressure, and density.
The two most important principles of thermodynamics are the first and second laws:
First law of thermodynamics
The change in the internal energy of a system is equal to the heat flowing into the system minus the work done by the system.
The first law follows from the principle of conservation of energy, which was developed in concert with thermodynamics. Heat was recognized as a form of energy, while work had been earlier defined as the useful application of a force. If you apply a force to a body to increase its speed, the work done on the body is equal to the increase in the body's kinetic energy (energy of motion). If the body experiences friction that slows it down, the loss of kinetic energy appears as the heat of friction.
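In the notation used today, with ΔU the change in a system's internal energy, Q the heat flowing into it, and W the work it does on its surroundings, the first law reads

\[
\Delta U = Q - W .
\]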
Second law of thermodynamics
The entropy of a closed system must stay the same or increase with time.
The second law was originally cast in terms of heat engines and refrigerators, to account for the fact that neither can be perfectly efficient, even though perfect efficiency does not violate energy conservation. That is, an engine cannot convert all its input heat into work. Otherwise we could build a perpetual-motion machine that gets all its energy from the environment. Similarly, a refrigerator or air conditioner cannot move heat from a lower to a higher temperature without work being done on it. Otherwise it would not have to be plugged into an electrical outlet. In 1865, Rudolf Clausius (1822–1888) recast these observed principles in terms of an abstract quantity called entropy, which is a measure of the disorder of a system.
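In modern terms, Clausius defined the change in entropy of a system that absorbs a small amount of heat δQ reversibly at absolute temperature T, and the second law then says that the total entropy of a closed system can never decrease:

\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T} , \qquad \Delta S \ge 0 .
\]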
Thermodynamics had a huge impact on the cosmological thinking of the nineteenth century, especially in regard to its connection with theology. At the time, many philosophers and theologians turned to the first and second laws of thermodynamics to argue for a finite, created universe. The first law was applied to show that the internal energy of the universe, composed of gravitational potential energy and kinetic (motional) energy, had to come from outside the universe.
The second law was applied to show that the universe cannot be infinitely old but had to have a beginning and, furthermore, would eventually die.1 This was called heat death, which occurs when all motion stops and the universe is as cold as it can get, that is, absolute zero.
The thermodynamic case for a divine creation made by many authors followed along these lines: First, if the universe were infinitely old, we would have already reached heat death when everything is in complete disorder, that is, maximum entropy. Second, the entropy of the universe was lower in the past and at some point must have been minimum (zero), which marked the birth of the organized cosmos. This, it was argued, proved that the universe not only had a beginning but also was supernaturally created. This followed from the fact that the universe was in complete disorder or chaos at that time, so the order that exists had to be imparted from the outside.
Hermann von Helmholtz, who wrote the definitive treatise on conservation of energy in 1847, elucidated what he saw as the fate of the universe in a lecture in Königsberg in 1854:
If the universe be delivered over to the undisturbed action of its physical processes, all force [by which he meant energy] will finally pass into the form of heat, and all heat come into a state of equilibrium. Then all possibility of a further change would be at an end, and the complete cessation of all natural processes must set in…. In short, the universe from that time onward would be condemned to a state of eternal rest.2
In 1868, Clausius formulated heat death in terms of entropy: “The entropy of the universe tends toward a maximum,” at which point “the universe would be in a state of unchanging death.”3
Not everyone was convinced. The eminent British physicist Lord Kelvin (William Thomson) acknowledged the idea of heat death, writing in 1862, “The result [of what he called the law of energy dissipation] would inevitably be a state of universal rest and death.” However, he questioned the conclusion, saying, “Science points rather to an endless progress, through an endless space, of action involving the transformation of potential energy to palpable motion and thence to heat, than to a single finite mechanism, running down like a clock, and stopping forever.”4 In other words, the law of dissipation of energy did not hold for infinite space. But he had no convincing argument for space being infinite.
Other scientists, notably the Scottish engineer and physicist William Rankine (1820–1872), one of the founders of thermodynamics (the absolute temperature scale in Fahrenheit units is called degrees Rankine), sought ways that heat death might be avoided. He conjectured that “radiant heat” might have special properties that allowed it to be refocused instead of dissipated.5
As for the origin of the universe, it was believed that a law of conservation of matter existed, so that the matter of the universe also had to come from someplace. In an address in 1873, James Clerk Maxwell, the great unifier of electricity and magnetism about whom we will hear later, who also happened to be an evangelical Christian, expressed a common view:
Science is incompetent to reason upon the creation itself out of nothing. We have reached the utmost limit of our thinking faculties when we have admitted that because matter cannot be eternal and self-existent, it must have been created.6
Basically, then, science as understood in the nineteenth century seemed to require that the universe began with a supernatural creation a finite time in the past and will for all purposes end a finite time in the future when everything comes to rest. And the reasons were good ones, resting on the best empirical and theoretical knowledge of the day. But, as we will see, none of the arguments hold up under the knowledge of today.
In the meantime, many physicists and philosophers of science, in particular Ernst Mach (1838–1916), adopted the doctrine of positivism in which anything unobservable is not physics but metaphysics and out of the purview of the experimental method. In an 1872 lecture, Mach asserted that no meaningful scientific statements can be made about the universe as a whole. Terms such as “energy of the world” and “entropy of the world” were meaningless because they could not be measured.7
In short, no consensus existed on the cosmological implications of thermodynamics. Most astronomers paid no attention to the matter. French philosopher and historian Pierre Duhem (1861–1916) made an interesting conjecture that turned out to be correct: Even if the second law requires that entropy increase with time, it need not have either a lower or an upper limit.8
ELECTROMAGNETISM
In the second significant development in nineteenth-century physics, electricity and magnetism joined gravity as basic forces of nature. Again, we find a combination of experiment and theory, in this case culminating in a set of equations written down in 1865 by Scottish physicist James Clerk Maxwell. Maxwell's equations combined a number of principles discovered by others, among them that electric forces are produced by electric charges, that magnetic forces are produced by moving charges (electric currents), and that changing magnetic fields induce electric forces.
Note that if an electric force is produced by a static charge and a magnetic force by a moving charge, then Galileo's principle of relativity requires that the two be equivalent. Which is which depends on the reference frame of the observer. This is a fundamentally important point rarely mentioned in physics classrooms or textbooks.
In physics, a field is a mathematical object that has a value at every point in space. That “value” can be a single number, such as the density or pressure in solids, liquids, and gases, in which case it is said to be a scalar field. Or, it can be a set of numbers. The Newtonian gravitational field, the electric field, and the magnetic field are vector fields that require three numbers to define each point in space—one number for the magnitude and two for the direction of the field. The gravitational field in Einstein's theory of general relativity is a tensor field requiring ten independent numbers to define.
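As a minimal sketch of the idea (my illustration, in Python, with made-up numbers), a field can be represented as a function that returns a value for every point in space; a scalar field returns one number, a vector field three.

```python
import math

def pressure(x, y, z):
    """A scalar field: a single number (a made-up pressure, in pascals) at each point."""
    return 101325.0 * math.exp(-(x**2 + y**2 + z**2))

def electric_field(x, y, z, q=1.0e-9):
    """A vector field: three numbers at each point.
    The Coulomb field of a point charge q (in coulombs) sitting at the origin."""
    k = 8.99e9                      # Coulomb constant, in N*m^2/C^2
    r = math.sqrt(x**2 + y**2 + z**2)
    scale = k * q / r**3            # k*q/r^2 times the unit vector (x, y, z)/r
    return (scale * x, scale * y, scale * z)

print(pressure(0.0, 0.0, 0.0))        # one number
print(electric_field(1.0, 0.0, 0.0))  # three numbers: magnitude and direction
```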
Faraday and Ampère had earlier demonstrated that electricity and magnetism are the same phenomenon, thus unifying two forces that had previously been thought to be separate. Maxwell's equations codified that fact. Maxwell's theory provides a complete description of the classical electromagnetic field. Given any distribution of electric charges and currents in any medium, one can use Maxwell's equations to compute the electric and magnetic fields at any point in space or in a material medium. With just one additional equation, provided by Hendrik Lorentz (1853–1928), one can then determine the electric and magnetic forces on a charged particle at any point in the field and, using Newtonian mechanics, predict the position and velocity of that particle at any time in the future (or the past, for that matter). Once again, we find corroboration of the Newtonian world machine.
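In modern vector notation, Maxwell's four equations for the electric field E and magnetic field B produced in a vacuum by a charge density ρ and current density J, together with Lorentz's force law for a charge q moving with velocity v, are

\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0} , \qquad
\nabla \cdot \mathbf{B} = 0 , \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t} , \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} ,
\]
\[
\mathbf{F} = q \left( \mathbf{E} + \mathbf{v} \times \mathbf{B} \right) ,
\]

where ε₀ and μ₀ are measured constants characterizing electricity and magnetism in empty space.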
Dramatic as this was, the even more dramatic result of Maxwell's equations was their prediction that in the absence of electric charges and currents, an electromagnetic field can still be present in empty space. Furthermore, that field will propagate through space as a wave with a speed c exactly equal to the speed of light in a vacuum. This number was not inserted into the model; it came out of the derivation. Thus, it was concluded that light is an electromagnetic wave, confirming the wave nature of light.
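Specifically, in empty space the equations reduce to a wave equation whose propagation speed is fixed entirely by the two measured constants of electricity and magnetism:

\[
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3.00 \times 10^{8} \ \text{meters per second} ,
\]

which is precisely the measured speed of light in a vacuum.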
Another consequence of Maxwell's theory was that electromagnetic waves extend indefinitely below and above the spectrum of visible light, which extends in wavelength from the violet end at about 430 nanometers to the red end at about 700 nanometers. The region below violet is called ultraviolet and the region above red is called infrared. Below the ultraviolet we have x-rays and below that, gamma rays. Above the infrared we have radio waves. In 1887, the German physicist Heinrich Hertz (1857–1894) transmitted electromagnetic waves with a wavelength of 8 meters, roughly ten million times longer than that of visible light, and determined that they moved at the speed of light.
Today, astronomy is conducted over the range of the electromagnetic spectrum from gamma rays with wavelengths as short as 10⁻¹⁸ meter (I have participated in gamma-ray observations) to radio waves with wavelengths of several kilometers.
The wavelength of light is usually represented by the Greek symbol λ. It is the distance between crests of the wave. The frequency f of a wave is the time rate at which the crests of a wave pass a given point. For light, fλ = c, where c is the speed of light in a vacuum. This expression holds for waves in general, where c is the wave velocity.
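For example, green light with a wavelength of 550 nanometers has a frequency

\[
f = \frac{c}{\lambda} = \frac{3.00 \times 10^{8} \ \text{m/s}}{550 \times 10^{-9} \ \text{m}} \approx 5.5 \times 10^{14} \ \text{hertz} ,
\]

while Hertz's 8-meter waves oscillate at only about 3.7 × 10⁷ hertz.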
ATOMS AND STATISTICAL MECHANICS
The nineteenth century saw not only the development of thermodynamics and electromagnetism but also the application of the atomic theory to the understanding of the behavior of bulk matter. Starting early in the century with John Dalton (1766–1844), chemists developed an atomic theory of matter that culminated in the periodic table of the chemical elements introduced by the Russian chemist Dmitri Mendeleev (1834–1907). However, chemists saw no empirical reason to associate their atoms with the particulate atoms proposed by the ancient Greeks that formed the basis of Newtonian mechanics, as described in chapter 2. The one feature the chemical atoms shared with those of the ancient atomists was that they were indivisible (atomos in Greek). They were called elements because chemists could not break them down into anything simpler.9
Meanwhile, physicists stuck with their predisposition toward particles. Austrian physicist Ludwig Boltzmann (1844–1906), along with Maxwell and American physicist Josiah Willard Gibbs (1839–1903), developed the theory of statistical mechanics, which was based on the particulate picture of matter. It was used to derive all the laws of thermodynamics by assuming that a macroscopic body is composed of a vast number of tiny particles moving around largely randomly, colliding with one another and the walls of any container according to the laws of Newtonian mechanics.
The laws of thermodynamics are thus said to be “emergent” principles—not fundamental principles of nature but derived from more fundamental principles. Indeed, the principles that govern all systems made of many particles, from those studied in fluid mechanics, condensed matter physics, and chemistry to those of biology, neuroscience, and even the social sciences, can be viewed as emergent. Even gravity is now being proposed as an emergent phenomenon rather than a fundamental force. (See chapter 15.)
No attempt is made in statistical mechanics to describe the motions of individual particles. This would be impossible. Instead, statistical methods were used to predict the average behavior of the ensemble of particles that constituted the system. Thus, the pressure on the wall of a container was associated with the average force per unit area produced by particles colliding with the wall. Absolute (Kelvin) temperature was identified with the average kinetic energy of the particles in a system when the system is in equilibrium.
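In modern notation, the kinetic theory of an ideal gas of N particles of mass m in a volume V at equilibrium makes both identifications quantitative:

\[
\left\langle \tfrac{1}{2} m v^{2} \right\rangle = \tfrac{3}{2} k_{B} T
\qquad \text{and} \qquad
P V = N k_{B} T ,
\]

where P is the pressure, T the absolute temperature, and k_B Boltzmann's constant.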
Statistical mechanics associated the chemical elements with particulate atoms. Chemical compounds composed from the elements were identified as molecules that formed when atoms combined.
Despite this success, the particulate theory of atoms was still challenged by a large number of influential chemists and philosophers, notably Ernst Mach. As mentioned earlier, Mach was a positivist who asserted that only objects that can be sensed can be subject to scientific study. He insisted he did not believe in atoms because he could not see them. Mach maintained that position until his death in 1916, by which time the indirect evidence for atoms was indisputable. Today, chemical atoms can be seen directly with the Scanning Tunneling Microscope.
Further evidence for the particulate nature of matter was provided by a series of laboratory observations culminating in an experiment by the British physicist J. J. Thomson (1856–1940) and colleagues in 1897 confirming that the rays emitted from the cathodes of vacuum tubes were charged particles far less massive than the previously known lightest object, the hydrogen ion. These were named electrons and were soon recognized to be the carriers of electric current. Since they flowed in the opposite direction to the current flow that had been arbitrarily labeled “positive,” electrons were specified to have negative electric charge. Today the electron is still considered to be one of the fundamental particles of matter.
VIOLATING THE SECOND LAW
Now, let's get back to the second law of thermodynamics. In 1872, Boltzmann derived what was called the H-theorem, which showed that a large ensemble of randomly moving particles will tend to move toward an equilibrium state in which a certain quantity H, which is proportional to the negative of the entropy, approaches a minimum. That is, Boltzmann essentially proved that the second law of thermodynamics follows from the laws of statistical particle mechanics.
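In modern notation, Boltzmann's quantity H is built from the distribution f(v, t) of particle velocities, and the theorem states that collisions can only drive it downward:

\[
H(t) = \int f(\mathbf{v}, t) \, \ln f(\mathbf{v}, t) \, d^{3}v , \qquad \frac{dH}{dt} \le 0 .
\]

Since the entropy of the gas is, up to constants, S = −k_B H, a falling H is the same thing as a rising entropy.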
However, Boltzmann's good friend and colleague Josef Loschmidt (1821–1895) saw a problem, which is called the irreversibility paradox. Newton's laws of motion work just as well with time running backward, so for every sequence of particle motions that increases entropy there is a reversed sequence that decreases it. A collection of randomly moving particles can therefore, in principle, accidentally find itself in a state of lower entropy even when it is part of a closed system.
In 1890, Henri Poincaré published what is called the recurrence theorem, which says that a confined dynamical system will, after a sufficiently long time, return arbitrarily close to its initial state. This directly contradicted Boltzmann's theorem and, therefore, seemed to disprove the second law of thermodynamics.
In 1867, Maxwell had expressed similar misgivings about the second law with his famous “thought experiment” in which an imaginary creature others called “Maxwell's demon” redirects particles to produce a state of lower entropy.
But demons or angels aren't needed. As Boltzmann eventually realized, his H-theorem, and thus the second law, are not hard-and-fast principles but simply statistical statements. On average, a closed system of many randomly moving particles will move to a state of maximum entropy, as Boltzmann proved, but statistical fluctuations can occasionally produce a state of lower entropy. In fact, if you have a system of just a few particles, that will happen often.
In everyday life, we are familiar with occurrences that are termed “irreversible.” Puncture a tire, and the air flows out. We never witness a flat tire reinflate by outside air flowing back through the puncture. Broken glasses don't reassemble. The dead don't come back to life.
However, think about these from a particulate point of view. The molecules of air outside a flat tire are moving about randomly. Suppose a large number of them just happen, by chance, to be moving in the direction of the hole in the tire. Then the tire could reinflate!
The reason we don't see this happen is not because it is impossible but because it is highly unlikely that trillions upon trillions of air molecules will all be moving in the right direction to reinflate the tire.
But suppose we have a closed container of just three particles. Outside the container is an environment of many particles of the same type. Open the container, and the three particles escape. As long as we keep it open, the chance of three particles finding their way back into the container is very high.
In other words, the second law is not inviolate. It is just a statistical statement.
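A toy calculation (my illustration, not Boltzmann's) makes the point concrete. Ask how often, by pure chance, all N molecules in a box happen to be in its left half at the same instant. Each molecule spends about half its time on the left, so the answer is roughly (1/2)^N: common for three molecules, effectively never for the trillions of trillions in a tire.

```python
import random

def fraction_all_left(n_particles, n_snapshots=100_000):
    """Estimate how often every particle happens to be in the left half of a box.

    Each particle is placed independently and uniformly, so the exact answer
    is (1/2) ** n_particles; the simulation just checks that by brute force.
    """
    hits = 0
    for _ in range(n_snapshots):
        if all(random.random() < 0.5 for _ in range(n_particles)):
            hits += 1
    return hits / n_snapshots

for n in (3, 10, 50):
    print(n, fraction_all_left(n), 0.5 ** n)
# With 3 particles the "low-entropy" arrangement turns up in roughly 1 snapshot
# in 8; with 50 particles it is already about 1 chance in 10**15; with ~10**23
# molecules it is, for all practical purposes, never.
```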
Boltzmann applied this realization to cosmology, speculating that if the universe were vast enough, entropic fluctuations could lead to pockets that deviate from equilibrium to produce other worlds such as ours with sufficiently low entropy to maintain and evolve order.10 So a live universe could rise from the heat death of the second law. And if one could, so could any number. He did not call it a “multiverse,” but he might well have.
THE ARROW OF TIME
Boltzmann also had another profound realization: the second law is not even a law! It's an arbitrary definition. The principle we are dealing with here is not that the average entropy of a closed system must increase with time, or, at best, remain constant. The principle is that the direction of time is by definition the direction in which the entropy of a closed system, namely the universe, increases. Arthur Eddington (1882–1944) would later dub this notion the arrow of time.
As we have seen, the reason we do not observe certain time-reversed processes is that they are very unlikely, not that they are impossible. Our everyday experience of decay and death seems to confirm the second law, but that is because we, and the world around us, are composed of great numbers of particles in mostly random motion. But when you deal with small numbers of particles, such as in chemical, nuclear, and elementary-particle reactions, events are observed to happen in either time direction.
THE END OF CLASSICAL PHYSICS
Physics as it stood at the end of the nineteenth century is generally referred to as “classical physics.” By that time, physicists had developed an almost, but not quite, complete theory of the physical world. The matter that makes up that world is composed of elementary particles, called atoms, which are associated with the ninety or so elements of the chemical periodic table. These particles interact with one another by means of two fundamental forces: gravity and electromagnetism, which are mathematically described with unlimited precision by Newton's law of gravity and Maxwell's equations respectively. The motion of every particle in the universe is then completely determined by these laws applied to whatever the position and velocity of the particle are at a given time.
As evidenced from planetary mechanics and the spectroscopy of stars, these same atoms and the theoretical principles that describe their behavior are identical throughout the universe.
ANOMALIES
Still, there were remaining unsolved problems. Maxwell's equations predicted the existence of electromagnetic waves that travel through space at the speed of light. Visible light was identified as those electromagnetic waves with wavelengths within a specific narrow spectral band, strongly confirming Huygens's wave theory (see chapter 3). In addition, waves well outside that band were observed that also traveled at the speed of light. However, the wave theory of light offered no explanation for three observed features of light: (1) line spectra, (2) blackbody radiation, and (3) the photoelectric effect.
We have already discussed line spectra, the very narrow dark lines that are seen as light passes through matter and the bright lines that are seen in the emission of light from hot bodies. These could not be understood in terms of the wave theory of light.
Blackbody radiation refers to the electromagnetic emissions from everyday objects. A blackbody exhibits a smooth spectrum with a peak that depends on the body's temperature. In the case of the very hot sun, its spectrum peaks near the center of the visible spectrum, at yellow. Advocates of the notion that the parameters of physics are fine-tuned so that humans would evolve would have us think that the spectrum of light from the sun was designed to peak in just the range where human eyes, patterned after God's, were most sensitive. More likely, our eyes evolved to be sensitive to the region around that peak, which is why we call it “visible.” Cooler bodies, such as you and me, radiate in the infrared region, with wavelengths longer than that of red light. Rattlesnakes and other pit vipers evolved to see infrared light—the better to catch warm-blooded prey in the dark; so to them infrared is “visible.” Unless they reflect light, these bodies appear black to us, hence the term “blackbody.”
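The quantitative rule at work here is a standard result known as Wien's displacement law: the wavelength at which a blackbody's spectrum peaks is inversely proportional to its absolute temperature,

\[
\lambda_{\text{peak}} = \frac{2.90 \times 10^{-3} \ \text{m} \cdot \text{K}}{T} ,
\]

which gives about 500 nanometers (in the visible) for the sun's roughly 5,800-kelvin surface and about 9,000 nanometers (far into the infrared) for a 310-kelvin human body.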
In 1905, Lord Rayleigh (John Strutt, 1842–1919) and James Jeans (1877–1946) used the classical wave theory of light to derive the spectrum of blackbody radiation. The calculation assumed that radiation results from oscillating charged particles inside the body. The shorter the wavelength, the greater the number of electromagnetic waves that can fit inside the body. Rayleigh and Jeans determined that the spectrum should fall off with the fourth power of wavelength.
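In modern form, the Rayleigh-Jeans result for the power radiated per unit wavelength is

\[
B_{\lambda}(T) = \frac{2 c k_{B} T}{\lambda^{4}} ,
\]

which indeed falls off as the fourth power of the wavelength at the long-wavelength end but grows without bound as the wavelength shrinks.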
However, the Rayleigh-Jeans model had a serious flaw. It predicted that the spectrum should increase indefinitely at shorter and shorter wavelengths. This is called the ultraviolet catastrophe. In fact, the measured spectra of all blackbodies fall off at either end (see figure 6.1).
The third observation that could not be explained by the wave theory involved ultraviolet light. Physicists, notably Hertz, discovered a wide range of phenomena in which ultraviolet light induces an electric current when the light is directed onto various metals. The curious thing was that, depending on the specific metal, a threshold wavelength of the light exists above which no current is produced. The wave theory of light offered no explanation for this.
And these were not the only problems with the wave theory of light. If light is an electromagnetic wave, what's doing the waving? The standard assumption was that electromagnetic waves are vibrations in an invisible, frictionless medium that smoothly pervades the universe and was identified with Aristotle's quintessence, the aether. But, as Maxwell himself noted, nothing in the theory of electromagnetism assumes the existence of the aether. Unlike the derivation of sound waves, which starts from the hypothesis of an elastic medium, Maxwell's prediction of electromagnetic waves included no such medium. The waves simply fell out of his equations of electromagnetism.
Starting in 1887, American physicists Albert Michelson (1852–1931) and Edward Morley (1838–1923) performed a series of experiments in which they attempted to detect the presence of the aether by measuring the difference in speed between two perpendicular beams of light, a difference that the principle of Galilean relativity predicts should result from Earth's motion through the aether. If Earth is moving through the aether toward the source of the light at speed v, the measured speed of that light should be c + v; if Earth is moving away, it should be c − v. They, and others who followed with increasingly more precise experiments, were unable to see the expected shift in the speed of light. Instead, they always obtained the same value, c.
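On the classical picture, light making a round trip along an interferometer arm of length L aligned with Earth's motion through the aether should take slightly longer than light traversing an equal perpendicular arm:

\[
t_{\parallel} = \frac{2L/c}{1 - v^{2}/c^{2}} , \qquad
t_{\perp} = \frac{2L/c}{\sqrt{1 - v^{2}/c^{2}}} ,
\]

a difference of order v²/c² that the Michelson interferometer was designed to detect but that never appeared.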
And so, while by the dawn of the twentieth century, physics had scaled unimagined heights, problems remained that would result in further unimagined developments and newly scaled heights.