Science and peace will triumph over ignorance and war.
—LOUIS PASTEUR, 1892
Science will flourish only in a society that cherishes…the reason, openness, tolerance, and respect for the autonomy of the individual that distinguish the social process of science.
—GERALD PIEL, 1986
Although the twentieth century produced unprecedented improvements in health, wealth, and welfare, it also saw the rise of totalitarian regimes that killed a hundred million people, threatened the survival of the liberal democracies—and ravaged ideas as well as lives, fomenting misapprehensions about liberalism and science that persist to this day. Liberalism began to look quaintly old-fashioned, an eighteenth-century indulgence in a world whose future was widely presumed to belong to fascism, communism, or some other form of socialism. Science, previously esteemed, was blamed for the machine guns that mowed down young men on the battlefields, the napalm that incinerated cities by night—and, of course, for the nuclear bombings of Hiroshima and Nagasaki. Even many scientists adopted such views. “The physicists have known sin,” declared J. Robert Oppenheimer, chief scientist of the Manhattan Project, adding that in making “an evil thing,” they had “raised again the question of whether science is good for man.” His colleague Philip Morrison expressed concern about “a latent but growing feeling that science is somehow turning evil or blind. The people have the right to ask why must we do research if the outcome is the ruin of Hiroshima and its hundred thousand blackened corpses.” The mathematician and philosopher Michael Polanyi spoke of “the destructive potentialities of the scientific outlook” being realized in wars “which shattered our belief in liberal progress.” It began to be said, and not just on the fringe, that scientists ought to be held accountable for the consequences of their research, and that governments should intervene to channel research toward socially responsible goals. Such an approach was thought to be working in the totalitarian states, which were perceived as more efficient than the democracies at promoting scientific and technical advancement—as witnessed by the Nazi deployment of wartime wonders like the V-1 and V-2 rockets, and the Soviet Union’s acquisition of thermonuclear weapons and its launch of the Sputnik earth satellite.
If, indeed, totalitarianism nourished science and technology more efficiently than liberalism did, then the future of liberalism looked dark. But was this the case? How did science fare in Nazi Germany, the Soviet Union, and communist China?
Germany prior to the rise of the Nazis went through fitful excursions into liberalism while enjoying considerable scientific success. Following the European revolutions of 1848—when an economic depression touched off rebellions in the German Confederation, France, and Italy—the vested interests sought to preserve their traditions of nationalism, militarism, and monarchism by enacting just enough reform to stave off further unrest. The man who best managed this balancing act was Otto von Bismarck, appointed prime minister by King William in 1862. Bismarck was personally conservative but regarded all passionate political convictions as impediments to effective statecraft, preferring to absorb—and almost to embody—many points of view: “It was not that Bismarck lied [but] that he was always sincere,” wrote Henry Kissinger, one of his many admirers. Crown Princess Victoria of Germany judged Bismarck to be “mediaeval altogether and the true theories of liberty and of government are Hebrew to him, though he adopts and admits a democratic idea or measure now and then when he thinks it will serve his purpose.” Neurotic and insecure—he was an insomniac, a hysteric, and a morphine addict who, said a contemporary, “eats too much, drinks too much and works too much”—Bismarck projected an image of unbending self-confidence. “I want to make music in my own way,” he said, “or not at all.”
Bismarck’s way of maintaining the monarchy was to build the power of the state while playing off the liberals and progressives against one another and otherwise maneuvering as necessary to forestall any real threat of Germany’s becoming a genuine democracy. He drove a wedge through the liberal party, then the nation’s largest, by forming an alliance with its left wing, the progressives (who supported labor unions and big government), in order to weaken its moderate center, home to the classical liberals (who favored small government and free enterprise). The progressives got state-sponsored health insurance, workplace safety measures, and an eight-hour workday. Liberals got women’s rights, a freer press, freer trade, and freer elections; Germany for a time had the only effective secret ballot in Europe. Conservatives got to retain the monarchy, the aristocracy, and the real power. “In exchange for lavish trinkets from an all-powerful state,” writes the conservative commentator Jonah Goldberg, “Bismarck bought off the forces of democratic revolution. Reform without democracy empowered the bureaucratic state while keeping the public satisfied.” This cynical recipe would prove chillingly effective in Nazi Germany and Soviet Russia.
It was, however, inherently unstable. Europe in general was unstable—as was demonstrated when the assassination of Archduke Franz Ferdinand of Austria-Hungary by a Serb nationalist on June 28, 1914, resulted, to everyone’s surprise, in the Great War. Instabilities continued to bedevil the Germany of the Weimar Republic (1919–1933), when innovations like expressionist art contended with the conservatism of monarchists, militarists, university professors, and government bureaucrats. Much of our ongoing fascination with Weimar libertinism—nudity and drugs in the nightclubs, Marlene Dietrich in The Blue Angel—derives from the fact that these romps were played out by young adults who, like drunken teenagers skinny-dipping in the swimming pool just before the parents come home, had too little political power to accept the responsibilities that went along with their newly acquired freedoms.
During its time of limited liberal and progressive reform, Germany emerged as a center of scientific research and development. Germans could boast of scientific accomplishments like Wilhelm Conrad Röntgen’s discovery of X-rays in 1895, Max Planck’s founding of quantum physics in 1900, and Einstein’s 1905 and 1915 theories of relativity. Their technological accomplishments included the invention of aspirin and heroin (advertised in tandem in the late 1800s as relieving headaches and coughs, respectively); Gottlieb Daimler’s early automobiles, from 1887; Otto Lilienthal’s gliders (he died testing one, in 1896, saying on his deathbed, Opfer müssen gebracht werden!—“Sacrifices must be made!”); Count Ferdinand von Zeppelin’s dirigibles, from 1900; the first electric typewriter; the electric locomotive; the Geiger counter; and the first machine gun synchronized to fire between the blades of a spinning aircraft propeller. It was said that the dream of every German mother was to have a son who was an engineer. When Hitler came to power in the midst of the Great Depression—having won nearly 37 percent of the vote in the 1932 presidential election, and his party 44 percent in 1933, in elections sadly consonant with Benjamin Franklin’s grim warning that “those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety”—he took over a technological ship of state that made headway on sheer inertia even as he went to work dismantling its engines.
Hitler wanted to harness the power of science, of course. Like any dictator he wanted all the power he could get his hands on, demanding the “total mobilization” of science toward one or another hellish goal. But he understood little of what science is or how it works, relying for news of promising scientific developments on conversations with his barber. Comprehending nothing but power, Hitler assumed that science would quicken to the sharpened spurs of ruthless rule. “The triumphant progress of technical science in Germany and the marvelous development of German industries and commerce,” he asserted, “led us to forget that a powerful State had been the necessary prerequisite of that success.” He imagined that science was useful principally “as an instrument for the advancement of national pride.” Schools should teach “will-power” instead. “Instruction in the sciences,” Hitler decreed, “must be considered last in importance.”
Personally, Hitler was entangled in an obscuring web of pseudoscientific enthusiasms ranging from fad diets—he drank a toxic gun-cleaning fluid as a digestif—to avoidance of harmful “earth-rays,” the emanations of which the physician Gustav Freiherr von Pohl mapped out for him with a dowsing rod. Cosmologically, Hitler favored the “glacial cosmogony” theory concocted by an amateur astronomer, Philipp Fauth, and an Austrian engineer, Hanns Hörbiger, according to which the stars were balls of ice. He was blind to the prospects of technological progress other than those useful for killing people (Hannah Arendt: “The totalitarian belief that everything is possible seems to have proved only that everything can be destroyed”). Even there he displayed little foresight, dismissing the combat potential of rockets and jet aircraft while disdaining the possibility of nuclear weapons as a fantasy promoted by “Jewish physics.”
To the notion that science could be flattened under the boots of power and still deliver the goods—a fallacy described by Jacob Bronowski as “trying to buy the corpse of science”—the Nazis added leaden layers of superstition and pseudoscience. Looming large among these was a racial doctrine that proclaimed the superiority of an Aryan Nordic race, destined to rule a world cleansed of genetic impurities. Although decked out in costumes of mythological antiquity, these ideas were actually crackpot novelties: “Aryan,” a Sanskrit word for Persian nobility, had only recently been imported into European discourse, and “Nordic” was coined in 1898. Neither term described a race, much less a “pure” race, whatever such a thing might be. Nor would genetic purity confer an advantage on any population: The strength of a species resides in its genetic diversity, which improves its likelihood of surviving environmental changes. As Joseph Needham said of Nazi racial doctrines, “A more shameless flying in the face of established scientific fact has never been known in human history.”
It would be rather astounding if a party founded on pseudoscience and maximally illiberal power politics had presided over any significant number of scientific or technological breakthroughs—and indeed, there is scant evidence to support the popular image of the Third Reich as a futuristic war machine. The German military was an imposing force to be sure, arising as it did from generations of military professionalism, lavish infusions of fresh spending by the Nazi regime, and a passion for vengeance following Germany’s humiliating defeat in the Great War, but it was no monument to science. Starved for steel, the Nazis sent storm troopers to scavenge iron fences from public parks and cemeteries. Industrial chemists were ordered to develop synthetic rubber and gasoline, tasks at which they mostly failed. Nearly half of Germany’s wartime artillery came not from German factories but from conquered neighbors, mainly France. (Such shortages were a major reason behind Hitler’s disastrous decision to invade Russia.) The backbone of Germany’s military transport system consisted of railways plus seven hundred thousand horses. “In weapons and technology,” writes the historian Alan J. Levine, “the German forces were greatly superior only to their weakest and most backward foes, i.e., Poland, Norway and the Balkan countries. Generally speaking, Germany’s victories were due to good leadership, training, and the revolutionary use made of tanks and tactical airpower.” In cryptology, which provides a reasonable arena for comparing the scientific and technological capacities of adversaries in wartime, the Nazis’ greatest achievement was the Enigma encoding device. It was so thoroughly cracked by British scientists that German submarine attacks in the Atlantic came almost to a halt, obliging British leaders to exercise restraint in acting on what they learned from deciphered Enigma dispatches for fear the German high command might otherwise realize what was up.
When people think of “Nazi science” today they usually have in mind either eugenics or the pointless and sadistic “experiments” carried out by Nazi doctors in the death camps. But such obscenities can be described as science only in the rather distant sense that, say, Theodore Kaczynski’s “Unabomber Manifesto” can be called philosophy. The death-camp doctors discovered little beyond the fact that it is possible to kill a great many defenseless prisoners through the use of poison gases like the pesticide Zyklon B. Nazi eugenics research consisted of studies by physicians such as Eugen Fischer, who measured the “racial purity” of various individuals by looking for “Negro blood”; Julius Hallervorden, who studied the “feeble” brains of euthanasia victims; Robert Ritter, whose data were employed by the SS to dispatch Gypsies to Auschwitz; Ernst Rüdin, who helped draft a Nazi sterilization law aimed at preventing “genetically diseased” offspring; Otmar von Verschuer, who campaigned for forced sterilization of the “mentally and morally subnormal”; Ernst Wentzler, who coordinated a pediatric euthanasia program that killed thousands of children; Carl Clauberg, who sterilized women at Auschwitz; and Josef Mengele, who murdered Jews and Gypsies to study their organs. It scarcely need be added that their results were of no scientific value.
Yet many thinkers continue to overestimate the quality of Third Reich science. Some are dazzled by German technological achievements, such as the development of jet engines, proximity fuses, and infrared night-vision goggles—but technological applications can lag decades behind the scientific discoveries that made them possible. One way to separate scientific research from technological applications is by tallying the citations in leading scientific journals. Such studies indicate that German research tumbled into a dying fall once Hitler came to power. Initially the impetus of prior research carried it forward, with Rudolf Schoenheimer employing isotopes as tracers in the human body in 1935 and Otto Hahn splitting uranium atoms in 1938, but Germany’s scientific citations thereafter dwindled until the pages were almost blank. Five years into Hitler’s reign, the share of German scientific papers appearing in one leading international physics review had fallen from 30 to 16 percent of the total. Membership in the nation’s oldest national scientific organization, the Society of German Natural Researchers and Physicians, shrank from 6,884 in 1929 to 3,759 in 1937. There was no dramatic moment when storm troopers burst into the German research funding agency to cry “Halt!” to German science. “Instead,” notes the historian Ulrich Herbert, “contrary positions and voices were simply eliminated.”
The Nazis presided over a nuclear-physics brain drain of startling proportions. The computer pioneer John von Neumann departed for the United States in 1930; Hitler’s accession in 1933 prompted the emigration of Hans Bethe, who would help discover the nuclear processes that make the sun shine, and of Leo Szilard, who while soaking in his bath in a London hotel suddenly realized how a nuclear weapon could be made. (To keep the idea secret, Szilard patented it and assigned the patent to the British Admiralty; he then drafted a letter, which he had his friend Einstein sign, warning President Roosevelt that “it appears almost certain” that an atomic bomb could be built “in the immediate future.”) The quantum-physics virtuosos Max Born and Erwin Schrödinger fled, as did James Franck, who would help build the atomic bomb and then petition to have it demonstrated to the Japanese rather than being dropped on a city.
Hitler was untroubled by the scientific exodus. When a physicist tried to alert him to the corrosive effects that Nazi anti-Semitism was having on scientific research, Hitler reportedly replied, “If the dismissal of Jewish scientists means the annihilation of contemporary German science, then we shall do without science for a few years!” To direct the Reich Ministry for Science, Education and Popular Culture—which was chartered to “unify and control all of German science by the Reich both within and outside the universities [and manage] the control and methodical shaping of all of scientific life especially at the university”—Hitler named Bernhard Rust, a former provincial schoolmaster who had been dismissed for molesting a schoolgirl but had escaped prosecution on grounds of his documented mental illness. For Rust, the whole purpose of education was to create Nazis. When Rust asked David Hilbert whether the once great mathematics center at Göttingen had suffered from the expulsion of its Jewish faculty members, Hilbert replied, “Suffered? It hasn’t suffered, Minister. It doesn’t exist anymore!”
A few first-rate scientists did remain in Germany throughout the war. One of them was Max Planck, the founder of quantum physics—a patriot whose eldest son died fighting in World War I and whose second son was executed by the Gestapo for attempting to assassinate Hitler with a bomb on July 20, 1944. Planck noted that intellectual midgets were being promoted to the academic posts vacated by professors who had fled the Nazi regime or perished at its hands: “If today thirty professors get up and protest against the government, by tomorrow there will be also one hundred and fifty individuals declaring their solidarity with Hitler, simply because they’re after the jobs.” He took comfort in the objectivity of science. “The outside world is something independent from man, something absolute,” he wrote, “and the quest for the laws which apply to this absolute appeared to me as the most sublime scientific pursuit in life.” Another who stayed behind was Werner Heisenberg, discoverer of the uncertainty principle in quantum physics. Something of a Romantic, Heisenberg was given to long alpine walks and to the formulation of oracular queries like, “Why is the one reflected in the many, what is the reflector and what the reflected, why did not the one remain alone?” He thought of the war as a passing storm, upon the subsiding of which intellectuals like himself would restore German culture to its proper prominence. “I must be satisfied to oversee in the small field of science the values that must become important for the future,” Heisenberg wrote in 1935. “That is in this general chaos the only clear thing that is left for me to do. The world out there is really ugly, but the work is beautiful.”
Jewish scientists who remained in Germany were soon dismissed from their posts and in many cases liquidated. Among those who perished at the hands of the Nazis were the mathematicians Ludwig Berwald, who died in the Lodz Ghetto; Otto Blumenthal, killed in the “model” camp at Theresienstadt; Robert Remak, who died at Auschwitz; Stanislaw Saks and Juliusz Pawel Schauder, murdered by the Gestapo; and Paul Epstein and Felix Hausdorff, who committed suicide. The chemist Wilhelm Traube was beaten to death in his apartment by Gestapo agents. The physicist Lise Meitner, a Viennese Jew who along with her two sisters had converted to Christianity, worked on the prospect of nuclear fission in Berlin with Otto Hahn and the chemist Fritz Strassmann until the summer of 1938, when she slipped away to Sweden on an expired passport. Soon thereafter Hahn and Strassmann arrived at the fission results for which Hahn would win a Nobel Prize—a finding that prompted Niels Bohr to exclaim, “Oh, what idiots we have been! Oh but this is wonderful! We could have foreseen it all! This is just as it must be!” Although by escaping from Germany Meitner may have forfeited her share of the credit for the discovery, she blamed herself for not having departed sooner. “Today it is very clear to me that it was a grave moral fault not to leave Germany in 1933, since in effect by staying there I supported Hitlerism,” she wrote to Hahn, adding an unflinching indictment:
You all worked for Nazi Germany. And you tried to offer only a passive resistance. Certainly, to buy off your conscience you helped here and there a persecuted person, but millions of innocent human beings were allowed to be murdered without any kind of protest being uttered.
Meitner refused a 1943 offer to work in the Manhattan Project, declaring, “I will have nothing to do with a bomb.”
Ultimately, though, the decay of German science under the Nazis resulted not just from the brain drain and the harebrained ministrations of Hitler and his henchmen, but from fundamental differences in the way science operates under totalitarianism as opposed to liberalism. Science demands free, open discussion and publication, not only in order to circulate fresh information and ideas but to expose them to lively criticism. A totalitarian regime can afford little of either. Having seized a measure of power to which it has no legitimate claim, in order to solve real or imaginary problems that it cannot in fact solve, such a regime is highly vulnerable to criticism and so must stifle it. One way it accomplishes this, aside from jailing and murdering dissenters, is to create a cult of secrecy and power in which access to secrets is perceived as a source of power. Such vices are infectious, and the history of Nazi Germany is rife with examples of corporations and government agencies needlessly duplicating their R&D efforts by playing their cards too close to their chests.
The development of radar in Britain, essential to the Royal Air Force’s defeat of a Luftwaffe that had it outgunned four to one, demonstrated some of the differences between the way the German and the English governments interacted with their scientists in wartime. The theory of radar was simple—a radio pulse that strikes an airplane will bounce back and so reveal the plane’s location, even at night or under cloudy skies—and had been understood since the 1880s. Naturalists found acoustic analogues in dolphins and whales, which locate fish in the depths of the sea by pinging them with pulsed squeals of sound, and in bats that navigate inside ink-black caves by emitting high-pitched squeaks and mapping their echoes. The difficulties arose with implementation. Since radio waves travel at the velocity of light, a competent radar kit has to receive echoes arriving a tiny fraction of a second after the pulse was emitted. And, since shorter wavelengths produce sharper resolution (which is why dolphins and bats use high-pitched sounds for echolocation), radar required microwave radio equipment that did not yet exist when the war began.
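The timing arithmetic is worth seeing in miniature. What follows is a minimal sketch—my own illustration, not anything drawn from the wartime programs—of the two quantities the paragraph turns on: the range implied by an echo’s round-trip delay, and the wavelength implied by a transmitter’s frequency.

```python
# Illustrative sketch of radar timing arithmetic (not historical code).
# An echo's round-trip delay, halved and multiplied by the speed of light,
# gives the target's range; a transmitter's frequency fixes its wavelength,
# and shorter wavelengths yield sharper resolution.

C = 299_792_458.0  # speed of light, meters per second

def range_from_echo(delay_seconds: float) -> float:
    """Target range in meters, given the round-trip echo delay."""
    return C * delay_seconds / 2.0

def wavelength(frequency_hz: float) -> float:
    """Wavelength in meters for a given radio frequency."""
    return C / frequency_hz

if __name__ == "__main__":
    # An aircraft about 150 km away returns its echo in roughly a millisecond:
    print(f"{range_from_echo(0.001) / 1000:.0f} km")   # ~150 km
    # A "microwave" wavelength of about 10 cm corresponds to ~3 GHz:
    print(f"{wavelength(3e9) * 100:.0f} cm")           # ~10 cm
```

The sketch also shows why the engineering was hard: echoes from anything within a couple of hundred kilometers return in about a thousandth of a second, so the receiver must switch on and resolve signals almost instantly after the pulse leaves.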
Overcoming these obstacles called for an openness to new ideas that was scarce in wartime Germany. The Nazis’ institutionalized paranoia produced a stovepipe array of mutually suspicious public agencies and private corporations, with university scientists—such as they were—largely excluded from radar work altogether. The effort was further retarded by the low quality of Nazi appointees—men like Ernst Udet, who knew next to nothing about science but was appointed technology chief of the Luftwaffe on the basis of his fame as a World War I flying ace, and who objected that if radar systems were deployed, “Flying won’t be fun anymore.”
In Britain, a young and little-known Scottish electrical engineer named Robert Watson-Watt—descended from James Watt—found a ready audience in the war ministry for the idea that radar could win the air war. In the summer of 1940, when German bombs were raining down on London on a nightly basis, Prime Minister Churchill took the advice of the scientist-inventor Henry Thomas Tizard and, overruling the security concerns voiced by his cabinet, authorized the dispatch of a steel box containing radar blueprints and a new microwave transmitter prototype across the Atlantic, to see if the American allies could help speed things along. The physicist John Cockcroft purchased the box in a surplus store and took it on an unescorted Canadian ship, the Duchess of Richmond. (Always attentive to detail, he drilled holes in the box so that it would sink should the Duchess be torpedoed.) The navy men aboard asked for a lecture by the famous physicist. Wanting to stick to a subject that he felt certain would have no wartime use, Cockcroft spoke on atomic energy, telling the sailors that theoretically, the nuclear energy in a cup of water could blow their ship out of the water. In America, Cockcroft found that radar work was being pursued by a variety of ad hoc teams involving university scientists, government engineers, and even talented amateur scientists like the financier Alfred Loomis, who tested the world’s first Doppler-radar “speed gun” at his estate in Tuxedo Park, New York (one of his colleagues remarking, “Hey, don’t let the cops get a hold of that”). Loomis immediately grasped the importance of the British inventions to the imminent deployment of radar in the war, and put the new transmitter into production the next day.
Churchill took a personal interest in science, numbered top scientists among his friends, and understood the importance of free communications in developing new devices and tactics. There was “no time to proceed by ordinary channels in devising expedients,” he said, boasting that his military service ministers “stood on no ceremony…had the fullest information…and constant access to me,” and that “anyone in this circle could always speak his mind.” In his view the conflict was not
a war of masses of men hurling masses of shells at each other. It is by devising new weapons, and above all by scientific leadership, that we shall best cope with the enemy’s superior strength…. The multiplication of the high-class scientific personnel, as well as the training of those who will handle the new weapons and research work connected with them, should be the very spear-point of our thought and effort.
“Unless British science had proved superior to German,” Churchill later wrote, “we might well have been defeated, and, being defeated, destroyed.”
His seriousness on this point was evident to the young physicist Reginald Victor Jones, who had been researching the infrared spectrum of the sun when the war intervened. Appointed Britain’s first scientific intelligence officer at age twenty-eight, Jones investigated the possibility that the Germans were using intersecting radio beams to signal their pilots where to release their bombs. Summoned to a meeting at 10 Downing Street in June 1940, the young Jones suspected a practical joke, but instead found himself seated at a table where Britain’s top air force officers, with Churchill presiding, were discussing the German radio-beam puzzle. After listening for a while to their groping conversation, which to Jones “suggested that they had not fully grasped the situation,” he was asked by Churchill to clear up a technical point. Instead he said, “Would it help, sir, if I told you the story right from the start?” Churchill was startled but replied, after a moment of hesitation, “Well, yes it would!” Jones spoke for twenty minutes, explaining his research and urging that British pilots fly along the German beams for themselves to learn how they worked. Such flights began the following day. British engineers were soon jamming the Germans’ signals, a key step in ending the night-bombing raids and ultimately the Blitz—which had destroyed over a million homes, more than were consumed in the Great Fire of London in 1666. Jones recalled of Churchill that “he valued science and technology at something approaching their true worth.”
This is not to say that the Allies were immune to the difficulties posed by bureaucratic opacity and conservatism—indeed British and American scientists often complained of just that—or that German scientists were never able to cut through red tape and get anything done. But on the whole, wartime scientific and technological development fared far better in the liberal democracies.
The failure of the Germans to develop nuclear weapons—a failure that surprised the Allies, who had invested in the Manhattan Project out of the quite sensible fear that the Nazis would otherwise get there first—also reflected the hobbled state of communications among the scientists and engineers involved. Historians differ sharply over whether Heisenberg, whom Hitler appointed to head up the German A-bomb project, deliberately let the project languish, but whatever his motives, his approach was a far cry from the egalitarian ethos of the Manhattan Project. Disinclined toward laboratory work, Heisenberg played Bach fugues on the chapel organ at Hechingen while his subordinates there conducted nuclear experiments with uranium, graphite, and heavy water. (“Had I never lived,” he mused dreamily, “someone else would probably have formulated the principle of indeterminacy; if Beethoven had never lived, no one would have written Opus 111.”) A similar insularity appears to have afflicted Walther Bothe, the eminent experimentalist whose mistaken calculation of the absorption characteristics of graphite led the German bomb project astray. Bothe, whose heart was not in the work anyway—the Nazis having hounded him over his antifascist political views so severely that he sought treatment in a sanatorium—concluded that graphite would not work as a neutron absorber in sustaining a nuclear chain reaction. Actually he was testing the wrong grade of graphite, but his results were taken at face value, leading German bomb scientists to conclude that only heavy water could do the job. This set them on a path that dead-ended once the Allies disabled the German heavy water plant at Vemork, Norway, in a series of raids conducted by Norwegian, French, and British commandos flown in on American bombers. A mistake like Bothe’s would have been unlikely to persist in the atmosphere of Los Alamos, where Gen. Leslie Groves had agreed to let the scientists work with their customary informality and ordinary engineers would have felt free to question an Oppenheimer about where he’d got his graphite.*
By the end of the war it was starkly evident that the Nazi campaign of top-down totalitarian science had failed. “In the early days of the war the world was amazed by the efficiency of the Nazi war machine,” wrote Needham, but “in every theater of war…the technology of the democracies has proved superior to that of the fascist powers.” Needham concluded that “the Axis powers have carried out a great social experiment. They have tested whether science can successfully be put at the service of authoritarian tyranny. The test has shown that it can not.”
Yet the notion persisted that totalitarianism was more efficient than liberal democracy. It was said that Mussolini had at least “made the trains run on time” (which as a matter of fact he did not) and that it must be more efficient to control science, technology, and industry via centralized planning. Adding to the confusion were the distortions created by a general reliance on a one-dimensional, left–right political spectrum. If you assume that the Nazis and the communists define the ends of the spectrum, and you’ve just fought a hideous war against the Nazis, it is easy to imagine that democracies will in the future evolve toward something like communism—whereas if you instead think in terms of a triangular relationship, you are more likely to see that communism is just as far removed from liberalism as the Nazis were. But this was not the mind-set of many postwar Europeans and Americans, for reasons dating back many decades.
Prior to the disaster of the two world wars, Europe was becoming politically liberal—by 1914, Russia and Turkey were the only thoroughgoing autocracies left in the region—and economically liberal enough that most Europeans were growing rapidly wealthier. But the wealth was not being equally shared, and with swelling masses of the working poor huddled in urban slums, the middle classes were tormented by two economic concerns—that their newfound wealth might be taken from them, and that they had no moral right to it in the first place. Some feared that the poor might vote to abrogate property rights altogether, others that unskilled workers sick of toiling in poverty would lobby to decouple merit from financial reward so that every laborer, regardless of his skill or diligence, got more or less the same wages. That all these fears should be realized—human rights overrun, property confiscated, and workers rewarded according to their needs rather than for their productivity—was precisely the communist prescription. Karl Marx and Frederick Engels knew what they were talking about when they declared, in the opening passage of the Communist Manifesto of 1848, “A specter is haunting Europe—the specter of communism.”
Not even Marx and Engels could foresee the extent to which communism would indeed come to haunt Europe and the world. Chiefly an enthusiasm of intellectuals and political opportunists, communism failed to capture the affection of the working masses whose interests it claimed to promote, yet came to rule a third of the world’s peoples. Marx and Engels talked about liberation, yet communism presided over the most craven sacrifices of liberty for the promise of material gain to have darkened human history. They described communism as a science, yet few more noxious assaults on science had ever been concocted.
Marx and Engels are easy enough to disdain. Innumerable crimes having since been committed in their names, it is possible today to take a certain grim satisfaction in reading that Marx neglected his family and that Engels campaigned against capitalism while living off the wealth that his father accrued as owner of a Manchester thread factory. But this is rather unfair. Marx was one among many radical philosophers and journalists in an age of revolutionary tumult, and his conduct was more that of an otherworldly scholar than of a scheming conspirator. And what may be said of Engels? A man born into wealth loses either way: He stands accused of complacency if he defends the existing regime, and of posturing if he objects to it. In financially supporting his friend Marx, and in editing the two posthumous volumes of Capital, Engels believed he was aiding a genius whose insights would liberate the poor. Neither man thought that communism, put into practice, would become an unbridled horror. But since it did, and was taken to be scientific, questions arise as to communism’s actual relationship to science and liberty.
Communist theory was inherently reactionary—a reaction against the brutalities of the early industrial era—and pseudoscientific, in that it pretended to be scientific while ignoring the need to test theories experimentally. Marx was stricken, as any right-minded individual would be, by the appalling lot of the working poor and by child labor in particular. Capital is rife with descriptions of children being
dragged from their squalid beds at two, three, or four o’clock in the morning, and compelled to work for a bare subsistence until ten, eleven, or twelve at night, their limbs wearing away, their frames dwindling, their faces whitening, and their humanity absolutely sinking into a stone-like torpor, utterly horrible to contemplate.
Engels, similarly affronted, recalled walking through the slums of Manchester with “a bourgeois” to whom he complained of
the frightful condition of the working people’s quarters, and asserted that I had never seen so ill-built a city. The man listened quietly to the end, and said at the corner where we parted: “And yet there is a great deal of money made here: Good morning, sir.”
Communists regarded Engels’ middle-class companion as the villain of the piece, yet capitalism and liberal democracy went on to alleviate the privations of far more working people than communism ever did. Liberalism proved to be empirical enough to address the plight of the working poor, while communism—an absolutist philosophy, equal parts religion and pseudoscience—was too inflexible to respond to much of anything.
Engels thought of Marx, who admired Darwin, as a scientific thinker of Darwinian stature. “Just as Darwin discovered the law of development of organic nature, so Marx discovered the law of development of human history,” he said at Marx’s funeral, adding that in his opinion, Marx had been “the greatest living thinker.” And indeed, Marx in his Capital, a book sprinkled with references to mathematics, geometry, and chemistry, claimed to have identified “natural laws” of history that move “with iron necessity toward inevitable results.” According to the allegedly inexorable workings of these laws, capital in proportion to production was destined to increase, profits and wages to decrease, and capitalist states to polarize into two cliques—wealthy capitalists on top, a seething mass of impoverished proletarians below—whereupon capitalism would succumb to socialistic revolution. All this sounded Darwinian to those only vaguely familiar with Darwin, as were Marx and Engels. In fact, Darwinian evolution has no inevitable or even foreseeable direction. Evolutionary biologists can make only weak, tentative predictions about such rudimentary matters as whether a given bacterium is likely to survive in a lab rat’s intestinal tract, much less about the destiny of economic and political institutions. Had Marx comprehended this he might have made more modest claims, and accompanied them with suggestions as to how communists could monitor the results of their social experiments and adjust their theories accordingly. Instead, he was long on what must be done to establish a communist society but short on how, once property had been abolished and the state had withered away, that society would actually operate: Marx’s communist world was as ethereal as a Christian’s heaven. Yet this very insubstantiality appealed to middlebrow thinkers seeking personal salvation through political revolution. As Michael Polanyi observed, Marxism
predicted that historic necessity would destroy an antiquated form of society and replace it by a new one, in which the existing miseries and injustices would be eliminated. Though this prospect was put forward as a purely scientific observation, it endowed those who accepted it with a feeling of overwhelming moral superiority.
Engels imagined that science was a branch of philosophy—specifically, that the great accomplishment of modern science had been to validate the dialectical philosophy of Georg Wilhelm Friedrich Hegel, whose notion that all processes can be understood in terms of a dialectic (thesis, antithesis, and synthesis) has enthralled many a muddled mind. Marx was of a similar opinion; he borrowed much of his scientific terminology from Hegel’s Science of Logic, which he had read immediately before getting started on Capital. Hegel had once enjoyed an intellectual suzerainty conferred on few living philosophers. But his star had since fallen, his disciples morphing into left-wing and right-wing radicals and Arthur Schopenhauer dismissing him as “the clumsy and stupid Hegel,” whose “mad sophistry” “will remain as a monument of German stupidity.” None of this seems to have bothered Engels, who bought into Hegel with the blinkered enthusiasm of a vacationer boarding a burning blimp. “Nature is the proof of the [Hegelian] dialectic,” he wrote, “and we must give to modern science the credit of having furnished an extraordinary wealth and daily increasing store of material towards this proof.” Working from this groundless premise, Engels was quick to conclude that “a correct notion of the universe…can only be had by means of the dialectic method.” To make such a progression—from shallow-draft philosophizing and patronizing praise to a finger-wagging dismissal of all the alternatives—might have been harmless enough in the hands of amateur scholars contributing to fact-free journals of “analysis,” but it became a far darker matter once the communists came to power and started prating Marxist-Leninist dogma across steel desks to those whose dissenting opinions would earn them a one-way ticket to the insane asylums, prison camps, and firing squads. If the world is relatively anti-intellectual today, it is because the world got a bellyful of the communists’ pseudoprophetic intellectualism and turned its broad back on the lot of it. The philosopher of science Karl Popper expressed just such a change of heart in the introduction to his The Open Society and Its Enemies. “This book,” he wrote,
does not try to add to all these volumes filled with wisdom, to the metaphysics of history and destiny, such as are fashionable nowadays. It rather tries to show that this prophetic wisdom is harmful, that the metaphysics of history impede the application of the piecemeal methods of science to the problems of social reform. And it further tries to show that we may become the makers of our fate when we have ceased to pose as its prophets.
Although the communists spoke enthusiastically of conducting social experiments, the inherent certitude of their philosophy barred them from providing the mechanisms by which a society could respond and change course when the experiments failed. Instead it soon became a crime even to call attention to communism’s failures, so that the social experiments collapsed amid a graveyard silence. To cite one example among thousands, when a 1937 national census revealed the damage done by Joseph Stalin’s having uprooted more than a million “kulaks” (meaning well-off peasants, a particular object of his and Lenin’s hostility), Stalin suppressed the census results, arrested the census takers, and had many of them shot. Such offenses were not unique to communism; what set them apart was the notion that something scientific was going on. It was not. Communism was characterized by the willful suspension of disbelief that is the polar opposite of scientific skepticism. The Soviet political activist Lev Kopelev recalled that when he saw people dying of hunger in Ukraine in the spring of 1932, he did not
curse those who had sent me to take away the peasant’s grain in the winter, and in the spring to persuade the barely walking skeleton-thin or sickly-swollen people to go into the fields in order to “fulfill the Bolshevik sowing plan in shock-worker style.” Nor did I lose my faith. As before I believed because I wanted to believe.
The influential Hungarian Marxist Georg Lukács spoke for many communists when he stated in 1967 that he would continue to believe in Marxism even if every empirical prediction it made were proven to be false. You can’t get much further away from science than that.
Where communists ruled, real science was pitilessly pruned and uprooted to accommodate the party line. Russian astronomers who favored non-Marxist accounts of sunspots were accused of terrorism, while other scientists were persecuted simply for espousing the liberal values of science itself. The physicists Matvei Bronshtein and Lev Shubnikov were shot; their colleagues Semyon Shubin and Alexander Vitt were arrested and died in the camps; Lev Landau, who would receive the 1962 Nobel Prize in physics, nearly died in prison but was released when his colleague Pyotr Kapitsa warned Stalin that he would stop doing research unless Landau was set free. In all, more than a hundred members of the Soviet Academy of Sciences were sent to labor camps or otherwise incarcerated. Meanwhile government-sponsored pseudoscience flourished, reaching a tragicomic apotheosis in the Lysenko affair.
Trofim Denisovich Lysenko—who graduated from the Uman School of Horticulture in 1921 and ruled Russian agronomy for a quarter of a century, from 1940 to 1965, as director of the Soviet Academy of Sciences’ Institute of Genetics—was a crank whose fantasies happened to dovetail with communist cant. In normal circumstances he might have puttered away his career in some backwater agricultural station, but in the USSR he became a hero. Agriculture was one of the crowning glories of the United States and the other liberal democracies, where scientific experimentation had produced soaring crop yields, so the Soviet leaders made it a priority to outproduce Western farmers through the exercise of “socialist science.” There is, however, no socialist science, no Western or Eastern science, no capitalist or communist or feminist or ethnic science. There’s just science, and while one scientist may do it better or worse than the next, nobody can simply invent a different science and expect it to compete successfully with the real thing.
Stalin thought otherwise. He claimed to feel “reverence for science,” but only of the communist kind—“that science whose people…have the boldness, the resolution to smash old traditions.” Lysenko fit the bill. He presented himself as a “barefoot scientist” who had learned his trade from peasant farmers, and he promised that his earthy methods would outperform “bourgeois” science just as communist economics would outperform capitalism. As the communist theoretician and party leader Nikolai Bukharin put it in an address to the 1931 International Congress for the History of Science, Lysenkoism would help eliminate
the rupture between intellectual and physical labor [facilitating] the entry of the masses into the arena of cultural work, and the transformation of the proletariat from an object of culture into its subject, organizer and creator. This revolution in the very foundations of cultural existence is accompanied necessarily by a revolution in the methods of science….
What Lysenko actually did was to mix Lamarckian notions of the inheritance of acquired characteristics with a romantic zeal for the health of wild plants—which, he maintained, could among other wonders morph from wheat to rye, a feat comparable to wild pigs awakening in the morning to find themselves turned into storks. (This was similar to the Nazi celebration of “natural” organisms as stronger than those allegedly weakened by cultivation.) Lysenko could never quite decide whether genes—the basic units of heredity, from which the word genetics is derived—existed, but he felt that mainstream genetics overemphasized competition, which smacked of capitalist market economics, whereas his own theories stressed a more socialistic biological “cooperation.” Lysenko maintained that oak seedlings planted in dense clusters do not enfeeble one another by competing for sunlight and nutrients, as Darwin had found, but thrive, the weaker seedlings willingly dying out so that the stronger might prosper—a claim with unnerving parallels to communism’s demand that citizens sacrifice themselves for the good of the state.
Lysenko responded to any hint of criticism with reflexive anger, political denunciation, and blanket dismissals of every scientific approach that threatened his prestige. After guaranteeing to double grain yields in a decade, he ordered that his methods be instituted on a large scale without first being tested experimentally. “When we put forward a measure that is as yet founded only on theory, such as ‘freshening the blood’ of varieties of most important agricultural crops,” he asserted, “do we have the right to lose two [or] three years in preliminary testing of this measure on little plots at several breeding stations? No! We don’t have the right to lose a single year.” When Lysenko’s methods failed to produce the predicted results, he discarded the discouraging data and publicized only the results from the few farms that seemed to have succeeded—data most often concocted by local party bosses eager to keep their jobs. These “successful” farms were then held up as “beacon-lights” for less hardworking (or more honest) farmers to emulate. “It would be hard to imagine a more effective way to make genuine science useless,” notes the historian David Joravsky in his book The Lysenko Affair. “What need was there for a lot of experiment stations and basic research institutes, if the scientist’s job was to be a publicity agent for isolated beacon-light farms?”
Under Lysenko, crop failures spread across Russian farmlands like a blight. Grain yields per acre fell 14 percent from 1930 to 1934, while 85 percent of the fruit trees planted according to Lysenko’s prescriptions withered and died. The Soviet officials cheered him on. As people are apt to do once their ideology comes into conflict with empirical fact, Communist Party bosses resorted to an impenetrable bureaucratese that combined claims of great progress with exhortations to do better. A 1932 declaration by the Russian Congress for Planning Genetics and Breeding implored the Russian people to “raise more decisively the question of the reconstruction of our science itself, the rethinking of its methods of work, the introduction of the principle of the classness and partyness [klassovost’ i partiinost’] of science on the basis of Marxist-Leninist methodology, the rethinking of trends and interrelationships with other sciences…” and so on. Russian wags coined a term, boltologiia or “jabberology,” equivalent to “gobbledygook,” to describe such verbiage.
Meanwhile those scientists who dared to point out the facts were persecuted, jailed, or reassigned to tertiary jobs. Prominent among such cases was that of Nikolai Vavilov, a devoted young biologist and geneticist who won the Lenin Prize and was made a member of the Soviet Academy of Sciences at age forty-two. Vavilov saw clearly that genetics research could save millions of Russians from malnutrition, if only Soviet science were released from the constraints of political dogma. The conflict came to a head one day in May 1939 when Vavilov was confronted by Lysenko, then head of the Lenin Academy of Agricultural Sciences, and a vice president of the academy named I. E. Lukyanenko. When Vavilov admitted that his Institute of Plant Botany and New Crops “bases its selection work wholly on Darwin’s evolutionary teaching,” his superiors pounced:
LUKYANENKO: Why do you speak of Darwin? Why don’t you choose examples from Marx and Engels?…
LYSENKO: I understood from what you wrote that…evolution must be viewed as a process of simplification. Yet in chapter four of the history of the [communist] party it says evolution is [an] increase in complexity…
VAVILOV: There is also reduction…
LUKYANENKO: Couldn’t you learn from Marx?…
VAVILOV: I am a great lover of Marxist literature…
LUKYANENKO: Marxism is the only science. Darwinism is only a part; the real theory of knowledge of the world was given by Marx, Engels, and Lenin.
Vavilov ran out of patience and seized Lysenko by his suit lapels, exclaiming that Lysenko was destroying Soviet science. Lysenko shrieked like a schoolyard bully, declaring that he would tell on Vavilov. On August 6 of the following year, four Soviet agents emerged from a black sedan and arrested Vavilov while he was gathering botanical specimens in a sun-drenched field in western Ukraine. Interrogated nearly four hundred times in the ensuing eleven months, Vavilov was convicted of sabotaging Soviet agriculture and was sentenced to death. The sentence was commuted to twenty years but he soon died of malnutrition in Saratov prison, at age fifty-five. His “World Collection” of plant samples remains a resource for biologists today.
Such examples sufficed to silence Lysenko’s critics, leaving him free to make extravagant promises and to blame his colleagues when his promises failed to come true. Ultimately the rot of mutual deception reached so deep that the Soviet Union was exporting grain while its own people starved. Stalin at one point deliberately shipped food out of districts in Ukraine, the North Caucasus, and Kazakhstan where peasants had resisted communist reforms, then deployed troops to prevent them from escaping. Six million people died in this concocted famine.
Soviet leaders were quick to credit their regime with all sorts of scientific and technological breakthroughs. Most were spurious, but two Soviet technological endeavors—rocketry and nuclear weapons—were genuinely imposing, especially when combined in the form of nuclear bombs affixed atop intercontinental ballistic missiles. A single bomb could destroy a city; an ICBM could loft the bomb from Russia to New York in a matter of minutes. The situation seemed bad enough when the Cuban missile crisis of 1962 brought the world to the brink of nuclear war, encouraging talk of whether the West might not be “better Red than dead,” but at that point the USSR had only a few dozen nuclear-tipped ICBMs while the United States had a few hundred. In the accelerating nuclear arms race that followed, the number of such weapons spiraled into the thousands. Lurid war-gaming scenarios proliferated on both sides: “We” would hit “their” missile sites first, obliging them to sue for peace to save their cities; or hit their cities too, minimizing the risk of retaliation at the cost of our becoming the worst mass murderers in history, then hit them again, just to make sure, a measure known as “bouncing the rubble.” We would take shelter, in foxholes or under our little desks at school, and ride it out, then rebuild, somehow, on soil poisoned by radioactive plutonium and cesium. In times of international crisis our leaders would consult over the hotline, or deliberately act crazy in order to persuade the other side that we really were deranged enough to carry out our threats. Both sides relied on the policy of Mutual Assured Destruction, or MAD. American generals tried to make the Soviets believe that they were willing to incinerate German cities to halt the progress of an invading Russian army; when that threat wore thin the Americans developed the neutron bomb, a nuclear warhead that could kill people without destroying property. The more frightening a weapons system was, the better its perceived potential for deterrence. A single “boomer”—a submarine carrying dozens of missiles, each missile tipped with multiple warheads—could wreck an entire nation overnight, yet boomers were also potent preservers of the peace. That was the trouble with MAD: To prevent a nuclear war, you had to act as if you were willing to start one and thought you could win it. Another problem with MAD was that it kept fueling the nuclear arms race. Intelligence officials habitually overestimated the adversary’s nuclear strike forces—understandably so, since few cared to underestimate so grave a threat to their nation’s security—and no chief of state could risk ignoring their warnings. Moreover, each leader wanted the other side to know that his arsenal was increasing, the better to deter them. They in turn responded by deploying still more nuclear weapons, which led to a further response, and so on.
In the end MAD worked, staving off nuclear war until the Soviet Union folded its tents on Christmas Day 1991, but it left behind an image of the USSR as having been a scientific and technological powerhouse. Now that the records are opening up, it can be seen that this was never the case.
Russia, like Germany, had many capable scientists and engineers—and unlike Nazi Germany it did what it could to hang on to them—but asking them to compete effectively against their colleagues in the free world was like fielding a shackled track team. The historian of science Loren R. Graham notes that although Russian scientists did manage to contribute to population genetics, soil science, and the design of the tokamak fusion reactor, “The overall record of Soviet science and technology, considering the enormous size of the Soviet science establishment, was disappointing.” At the height of the Cold War the Russian biologist Zhores Medvedev made a quantitative study of Soviet versus American scientific research, but the results were so unfavorable to the USSR that he did not bother trying to publish them. The Soviet Academy of Sciences was advised in 1965 that Russia was producing only half as many research papers as the United States, despite having about the same number of scientists. The reasons for this gap were not obscure. Denied the freedom to attend conferences abroad, Soviet scientists had trouble keeping abreast of the latest developments. Hampered by the Soviet economy’s inability to provide adequate laboratory equipment, many were obliged to concentrate on mathematical fields far from empirical tests; Russian astrophysicists did important work in exotic areas such as black hole theory and the origin of the universe but could not get their hands on a decent telescope. Scientists whose work was particularly important to national defense were cosseted in isolated communities where they enjoyed luxuries and liberties unavailable to their colleagues, but confinement in such gilded cages impelled the best among them to become vocal critics of the communist regime. The Russian physicist Andrei Linde recalled that physics,
because of its importance for the development of the atomic bomb, was able to survive and develop, harmed to a much lesser degree by the dictatorship of official ideology. This circumstance gave rise to a very unusual phenomenon: The scientific culture associated with the Soviet school of physics became a culture of free political thought. As a result, many [of the] leading dissidents in the USSR, such as Andrei Sakharov and Yuri Orlov, were physicists.
Yuri Orlov helped formulate the Helsinki Accords, in which participating nations agreed to “respect human rights and fundamental freedoms, including the freedom of thought, conscience, religion or belief” and to report “interference with the free exchange of information.” For this he was convicted of anti-Soviet agitation and spent nearly a decade in the Siberian Gulag, much of it in solitary confinement. In 1986 he was swapped for a Soviet spy and emigrated to the United States, where he joined the physics faculty at Cornell.
Sakharov, called the “father of the Soviet hydrogen bomb,” became a passionate campaigner for democracy and human rights. He had worked on nuclear weapons from 1946 to 1968, spending eighteen years in a secret laboratory called “The Installation,” which he described as a combination “between an ultra-modern scientific research institute…and a large labor camp.” The scientists there, though showered with awards and bonuses, were virtual prisoners all the same. When Sakharov and a few colleagues wandered out toward the Installation’s wooded perimeter one day, preoccupied by the conversation at hand, they were arrested by security guards and brought back to the barracks at gunpoint. Sakharov called for an end to nuclear testing and the establishment of a Russian “democratic, pluralistic society free of intolerance and dogmatism,” in an article that was smuggled out of the Soviet Union, published in the New York Times, and reprinted in editions that sold more than eighteen million copies. He was awarded the 1975 Nobel Peace Prize, the citation calling him “the conscience of mankind,” but was forbidden to travel to Norway to receive it. (In his written acceptance speech, Sakharov pointedly mentioned that his friend Sergei Kovalev, a biophysicist, had just been sentenced to seven years’ hard labor.) Constantly harassed and sometimes beaten by communist agents, Sakharov was obliged to write his thousand-page autobiography three times from scratch, the original draft having been stolen by the KGB while he was in a dentist’s chair and the second confiscated by agents who dragged him from his car and drugged him. Sakharov’s international reputation made him an inopportune candidate for execution, so the authorities sent him into six years of internal exile in Gorky, where he was forbidden to communicate by telephone or to entertain foreign visitors. He died of a heart attack on December 14, 1989, hours after addressing the Soviet Congress on behalf of political pluralism and free-market economics.
To understand how the United States and the USSR seemed during the decades of the Cold War to have had scientific and technological parity—to have resembled, as was often said, two scorpions in a bottle—requires a brief look at the history of their respective work on rocketry and nuclear weapons.
Rocketry is as old as gunpowder. It operates on the simple principle that any explosive, whether solid or liquid, releases hot, rapidly expanding gases when it burns. If you pack the explosive into a tube that is left open at one end—the Chinese originally used gunpowder tamped into bamboo—those expanding gases will rush out the open end, pushing the tube off in the opposite direction. (Newton’s third law: For every action there is an equal and opposite reaction.) Rockets clearly had military applications—the Chinese fired their bamboo rockets at invading Mongols in the year 1232—but what really mattered was that they could function in the vacuum of space. Fly a jet fighter to the edge of space and its engine will falter and shut down, since it needs oxygen from the atmosphere to burn the kerosene in its fuel tanks. But a rocket carries everything it needs on board—typically, liquid oxygen in one tank and a fuel such as kerosene or liquid hydrogen in the other—so it requires no air. Rockets are born for space.
It takes a lot of fuel and costly hardware to put even a small payload into space, and as there was no immediate practical benefit in doing so, the visionaries who laid the foundations of modern rocketry were not just engineers but futuristic dreamers as well. The three leading pioneers, all born in the 1800s, were a Russian, Konstantin Tsiolkovsky; an American, Robert H. Goddard; and a German, Hermann Oberth. Goddard was a physicist whose fascination with rockets, much ridiculed at the time, arose from a vision he had of going to Mars that came to him while he was climbing a cherry tree at age sixteen. He belonged to the American Rocket Society (originally called the American Interplanetary Society), whose founders were science-fiction enthusiasts like himself. He published a paper titled “A Method of Reaching Extreme Altitudes” in 1919, and on March 16, 1926, launched the world’s first liquid-fueled rocket, from a snowy field at his Aunt Effie’s farm in Auburn, Massachusetts. Tsiolkovsky, an autodidactic schoolteacher, proselytized for rocketry as the key to human spaceflight. He derived the “Tsiolkovsky equation” (demonstrating that rockets capable of sufficient thrust could loft large payloads into space), foresaw the development of multistage missiles (which increase their efficiency by shedding dead weight as they climb), envisioned space-suited astronauts assembling space stations in orbit, and studied the perils of returning to Earth amid the fireball generated by friction with the upper atmosphere. “The Earth is the cradle of humanity,” he declared, “but one cannot live in a cradle forever.” Oberth, a Transylvanian gymnasium teacher, read Jules Verne at about age eleven and launched his first rocket at age fourteen. Ultimately he succumbed to the UFO craze, declaring “that flying saucers are real and that they are space ships from another solar system…manned by intelligent observers [whose] present mission may be one of scientific investigation.” When his physics doctoral dissertation on rocketry was rejected as too “utopian,” Oberth published it via a vanity press as Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space”). A copy found its way into the hands of a thirteen-year-old astronomy buff studying at a fashionable boarding school set up near Weimar in Ettersburg castle, where Goethe is said to have worked on Faust. “Opening it,” the young man recalled, “I was aghast. Its pages were a hash of mathematical formulas. It was gibberish. I rushed to my teachers. ‘How can I understand what this man is saying?’ I demanded. They told me to study mathematics and physics, my two worst courses.” He buckled down and went on to a celebrated career as a rocketry engineer and organizer, without ever working as a scientist. He was Wernher von Braun.
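Tsiolkovsky’s equation, mentioned above, is simple enough to state. In its standard modern form, it says that the velocity a rocket can gain depends on its exhaust speed and on the ratio of its fueled mass to its empty mass:

$$\Delta v = v_{e}\,\ln\frac{m_{0}}{m_{f}}$$

where $\Delta v$ is the attainable change in velocity, $v_{e}$ the exhaust velocity, $m_{0}$ the initial (fully fueled) mass, and $m_{f}$ the final (empty) mass. Because the mass ratio sits inside a logarithm, each further increment of velocity demands disproportionately more propellant, which is why the multistage rockets Tsiolkovsky foresaw, shedding dead weight as they climb, proved indispensable.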
The big, powerful rockets requisite for space travel could not have been developed without the heavy investments made in them by the defense departments of Germany, Russia, and, more tardily, the United States. For von Braun, the Faustian bargain began on the day in early 1932 when a black sedan carrying three Army officers in mufti pulled up at the Raketenflugplatz (Rocketport) where his college rocketry club was about to conduct a test. The Nazis were not yet in power, premonitions of war were almost nonexistent, and for von Braun, the scion of a Prussian Junker family, the equation was clear. “We needed money, and the Army seemed willing to help us,” he recalled. “We were interested solely in exploring outer space. It was simply a question with us of how the golden cow could be milked most successfully.”
The affair began innocently enough. The rocketeers set up a government-funded launch site at Peenemünde—on the advice of von Braun’s mother, who recalled that his father used to go duck hunting there—and as late as 1944 their rockets remained so inaccurate that von Braun took to observing test launchings from the intended point of impact, on the theory that “the bull’s eye is the safest spot on the map.” (He was nearly killed one day when, to his horror, an on-target missile loomed out of the blue sky and exploded overhead.) Before it was over, Polish and Soviet slave laborers were being worked to death in the tunnels of the underground Mittelwerk factory near Nordhausen, building the V-1 (“V” for Vergeltungswaffe, or “vengeance weapon”) buzz bombs and V-2 missiles lobbed at London. When Berlin fell, von Braun and key members of his team surrendered en masse to the U.S. Army—their rationale being, as one put it, “We despise the French; we are mortally afraid of the Russians; we do not believe the British can afford us; so that leaves the Americans.”
In the United States, von Braun popularized space exploration through books, magazine articles, and appearances in Disney films and television shows, his genial personality and deeply checkered past personifying the two faces of rocketry as both a way to explore other worlds and the possible agency of a nuclear war. (The comedian Mort Sahl suggested that an adoring Hollywood movie about von Braun, I Aim at the Stars, should be subtitled, “But Sometimes I Hit London.”) A scientist or engineer living in a totalitarian state, von Braun told an American audience in 1957, is “coddled to some extent, and has practically all the advantages and privileges he enjoys in a free country. But there is always the looming danger that a brick may suddenly fall on his head.”
Soviet rocket designers enthralled by space exploration similarly hoped that their military work would produce rockets capable of lofting humans into orbit, but the differing circumstances of the USSR and the United States put the two nations on contrasting paths toward that decidedly subordinate goal. The problem for Russian generals at the dawn of the nuclear era was that in the event of war, American bombers could fly missions against Moscow and other Soviet cities after refueling at forward bases in Europe and Asia, whereas the USSR lacked such refueling bases and so could muster only a limited nuclear bomber threat. The answer was to develop intercontinental missiles capable of targeting American cities from within Russia’s borders. But while American scientists and engineers had been able to miniaturize their thermonuclear weapons, Soviet H-bombs were still so big and heavy that hurling one over the North Pole was like using a rocket to deliver a loaded meat freezer to New York, complete with a power source for its refrigeration unit. To compensate for this technological limitation, the Russians built big boosters.
That was good news for Sergey Korolyov, the top Soviet missile designer, whose career spanned the idealistic visions and worldly torments that afflicted so many Russian scientists and engineers. As a student Korolyov had met Tsiolkovsky and absorbed his visions of space exploration, declaring, “Soviet rockets must conquer space!” His R-7 missile inaugurated a series of powerful, reliable boosters so cheap to mass-produce that while the Americans spent months testing new satellites by shaking and baking them in laboratory vacuums, the Russians often found it more cost-effective to just launch them into real space and see what happened. But along the way, Korolyov spent six years in the sharashka (“deceit”) prisons established to punish scientists, engineers, and the intelligentsia. Beaten and tortured, he came home with a broken jaw, a heart condition, and a case of scurvy, and was thereafter kept under constant surveillance by Soviet agents, one of whom ominously wondered aloud, “Maybe you’re making rockets for an attempt on the life of our leader?” Haunted by recurring nightmares in which guards burst into his bedroom to take him back to the camps, Korolyov habitually muttered to himself the incantation, “We will all vanish without a trace.”
The Americans, meanwhile—protected by Strategic Air Command bombers, substantial ground forces in Western Europe, and a navy that roved the oceans almost at will—felt no great sense of urgency about developing ICBMs. Nor would such missiles need to be terribly powerful, since American nuclear warheads were already smaller and more sophisticated than the Russians’ and were being further miniaturized—a process that would eventually produce H-bombs the size of Super Bowl trophies. This satisfied the generals, but von Braun feared that it meant the Americans would never build boosters big enough to put humans in space—as indeed would have been the case, had John F. Kennedy not made it a national goal to dispatch men to the moon. To provide an alternate vision, von Braun went public with books, TV shows, and plastic models illustrating how big, fat boosters could carry a giant space station, piece by piece, into orbit. He even went so far as to argue that the station might function as a military asset, the astronauts hurling bombs down onto an unwitting enemy.
On July 29, 1955, President Dwight D. Eisenhower announced that the United States planned to develop what everyone assumed would be the world’s first earth satellite. Devoted solely to scientific research, it would contribute to the International Geophysical Year of 1957–58, a cooperative research effort involving scientists in sixty nations worldwide. To underscore its peaceful intent, the satellite would be launched by an allegedly civilian rocket (actually a rebranded Navy missile) called Vanguard. The skinny, crayon-shaped Vanguard and its grapefruit-sized scientific satellite would demonstrate both the Americans’ technological superiority and their desire to prevent space from becoming a Cold War battleground.
Back in the USSR, Korolyov realized that the feel-good Vanguard project presented an opportunity to beat the Americans into space. His team already had rockets powerful enough to orbit a satellite heavier than Vanguard’s little grapefruit, provided that the Soviet leaders weren’t too worried about the public-relations downside of using military hardware to get the job done. He pitched the idea to Nikita Khrushchev, the first secretary of the Communist Party, to whom he had given a tour of a Soviet missile launch facility four years earlier. (“We gawked at what he showed us as if we were sheep seeing a new gate for the first time,” recalled Khrushchev, who liked to portray himself as a simple farmer. “We were like peasants in a market place. We walked around the rocket, touching it, tapping it to see if it was sturdy enough—we did everything but lick it to see how it tasted.”) Although Khrushchev was loath to waste military resources, he appreciated that a satellite launch might improve the morale of his best rocket designer—who was, after all, literally a beaten man—while making it appear that the USSR had leapfrogged the United States technologically. So Khrushchev gave it the nod, and on October 4, 1957, Korolyov’s team put Sputnik into orbit. “The freed and conscientious labor of the people of the new socialist society,” crowed Pravda, “makes the most daring dreams of mankind a reality.” Korolyov’s childhood dreams had come true, but he had never entirely recovered from the physical and psychological abuse he suffered in the Gulag, and he died in January 1966, at fifty-nine.
The American public was flabbergasted. Sputnik “created a crisis,” recalled James Killian, the first White House science advisor. “Confidence in American science, technology, and education suddenly evaporated.” Gliding overhead like a beeping Sword of Damocles—it did nothing but beep, since it contained nothing but a radio transmitter—Sputnik suggested to politicians and pundits alike that the United States, awash in a hedonistic brew of martinis, bikinis, and Cadillacs sporting tailfins larger than slabs of barbecued ribs, was losing out to the stern efficiency of totalitarian technology. “The communists have established a foothold in outer space,” Lyndon Johnson warned his colleagues in Congress. “It is not very reassuring to be told that next year we will put a better satellite into the air. Perhaps it will also have chrome trim and automatic wind-shield wipers.” The journalist Edward R. Murrow wrote, three days after the Sputnik launch:
It is to be hoped that the explosion which flung the Russian satellite into outer space also shattered a myth. That was the belief that scientific achievement is not possible under a despotic form of government…. We failed to recognize that a totalitarian state can establish its priorities, define its objectives, allocate its money, deny its people automobiles, television sets and all kinds of comforting gadgets in order to achieve a national goal. The Russians have done this with the intercontinental ballistic missile, and now with the Earth satellite.
Murrow added that Americans should have learned “from the Nazis that an unfree science can be productive.”
But that was not the lesson to be learned, because the Americans had not fallen behind. In many ways they were ahead—a secret that Eisenhower grimly kept while a campaigning John F. Kennedy accused the administration of having permitted a “missile gap” to loom between the United States and the USSR. There was no missile gap. Eisenhower’s real reason for staking the nation’s space plans on the nonmilitary Vanguard project was that he understood the intelligence potential of secret spy satellites, which were being developed by the military, but feared that their deployment would provoke an international incident should the Soviets view them as violators of Russian airspace. Meanwhile the Americans had to rely for their aerial intelligence on U-2 spy planes, which normally flew at altitudes above the reach of Russian air defenses. In 1960, the final year of Eisenhower’s presidency, the Soviets managed to shoot down a U-2, capturing its pilot alive and its surveillance cameras intact. Eisenhower felt obliged to insist that the U-2 was a “weather plane,” creating the embarrassing spectacle of an American president lying to his people while a Soviet premier for once told his people the truth. Eisenhower had hoped that the scientific Vanguard project would establish a legal precedent that orbiting satellites do not violate nations’ sovereignty. Ironically enough, Sputnik laid that worry to rest.
Von Braun and his team at the American rocket center in Huntsville, Alabama, had been insisting since 1952 that they could launch a satellite on demand, using existing military rockets, and were frustrated by Eisenhower’s reluctance to do so. On the day Sputnik went up, von Braun sought out Neil McElroy, recently nominated to be secretary of defense, at a cocktail party at the Huntsville officers’ club and repeated this claim. “We have the hardware on the shelf,” von Braun told McElroy. “For God’s sake turn us loose and let us do something. We can put up the satellite in sixty days, Mr. McElroy! Just give us a green light and sixty days!” General John Bruce Medaris, head of the Army Ballistic Missile Agency, interjected a note of moderation. “No, Wernher,” he said. “Ninety days.”
In the end it took 119 days to launch Explorer, the first American satellite, on January 31, 1958. The rocket was stock, the satellite hastily constructed by scientists and engineers at the Jet Propulsion Laboratory in California. “We liked the difference between our satellite and Sputnik,” said one of the conspirators. “Ours flew science.” Indeed, Explorer made the first scientific discovery in space, when data sent back by its detectors suggested the presence of the doughnut-shaped bands of intense radiation surrounding Earth known today as the Van Allen belt—after the physicist James Van Allen, who designed the instruments involved. Meanwhile a Vanguard test on December 6, promoted by the White House to the status of an actual launch attempt and broadcast live on network television, failed in spectacular fashion: The slim rocket rose only four feet before subsiding into a blooming fireball, its nose cone toppling off and its gleaming chrome-plated satellite rolling piteously across the ground. A subsequent Vanguard launch succeeded—hurled high, as if from pent-up frustration, the satellite remains in orbit to this day—but only after the first attempt had been dubbed “Flopnik” and “Stayputnik” in the newspaper headlines, adding to an American inferiority complex that deepened in 1961 when Yuri Gagarin became the first human to orbit the earth.
Spaceflight enthusiasts had been arguing since the start of the Cold War that a space race between the two superpowers could function as a peaceful, nonmilitary arena of technological competition. Now something like that was actually happening. President Kennedy said as much in his speech of May 25, 1961, committing the nation to “the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” The idea, Kennedy told Congress, was to win the Cold War by winning the space race—“To win the battle that is now going on around the world between freedom and tyranny” so that “men everywhere” could decide “which road they should take.” The Americans prevailed, landing on the moon five months ahead of Kennedy’s deadline, and in the process a lot of exploration and some science got done. Even the rosy scenario of international cooperation was eventually realized, when the opening decade of the twenty-first century saw the stodgy but reliable rockets of a postcommunist Russia ferrying astronauts, cosmonauts, and supplies to an International Space Station, much as Tsiolkovsky, Goddard, Oberth, and Korolyov had dreamed.
The Soviet nuclear weapons program was launched on a river of espionage opened up by the Lend-Lease Act, under which the United States dispatched $11 billion worth of materiel to its Soviet allies—much of it flown out from Gore Field in Great Falls, Montana, under the supervision of an Army Air Force officer named George Racey Jordan. In his book Dark Sun: The Making of the Hydrogen Bomb, Richard Rhodes recounts Jordan’s tale of how he became suspicious of all the black patent-leather suitcases, bound with sash cords and sealed with red wax, that he saw dispatched to Moscow. “The units mounted to ten, twenty and thirty and at last to standard batches of fifty, which weighed almost two tons and consumed the cargo allotment of an entire plane,” Jordan recalled. “The [Soviet] officers were replaced by armed couriers, traveling in pairs, and the excuse for avoiding inspection was changed from ‘personal luggage’ to ‘diplomatic immunity.’” When the Russians invited Jordan to a vodka-soaked dinner one night in March 1943, his suspicions sharpened, and he broke away from the endless rounds of toasts to inspect a C-47 cargo plane whose crew was demanding immediate clearance to take off for Moscow. Arriving at the airstrip, he pushed aside a “burly, barrel-chested Russian” who tried to block his entrance to the plane and found “an expanse of black suitcases” filling its cavernous cargo hold. He summoned an armed GI and started cutting open the suitcases. They were full of documents. Puzzled, he wrote down the words in them that he did not understand: “Uranium 92—neutron—proton and deuteron—isotope—energy produced by fission….” Jordan soon realized that he had been witnessing not only the departure of “a tremendous amount of America’s technical know-how to Russia” but also the use of Lend-Lease planes to import Soviet agents into the United States:
The entry of Soviet personnel into the United States was completely uncontrolled. Planes were arriving regularly from Moscow with unidentified Russians aboard. I would see them jump off planes, hop over fences, and run for taxicabs. They seemed to know in advance exactly where they were headed, and how to get there.
In addition to running its own agents, the Soviet intelligence establishment recruited American and European scientists and intellectuals who viewed Russia as the prime mover in driving back fascism—credit the Russians certainly deserved, having lost 8.6 million troops and more than 10 million civilians in the war—and who thought of communism as the wave of the future. Some of these amateur spies were astoundingly naïve. David Greenglass, a machinist who spied for the Soviets while working in the nuclear facilities at Oak Ridge and Los Alamos, wrote to his wife, Ruth, in 1944:
I have been reading a lot of books on the Soviet Union. Dear, I can see how far-sighted and intelligent those leaders are. They are really geniuses every one of them [and] I have come to a stronger and more resolute faith and belief in the principles of Socialism and Communism.
Klaus Fuchs, a theoretical physicist who joined the Communist Party in the early thirties out of opposition to Hitler and later sent nuclear secrets to the Soviets from a British laboratory at Birmingham, said blithely, “It was always my intention, when I had helped the Russians to take over everything, to get up and tell them what is wrong with their system.” Harry Gold, a Manhattan Project spy, claimed that he decided “to do everything possible to strengthen the Soviet Union” after a drunken anti-Semite called him a “sheeny bastard” in the smoking room of the Philadelphia train station. Another spy, Julius Rosenberg, boasted to Greenglass in 1943 that his “powerful friends” in Moscow would set him up in a phony “screen” business after the war. “Victory shall be ours and the future is socialism’s,” he wrote to his wife, Ethel.
Such escapades ended badly for some of the spies—Greenglass served ten years in prison, Fuchs nine years, and Harry Gold fifteen years, while Julius and Ethel Rosenberg died in the electric chair at Sing Sing on June 19, 1953—but the espionage proved invaluable to the Soviets. By 1942, having seen enough secret files about nuclear science and engineering to conclude that something was afoot, the secret police chief Lavrenti Beria secured Stalin’s agreement to try to build a bomb. Work began the following year, based on thousands of pages of captured documents: “The materials are magnificent,” declared People’s Commissar Vyacheslav Molotov in 1942, and they kept getting better. The bombing of Hiroshima on August 6, 1945, ended any lingering doubts. “Now that the Americans have invented it,” said Col. Nicolai Zabotin, the Soviet military intelligence chief in Ottawa, “we must steal it!” An entire city, named for Beria and devoted to nuclear research, was built on a site conveniently located near four Gulags whose laborers were pressed into service clearing the land. Stalin doubled the salaries of the physicists and chemists sent to work there.
The first Russian fission device (or A-bomb) was tested in 1949. Based on espionage, it was an exact replica of the American bomb dropped on Nagasaki. A Russian A-bomb of indigenous design was exploded two years later, soon followed by the fusion or H-bomb. (“H” stands for hydrogen; H-bombs work, as stars do, by fusing hydrogen atoms to make helium and releasing energy in the process.) The Americans had hesitated about developing hydrogen bombs, owing both to the moral concerns of some scientists—there is no upper limit to the power that can be released by nuclear fusion, as a glance at the sun will verify—and to the fact that half the Los Alamos staff had departed at the end of the war. But President Truman in January 1950 directed that the work proceed apace, and the first H-bomb was detonated on Eniwetok Atoll in the Pacific on November 1, 1952. Seven hundred times more powerful than the A-bomb dropped on Hiroshima, it resembled a visiting star. When the Soviets exploded their H-bomb, less than a year later, the arms race was indisputably on. Following the collapse of the Soviet Union the United States cut its nuclear arsenal in half and funded a program that eliminated more than ten thousand Russian nuclear warheads, recycling their fissile materials for use in power plants.
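As for how fusion liberates its energy, a rough calculation tells the story. Taking the sun’s net reaction as an illustration (bombs actually fuse the heavy hydrogen isotopes deuterium and tritium, but the principle is the same), four hydrogen nuclei outweigh the single helium nucleus they become, and the missing mass emerges as energy in accordance with Einstein’s $E = mc^{2}$:

$$\Delta m = 4m_{\mathrm{H}} - m_{\mathrm{He}} \approx 4(1.0078\ \mathrm{u}) - 4.0026\ \mathrm{u} \approx 0.029\ \mathrm{u}, \qquad E = \Delta m\,c^{2} \approx 27\ \mathrm{MeV}$$

About 0.7 percent of the fuel’s mass is converted outright, which makes fusion millions of times more energetic, pound for pound, than any chemical explosive.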
By rights the end of the Soviet Union should have ended communism, which had lost the space race, the nuclear arms race, and every other major scientific and technological competition it entered; had failed to win the loyalty of the peoples it ruled or of those beyond; and had committed a multitude of hideous crimes against humanity. Yet communism still controlled the destinies of more than a billion Chinese, and so provided another unbidden test of the status of science under authoritarian rule.
The Chinese have a long tradition of technological innovation (mechanical printing, the magnetic compass, the world’s first seismograph) and of such protoscientific investigations as charting the night sky. In recent years Chinese researchers working abroad have made important scientific contributions, with three Chinese physicists winning Nobel Prizes—Chen-Ning Franklin Yang and Tsung-Dao Lee, who took their doctorates at the University of Chicago, and Daniel C. Tsui, who likewise earned his doctorate at Chicago and then worked at Bell Labs. China’s indigenous scientists might have contributed much more to science by now, had their nation enjoyed liberal governance. But instead, China continued to labor under a Communist Party that claimed to be “guided by scientific theories,” and “to ensure that decision making is scientific and democratic,” but which by “scientific” meant Marxist and by “democratic” meant communist. The education of generations of Chinese scientists was blunted by mind-numbing classroom indoctrination in communist doctrine, and those who emerged with any freedom of thought intact had to choose between keeping silent and being oppressed as “dissidents.” As Alfred Zee Chang of the Library of Congress noted in 1954, just four years after the Chinese Communist Party came to power:
It is apparent that the communists have little appreciation of the talents and ability of leading scientists. In fact, these scientists, with their objective method of reasoning, represent a potential threat to communism…. Indigenous technology and the scientific approach on the China mainland is in a highly static condition today, and the future of science in this area presumably will be that contained within the general Soviet scientific pattern.
The Chinese communists, like the Russians, were eager to demonstrate how Marxist methods could transform agriculture. Mao Zedong’s “Great Leap Forward” of the late fifties did just that, and the result was the greatest famine in recorded history: Some twenty million people perished. Survivors got by on a government ration of 1,200 calories a day (less than was provided by the Nazis to slave laborers at Auschwitz) while others resorted to cannibalism or took the government’s advice and tried to live on chlorella, a freshwater alga grown in urine. As in the USSR, the Chinese communist leadership continued to export food during the famine, in order to be able to boast of the remarkable productivity of their socialist farms. The State Statistical Bureau, which might otherwise have reported that millions were dying of hunger, was disbanded and replaced with “good news reporting stations.” And, as in the USSR, the Chinese campaigns of agrarian reform were characterized by an overt disdain for science. Stalin during his first five-year plan waved away the warnings of scientists, who were derided as “bourgeois specialists.” Lectures were preceded by the singing of an anthem to “the eternal glory of Academician Lysenko,” whose campaign “protects us from being duped by Mendelist-Morganists”—a reference to Gregor Mendel and Thomas Hunt Morgan, two founders of genetics. In China in 1958, a Mao toady named Kang Sheng mounted a national lecture tour to advise the public that if schoolchildren took “action” rather than thinking or reading, they could revolutionize genetics and launch earth satellites:
Science is simply acting daringly. There is nothing mysterious about it…. There is nothing special about making nuclear reactors, cyclotrons or rockets. You shouldn’t be frightened by these things: As long as you act daringly you will be able to succeed very quickly.
Many knowledgeable Chinese could foresee the dire consequences of following such advice, but few objected to it. An exception was the defense minister, Marshal Peng Dehuai, who unlike Mao had grown up in poverty and had seen members of his own family starve to death. “I have experienced famine,” he wrote Mao in 1959.
I know the taste of it and it frightens me! We have fought decades of war and the people, poorly clothed and poorly fed, have spilt their blood and sweat to help us so that the Communist Party could win over the country and seize power. How can we let them suffer again, this time from hunger?
Mao was unperturbed. Death, he had told the May 1958 party congress, is “to be rejoiced over…. We believe in dialectics, and so we can’t not be in favor of death…. There should be celebration rallies when people die.” He forbade mourning the dead, since corpses “can fertilize the ground.” In response to Marshal Peng’s letter, Mao convened a meeting of the Politburo and gave them a speech worthy of Stalin. Progress would take time, he said: “When you eat pork, you can only consume it mouthful by mouthful, and you can’t expect to get fat in a single day. Both the Commander-in-Chief [Zhu De] and I are fat, but we didn’t get that way overnight.” The dire warnings of the economists were to be ignored:
Why can’t our commune cadres and peasants learn something about political economy? Everybody can learn. Those who cannot read may also discuss economics, and more easily than the intellectual.
For writing his letter, Peng Dehuai was condemned as a “right opportunist” and put under house arrest. Interrogated and tortured during the Cultural Revolution, he died in prison in 1973.
The Cultural Revolution was promulgated on August 8, 1966, by the central committee of the Chinese Communist Party, which promised that it would launch “a new stage in the development of the socialist revolution in our country” and “meet head-on every challenge of the bourgeoisie.” In practice this meant that thousands of scientists, authors, artists, and intellectuals were shot, beaten to death, or defenestrated by the Red Guards and other thugs, many of them students encouraged to humiliate or assault their teachers. Although no accurate census of the carnage exists, it is estimated that roughly one million people died in the Cultural Revolution and that an equal number were permanently injured. Because the party uses the term “science” to mean Marxism-Leninism, this campaign to murder scholars and destroy Chinese cultural traditions has since been blamed in China on “scientism”—a word originally coined by right-wing French reactionaries to describe the doctrine that science is superior to other systems of thought.
Meanwhile the Chinese struggled to surge ahead in real science, with Mao ordering that scientists and engineers “must be absolutely protected” from persecution, and that those educated in Europe or America were to be “neither labeled nor denounced.” Top scientists were afforded the privileges enjoyed by all but the very highest government officials; during the famines, for instance, they were given cherished soybeans. But the result, as in Russia, was to turn many of them into vocal critics of the communist regime.
The physicist Fang Lizhi, often called the Chinese Sakharov, entered Peking University at sixteen and rose to become vice president of the University of Science and Technology, but when he began speaking out against socialism he was dismissed from his posts and from the Chinese Communist Party for fomenting “bourgeois liberalization.” Sent off for reeducation on a communal farm, Fang spent a year of solitary confinement in a cowshed with only a single book to read, Landau and Lifshitz’s The Classical Theory of Fields. Studying it, he began thinking seriously about cosmology, the science concerned with the history and structure of the universe. This otherworldly pursuit brought further confrontations with the authorities. Lenin and Engels had decreed that the universe must be infinite. Otherwise, they thought, the universe would have a boundary, whereas Lenin had declared that “dialectical materialism insists…on the absence of absolute boundaries in nature.” Actually, as the non-Euclidean geometers had already demonstrated mathematically and as Einstein would demonstrate in physics, the universe can be both finite and unbounded—just as the surface of the earth is finite and unbounded. To understand this is fundamental to scientific cosmology, but when Fang taught it to his students he was rebuked in a physics journal for the “sheer folly” of introducing alternatives to the communist world model. “We must ferret out and combat every kind of reactionary philosophical viewpoint in the domain of scientific research,” the article concluded, “using Marxism to establish our position in the natural sciences.”
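A little geometry makes Fang’s point concrete. The two-dimensional surface of a sphere has a finite area yet no edge; Einstein’s 1917 model of a closed universe is its three-dimensional analogue, a “3-sphere” of finite volume that likewise has no boundary:

$$A_{\text{sphere}} = 4\pi R^{2}, \qquad V_{\text{3-sphere}} = 2\pi^{2} R^{3}$$

A traveler on either can move straight ahead forever without meeting a wall, eventually returning to the starting point: finite, yet unbounded, which is precisely the possibility Lenin’s decree failed to imagine.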
It didn’t help that Fang habitually spoke and wrote about democracy and liberalism as if he were living in a free country. “Many of us who have been to foreign countries to study or work agree that we can perform much more efficiently and productively abroad than in China,” he said. “Foreigners are no more intelligent than we Chinese. Why, then, can’t we produce first-rate work? The reasons for our inability to develop our potential lie within our social system.” Blamed for fomenting the 1989 demonstrations in Tiananmen Square—where students erected a Statue of Liberty and rallied to cries of “Science and Democracy!” before being mowed down by government troops—Fang was persuaded to take refuge in the U.S. embassy. He was there when his speech accepting the Robert F. Kennedy Human Rights Award was read in absentia on American network television. “The universe has no center,” it said. “Every place in the universe has, in this sense, equal rights. How can the human race, which has evolved in a universe of such fundamental equality, fail to…build a world in which the rights of every human from birth are respected?” Fang and his wife escaped to America the following year, Fang joining the faculty at the University of Arizona and composing several evenhanded essays about science and liberal democracy. One of them, written with the Princeton University Sinologist Perry Link, identifies five links between science and liberalism as seen from a Chinese perspective:
1. Science begins with doubt. “In order to make a scientific advance, one must begin by wondering about the received version of things…. This is also called ‘dissent’ [but] even an elementary course in physics takes up the problem of error. Students are taught that the only defense against error lies in the scientist’s willingness always to question.”
2. Doubt leads to independence. While studying at Beijing University, Fang realized that “the path from physics to democracy began with ‘independence of thought’” and “reached the conclusion that science placed the burden of finding truth upon each individual person.”
3. Science is egalitarian. “Objective scientific truth is something that lies beyond the variety of subjective views; statements of objective truth are formed only by a consensus of many observers, and are confirmed by independently repeating experimental results. No single observer is privileged; anyone may form hypotheses, and any hypothesis has to be tested by others before ‘truth’ emerges.” As in democracy, “One person’s vote, like his or her subjective view, contributes to a public consensus without determining it…. Just as everyone stands equal before the truth, similarly everyone should be equal before the law.”
4. Science needs a free exchange of information. In the words of the historian of science Xu Liangying, scientific research requires “an atmosphere of freedom conducive to exploration…. Political democracy and academic freedom are necessary to guarantee the flourishing of science.”
5. Science is universal. “There is no such thing as Chinese or Indian science—or, as the Nazis once seriously claimed, ‘German science.’” Fang notes that although his students at the University of Arizona come from many nations, he teaches them all the same relativity and quantum mechanics, and that “what they learn holds in exactly the same way everywhere, even light-years away.” As he said in 1989, “In science, we approach a situation by asking if a statement is correct or incorrect, if a new theory is an improvement over an old one. These are our criteria. We do not ask if a thing originates with our race or nationality…. Where it comes from is irrelevant. There are no national boundaries in scientific thought.”
Like any other totalitarian ideology, communism must pretend to enjoy a total command of the truth; otherwise it could not justify its claimed authority over every aspect of people’s lives. (Fang: “Authoritarianism needs authoritative statements in order to exercise its power.”) That is why millions of students had to memorize Lenin’s ignorant assertions about the universe, why Stalin was described in Russia as “the greatest genius of mankind,” why Mao’s stultifying “Little Red Book” was treated as a work of wisdom on a par with Confucius, and how North Korea’s Kim Jong-il came to be described as not only “the fatherly leader” but as a poet, philosopher, historian, opera composer, film director, and the world’s best golfer. When the Chinese premier Zhao Ziyang said during a visit to Italy, “Were it not for Copernicus, we still wouldn’t know that the earth is round,” his entirely forgivable mistake was repeated by his translator and reported without correction by the editors of the People’s Daily, presumably out of fear of what would happen to them should they reveal that the great leader was flawed in any way. In the liberal-democratic world, everybody appreciates that politicians and scientists are ordinary mortals who sometimes say dumb things. Such mistakes seldom matter much, because the democracies, like the sciences, are broad-based and largely self-correcting; a democratic nation can limp along even when its chief of state is widely understood to be a lazy, bumbling simpleton. Fallible leadership is the only kind of leadership any nation ever has. Since totalitarians cannot afford to admit this, their domains start and end in fantasy.
It often happens that new scientific knowledge not only builds on prior knowledge but also exposes prior ignorance and error. This process poses few problems for democracies, since they incorporate fallibility as a given, but it leaves totalitarian leaders clinging to outmoded doctrines in a changing world. Hence Chinese communism became increasingly antiquated as the sum of scientific knowledge grew. The China scholar H. Lyman Miller, of Johns Hopkins University, notes in a study of scientific political dissent in post-Mao China that “Marxism-Leninism was formulated originally on the basis of nineteenth-century science, and the Chinese version of it was updated not much beyond Lenin. So the chasm between philosophical doctrine and scientific theory was wide”—and it kept getting wider. Even more alarming, from a communist point of view, was the fact that science in itself exemplifies democratic values. The scientific ethos “is inherently antiauthoritarian,” writes Miller.
Just as the scientific community operates according to antiauthoritarian norms of free debate…so science prospers in an external environment that similarly tolerates pluralism and dissent…. Scientific dissidents espoused a strong form of liberal political philosophy that grew out of the norms of their profession.
The Chinese people, as creative and energetic as any in the world, came increasingly to resemble prisoners in a decaying castle, from which they could distantly see the wider world becoming healthier, wealthier, and more knowledgeable than the folks at home.
Their isolation began to end in 1978, when the government finally started inching away from a totally planned economy by introducing limited free-market reforms. The result was an era of double-digit economic growth: China’s GDP soared by an average of 10.3 percent annually during the 1980s and by 9.7 percent from 1990 through 2002. Thanks to these gains, China by 2008 had become the world’s fourth most productive nation (although, being large and recently poor, it ranked only one-hundredth in per capita GDP) and the party’s efforts to keep it cut off—as by throwing up a “great firewall” limiting Internet access—looked increasingly quaint.
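Compounding makes plain what growth rates like those imply. Treating the reported decade averages as steady annual growth,

$$1.103^{10} \approx 2.7, \qquad 1.097^{13} \approx 3.3, \qquad 2.7 \times 3.3 \approx 8.9,$$

so the Chinese economy grew to some 2.7 times its size during the 1980s, grew 3.3-fold again from 1990 through 2002, and emerged from the first quarter century of reform nearly nine times larger than it began.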
Without denying any of the credit due to the Chinese people for the hard work that went into their economic advances, it should be said that for a nation today to liberalize its trade policies means that it is joining an international economic system constructed by the world’s liberal democracies—a club devoted to economic science and economic liberalism. The benefits that it reaps are therefore made possible by global liberalism, regardless of whether the nation in question has liberalized internally, as China circa 2008 assuredly had not.
Will economic liberalism lead the Chinese to political liberalism as well? One way to gauge whether this is happening is to look at China’s scientific productivity. China increased its investment in scientific research and development to an historic high in 2001, when it spent 1.10 percent of its GDP on R&D. This remained well below the R&D spending rates of wealthy liberal democracies like the United States, which spent 2.74 percent during the same period, or Japan (3.06 percent), but it was impressive for a nation whose per capita GDP was only a thirtieth of the American and Japanese. Rich nations tend to spend a higher proportion of their GDP on scientific research than do poor countries, which often lack the educational infrastructure to support world-class research and which feel that they have more pressing problems to solve first. Yet China’s scientific citation rates remained far below what would be expected from a nation of its size and history of intellectual attainment, in part because the Communist Party continued to stress applied science over pure research.
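Those R&D percentages, moreover, understate the absolute gap. Per-person research spending is the R&D share of GDP multiplied by per capita GDP, so with China’s per capita GDP roughly a thirtieth of America’s,

$$\frac{1.10\%}{2.74\%} \times \frac{1}{30} \approx \frac{1}{75},$$

meaning that China was spending on the order of one seventy-fifth as much per person on research as the United States: impressive progress, but a long road still ahead.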
Even Chinese Communist Party officials eventually began conceding that political liberalization would come, once the Chinese were “ready.” This argument may have some merit, if “ready” means rich enough. During the past century, democracies established in nations with a per capita GDP of $1,500, as was China’s in 2008, have on average survived for less than a decade. In any event, this book predicts that no matter how rapidly the Chinese economy grows, China will become scientifically prominent only once its people have won political—and not just economic—freedom.
The Chinese Communist Party has begun experimenting to a limited extent with grassroots democracy, but it describes even these modest efforts in language worthy of Alice in Wonderland. The 2005 party report on democratic reform, titled “Bid to Build Democracy Comes to Fruition,” stated:
Through painstaking exploration and hard struggle, the Chinese people finally came to realize that mechanically copying the Western bourgeois political system and applying it to China would lead them nowhere.
This is vintage rhetoric from communist functionaries, who are always talking about how hard they work and struggle but whose toil seems mainly aimed at degrading the meanings of words. By “the Chinese people” they mean the party; to “lead them nowhere” means that if free and fair elections were to be held, the communists would lose. So there won’t be many such elections; this is called “building democracy.” By way of historical background, the report states that “in 1921, some progressive intellectuals who had studied the ideology of democracy and science combined Marxism and Leninism with the Chinese workers’ movement, and founded the CPC.” In fact, the Chinese Communist Party was founded at the behest of, and with funding from, the Soviet Union. “The CPC creatively combines the general truth of Marxism-Leninism with the actual situation of the Chinese revolution, setting out such democratic concepts as ‘democracy for the workers and peasants,’ ‘people’s democracy,’ and ‘new democracy.’” The reference to the “general truth” of Marxism-Leninism means that it is false, but that the party can bring itself to admit to only a few of its many falsehoods. When the party says that it is “guided by scientific theories,” it means communism. Its promise “to ensure that decision making is scientific and democratic” means that elections will be confined to candidates selected from party ranks.
In discussing communism, this book has focused on monsters like Stalin and Mao. Many other monsters could be added to the account—men like Lavrenti Beria, head of the Soviet nuclear weapons program, a sadistic rapist of Russian actresses and athletes who, when asked by Sakharov, “Why do we always lag behind the USA and other countries, why are we losing the technological race?” blandly replied, “Because we lack R&D and a manufacturing base” before proffering his “plump, slightly moist, and deathly cold” hand in farewell. Such an account has little to say about the millions of well-intentioned communists who joined the party out of a genuine desire to assuage the sufferings of the poor and to help deliver humankind from feudal ignorance and superstition into what they thought would be an age of science and social justice. But that is precisely the point: Good intentions didn’t matter. Communism was inherently incapable of preventing the rise of monsters and of voting them out of office once they got there. Constitutionally unable to respond to experimental results, the communist regimes ultimately became failed experiments. As Boris Yeltsin, Russia’s first popularly elected president, fatalistically declared:
It was decided to carry out this Marxist experiment on us—fate pushed us precisely in this direction. In the end we proved that there is no place for this idea. It has simply pushed us off the path the world’s civilized countries have taken.
An eighty-year-old communist in Hanoi put it more simply: “From the bottom of my heart, we didn’t want to do any bad things. We tried to be good. But it all became such a mess.”
At least three lessons may be learned from all this.
First, while science and liberalism encourage experimentation, social experiments must be undertaken only with the permission of the citizens involved, must be consistent with the legal rights of all, and must remain vulnerable to repeal if they fail to attain the wished-for results. None of these provisions were fulfilled by the fascist and communist regimes, which were “experiments” only in the tragic sense employed by Boris Yeltsin. The essence of scientific experimentation is to include a feedback loop through which the researcher examines the data, draws conclusions, and then alters or ends the experiment accordingly. There is, fortunately, a system of government that fulfills these conditions: liberal democracies and free markets conduct experiments—and respond, however imperfectly, to the results—every day. The outcomes are not ideal, but they are better than all the known alternatives.
Second, any proposed social system that fails to provide such a feedback loop—that does not, for instance, provide for free and fair elections—should be rejected out of hand. It is utterly irresponsible to hand over power to any movement not vulnerable to such review, regardless of its putative qualifications. Especially to be rejected are utopian claims that people, by surrendering their freedom, shall enter into a wonderful new age in which everybody will be changed for the better—as when Leon Trotsky promised that through communism a new “superman” would emerge, “incomparably stronger, wiser, more subtle” than any seen before. This is not to say that new and somehow better sorts of persons will never appear; in a sense this happens all the time. But no peoples may justifiably barter away their fundamental rights and those of their descendants in the hope of such a development, or for any other reason. This tenet, central to liberalism, is paralleled in science by the demand that new theories ought to answer to the data of past experimentation and observation; no new scientific theory is apt to find much acceptance if it claims to break with everything that has already been learned.
Finally, ends cannot be separated from means. If the means are illiberal, or immoral, or fly in the face of common sense, they are not to be justified by recourse to a predicted outcome. To proceed as if this were not the case is to put too much faith in our ability to predict the future. Science is better at making accurate predictions than any other system of thought, but even science makes no claim to the sort of knowledge that would be required to lead humans into darkness on the promise of eventual light. The very least that can be done to honor the memory of those who died under political repression is to act today in ways that can be justified today, without borrowing against an imagined tomorrow.