32
HEAVEN AND EARTH

The words historic moment have been heavily overused in this century. But if any moment outside war can be described as truly historic, it surely occurred at twenty seconds after 3:56 AM BST on Monday, 21 July 1969, when Neil Armstrong stepped off the ladder that led from the ‘Eagle,’ the landing module of the Apollo 11 spacecraft, and on to the surface of the Moon, making him the first person to set foot on a celestial body other than Earth. As he did so, he spoke the words that have since become famous: ‘That’s one small step for man – one giant leap for mankind.’1

For the benefit of scientists back at Mission Control in Houston, he then went on in a more down-to-earth, scientifically informative way: ‘The surface is fine and powdery, I can … I can pick it up loosely with my toe. It does adhere in fine layers like powdered charcoal to the sole and sides of my boots. I can only go in a small fraction of an inch. Maybe an eighth of an inch, but I can see the footprints of my boots and the treads in the fine sandy particles…. There seems to be no difficulty in moving around, as we suspected…. We’re essentially on a level place here – very level place here.’2 If the greatest intellectual achievement of the first half of the twentieth century was undeniably the conception and construction of the atomic bomb, the achievements of the second half were more diverse, including the isolation and understanding of DNA, and the computer. But space travel and the moon landing certainly rank among the greatest achievements of the century.

After the Russians had sprung their surprise in 1957, stealing a march on the United States with the launch of Sputnik 1, they had built on their lead, putting the first animal in space, the first man (Yuri Gagarin, in 1961), and the first woman (Valentina Tereshkova, in 1963). The United States responded with something close to panic. President Kennedy called a frenzied meeting at the White House four days after the Bay of Pigs disaster (when 1,500 Cuban exiles, trained in America by the U.S. military, invaded the island, only to be killed or captured). Railing that ‘something must be done,’ he had shouted at Lyndon Johnson, the vice president, ordering him to find out if ‘we have a chance of beating the Soviets by putting a laboratory in space, or by a trip round the Moon, or by a rocket to land on the Moon, or by a rocket to go to the Moon and back with a man?’3 The Americans finally put John Glenn into orbit on 20 February 1962 (Alan Shepard had made a fifteen-minute suborbital flight in May 1961), and from then on they began to catch up, thanks to Kennedy’s commitment to the Apollo program with its aim to land a manned spacecraft on the Moon ‘before this decade is out.’4 The program, begun in 1963 (though NASA had been created in 1958), cost America up to $5 billion a year over the next ten years. That sum gives an idea of the size of the project, which involved, among other things, building a reliable spaceship bigger than a railway engine, designing and manufacturing a rocket heavier than a destroyer, and inventing several new materials.5 The project benefited from the brains of 400,000 people from 150 universities and 20,000 firms. We already know from Korolev that rocket technology lay at the heart of the space program, and the biggest U.S. rocket, Saturn 5, weighed 2,700 tons, roughly the same as 350 London buses. Developed under Wernher von Braun, another German émigré, Saturn was 364 feet high, had 2 million working parts, 2.5 million solder joints, and 41 separate engines for guidance purposes, and carried in all 11,400,000 gallons of fuel – liquid nitrogen, oxygen, hydrogen, and helium, some of it stored at minus 221 degrees centigrade to keep it liquid.6 The oxygen alone filled the equivalent of fifty-four railway container tanks.7 The Moonship contained the cone-shaped command module, the only part to come back to earth and which therefore needed to withstand the enormous temperatures on re-entry to the atmosphere (caused by friction at such high speeds).8 One of the main engineering problems was to keep the cryogenic fuels cool enough. The tanks eventually designed were so airtight that, were ice cubes to have been installed inside, they would not have melted for nine years. The exit hatch of the module needed 150 new tools to be invented. Some bolts had to be locked in place by two men using a five-foot wrench.9

No one really knew how conditions in space would affect the men.10 Great care was taken therefore with psychological selection and training. They were taught to be tolerant and careful (always avoiding acute angles where they might snag their suits), and they were given massages every day. The crews that advanced were those that had worked together in harmony for more than a year. Interestingly enough, over the years both the Americans and the Russians came up with a near-identical profile of the ideal astronaut: they should not be too old, no older than their late thirties, or too tall, no taller than five-eleven or six feet; they should be qualified jet and test pilots with degrees in engineering.11 Finally, there was the reconnaissance of the Moon itself. Quite apart from the prospect, in time, of colonising space and exploiting its minerals, there were sound scientific reasons for studying the Moon close up. Since it lacked an atmosphere, the Moon was in some senses in pristine condition, ‘a priceless antique,’ as one scientist called it, in much the same condition as it had been when the universe, or the solar system at least, had first evolved. Examination of the rocks would also help decide how the Moon formed – whether it was once part of Earth, or broke off with Earth from the sun after collision with an asteroid, or was formed by very hot gas cooling.12 Both American and Soviet space probes got progressively closer to the Moon, sending back better and better photographs until objects as small as five feet wide could be distinguished. Five areas were originally chosen for landing, then narrowed to one, the Sea of Tranquillity, actually a flat plain free of craters.13

The biggest disaster of the American program took place in 1967, when a spaceship caught fire on the launchpad at Cape Kennedy after liquid oxygen somehow ignited, killing all three men inside. The world never knew how many Russian cosmonauts perished because of the greater secrecy surrounding their program, but distress messages picked up by radio hams around the globe suggested that at least eight were in trouble between 1962 and 1967.14 The greatest drama before the moon landing itself was the December 1968 flight of Apollo 8 around the Moon, which involved going behind the Moon to the far side, which no human had ever seen, and meant that the crew would be out of radio contact with Mission Control for about half an hour. If the ‘burn’ of the engines was too strong, the spacecraft might veer off into deep space; if it was too weak, it might crash into the Moon, on the far side, never to be heard from again.15 The pope sent a message of goodwill, as did a number of Russian space scientists, acknowledging implicitly at this point that the Americans were now decisively ahead.

At 9:59 AM on Christmas Eve, Apollo 8 swung behind the Moon. Mission Control in Houston, and the rest of the world, waited. Ten minutes of silence passed; twenty; thirty. At 10:39 AM Frank Borman’s voice could be heard reporting data from his instruments. Apollo 8 was exactly on schedule and, as Peter Fairley narrates the episode in his history of the Apollo Project, after a journey of a quarter of a million miles, it had arrived on its trajectory within half a mile of the one planned.16

The scene was set for Apollo 11. Edwin ‘Buzz’ Aldrin Jr. joined Neil Armstrong on the surface of the Moon, where they placed a plaque and a flag, set up scientific experiments, and collected rock samples with specially designed tools that spared them having to bend. Then it was back into the ‘Lunar Bug,’ rendezvous with Michael Collins in the command module, and the return journey, splashing down near Johnston Island in the Pacific, where they were met by the USS Hornet with President Richard Nixon on board. The men had returned safely to Earth, and the space age had begun.17

The landing on the Moon was, however, in some ways a climax rather than a debut. Crewed flights to the Moon continued until 1972, but then stopped. As the 1970s wore on, space probes went deeper into the heavens – Venus, Mars, Mercury, Jupiter, the Sun, Saturn – with Pioneer 10, launched in 1972, becoming the first manmade object to leave the solar system, which it did in 1983. Actual landings were considered less necessary after the first flush of excitement, and both the Americans and the Russians concentrated on longer flights in orbit, to enable scientists to carry out experiments in space: aboard the United States’ Skylab space station, astronauts spent eighty-four days in orbit in 1973–4. The first stage of the space age may be said to have matured around 1980. In that year, Intelsat 5 was launched, capable of relaying thousands of telephone calls and two television channels. And in the following year the Columbia, the first reusable space shuttle, was launched. In just ten years space travel had gone from being exotic to being almost mundane.

*

The space race naturally stimulated interest in the heavens in general, a happy coincidence, as the 1960s had in any case seen some very important advances in our understanding of the universe, even without the advantages conferred by satellite technology. In the first half of the century, apart from the development of the atomic bomb and relativity, the main achievement of physics was its unification with chemistry (as epitomised in the work of Linus Pauling). After the war, the discovery of yet more fundamental particles, especially quarks, brought about an equivalent unification, between physics and astronomy. The result of this consilience, as it would be called, was a much more complete explanation of how the heavens – the universe – began and evolved. It was, for those who do not find the reference blasphemous, an alternative Genesis.

Quarks, as we have seen, were originally proposed by Murray Gell-Mann and George Zweig, almost simultaneously, in 1964. It is important to grasp that quarks do not exist in isolation in nature (at least on Earth), but the significance of the quark (and certain other particles isolated in the 1960s and 1970s but which we need not describe here) is that it helps explain conditions in the early moments of the universe, just after the Big Bang. The idea that the universe began at a finite moment in the past had been accepted by most physicists, and many others, since Hubble’s discovery of the redshift in 1929, but the 1960s saw renewed interest in the topic, partly as a result of Gell-Mann’s theories about the quark but also because of an accidental discovery made at the Bell Telephone Laboratories in New Jersey, in 1965.

Since 1964, the Bell Labs had been in possession of a new kind of telescope. An antenna located on Crawford Hill at Holmdel communicated with the skies via the Echo satellite. This meant the telescope was able to ‘see’ into space without the distorting interference of the atmosphere, and that far more of the skies were accessible. As their first experiment, the scientists in charge of the telescope, Arno Penzias and Robert Wilson, decided to study the radio waves being emitted from our own galaxy. This was essentially baseline research, the idea being that once they knew what pattern of radio waves we were emitting, it would be easier to study similar waves coming from elsewhere. Except that it wasn’t that simple. Wherever they looked in the sky, Penzias and Wilson found a persistent source of interference – like static. At first they thought there was something wrong with their instruments. A pair of pigeons were nesting in the antenna, with the predictable result that there were droppings everywhere. The birds were captured and sent to another part of the Bell complex. They came back. This time, according to Steven Weinberg’s account published later, they were dealt with ‘by more decisive means.’18 With the antenna cleaned up, the ‘static’ was reduced, but only minimally, and it still appeared from all directions. Penzias discussed his mystery with another radio astronomer at MIT, Bernard Burke. Burke recalled that a colleague of his, Ken Turner of the Carnegie Institution of Washington, had mentioned a talk he had heard at Johns Hopkins University in Baltimore given by a young theorist from Princeton, P. J. E. Peebles, which might bear on the ‘static’ mystery. Peebles’s speciality was the early universe. This was a relatively new discipline and still very speculative. As we saw in chapter 29, in the 1940s an émigré from Ukraine, George Gamow, had begun to think about applying the new particle physics to the conditions that must have existed at the time of the Big Bang. He started with ‘primordial hydrogen,’ which, he said, would have been partly converted into helium, though the amount produced would have depended on the temperature of the Big Bang. He also said that the hot radiation corresponding to the enormous fireball would have thinned out and cooled as the universe expanded. He went on to argue that this radiation ‘should still exist, in a highly “red-shifted” form, as radio waves.’19 This idea of ‘relict radiation’ was taken up by others, some of whom calculated that such radiation should now have a temperature of 5 K (i.e., 5 degrees above absolute zero). Curiously, with physics and astronomy only just beginning to come together, no physicist appeared to be aware that even then radio astronomy was far enough ahead to answer that question. So the experiment was never done. And when radio astronomers at Princeton, under Robert Dicke, began examining the skies for radiation, they never looked for the coolest kinds, not being aware of their significance. It was a classic case of the right hand not knowing what the left was doing. When Peebles, a Canadian from Winnipeg, started his Ph.D. at Princeton in the late 1950s, he worked under Robert Dicke. Gamow’s theories had been forgotten but, more to the point, Dicke himself seems to have forgotten his own earlier work.20 The result was that Peebles unknowingly repeated all the experiments and theorising of those who had gone before. He arrived at the same conclusion, that the universe should now be filled with ‘a sea of background radiation’ with a temperature of only a few K. Dicke, who either still failed to remember his earlier experiments or didn’t realise their significance, liked Peebles’s reasoning enough to suggest that they build a small radio telescope to look for the background radiation.

At this point, with the Princeton experiment ready to start, Penzias called Peebles and Dicke, an exchange that became famous in physics. Comparing what Dicke and Peebles knew about the evolution of background radiation with the observations of Penzias and Wilson, the two teams decided to publish in tandem a pair of papers in which Penzias and Wilson would describe their observations while Dicke and Peebles gave the cosmological interpretation – that this was indeed the radiation left over from the Big Bang. Within science, this created almost as huge a sensation as the confirmation of the Big Bang itself.21 It was this pair of papers, published in the Astrophysical Journal, that caused most scientists finally to accept the Big Bang theory – not unlike the acceptance of continental drift after Eltanin’s sweep across the Pacific-Antarctic Ridge.22 In 1978, Penzias and Wilson received the Nobel Prize.

Long before then, there had been a synthesis, bringing together what was known about the behaviour of elementary particles, nuclear reactions, and Einstein’s theories of relativity to produce a detailed theory about the origin and evolution of the universe. The most famous summing up of these complex ideas was Steven Weinberg’s book The First Three Minutes, published in 1977, on which my account is chiefly based. The first thing that may be said about the ‘singularity,’ as physicists call Time Zero, is that technically all the laws of physics break down. Therefore, we cannot know exactly what happened at the moment of the Big Bang, only nanoseconds later (a nanosecond is a billionth of a second). Steven Weinberg gives the following chronology, which for ease of digestion by the layperson is set out here as a table.

After 0.0001 (10⁻⁴) seconds:
This, the original ‘moment of creation,’ occurred 15 billion years ago. The temperature of the universe at this near-original moment was 10¹² K, or 1,000 billion degrees (written out, that is 1,000,000,000,000 degrees). The density of the universe at this stage was 10¹⁴ – 100,000,000,000,000 – grams per cubic centimetre (the density of water is 1 gram per cubic centimetre). Photons and particles were interchangeable at this point.

After 0.01 (10⁻²) seconds:
The temperature was 100 billion K.

After 0.1 seconds:
The temperature was 30 billion K.

After 13.8 seconds:
The temperature was 3 billion K, and nuclei of deuterium were beginning to form. These consisted of one proton and one neutron, but they would have soon been knocked apart by collisions with other particles.

After 3 minutes, 2 seconds:
The temperature was 1 billion K (about seventy times as hot as the sun is now). Nuclei of deuterium and helium formed.

After 4 minutes:
The universe consisted of 25 percent helium and the rest ‘lone’ protons, hydrogen nuclei.

After 300,000 years:
The temperature was 6,000 K (roughly the same as the surface of the sun), when photons would be too weak to knock electrons off atoms. At this point the Big Bang could be said to be over. The universe expands ‘relatively quietly,’ cooling all the while.

After 1 million years:
Stars and galaxies begin to form, when nucleosynthesis takes place and the heavy elements are formed, which will give rise to the Sun and Earth.23
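The pattern in these figures is not accidental. A standard result for the radiation-dominated early universe – not spelled out in Weinberg’s table, but consistent with the entries above – is that the temperature falls roughly as the inverse square root of the time elapsed:

\[
T \propto t^{-1/2}, \qquad \frac{T(0.01\,\mathrm{s})}{T(0.1\,\mathrm{s})} \approx \sqrt{\frac{0.1}{0.01}} \approx 3.2,
\]

so a temperature of 100 billion K at one-hundredth of a second falls to roughly 30 billion K a tenth of a second after the Big Bang, just as the table records.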

At this point the whole process becomes more accessible to experimentation, because particle accelerators allowed physicists to reproduce some of the conditions inside stars. These show that the building blocks of the elements are hydrogen, helium, and alpha particles, which are helium-4 nuclei. These are added to existing nuclei, so that the elements build up in steps of 4 atomic mass units: ‘Two helium-4 nuclei, for example, become beryllium-8, three helium-4 nuclei become carbon-12, which just happens to be stable. This is important: each carbon-12 nucleus contains slightly less mass than three alpha particles which went to make it up. Therefore energy is released, in line with Einstein’s famous equation, E=mc², releasing energy to produce more reactions and more elements. The building continued, in stars: oxygen-16, neon-20, magnesium-24, and eventually silicon-28.’ ‘The ultimate step,’ as Weinberg describes it, ‘occurs when pairs of silicon-28 nuclei combine to form iron-56 and related elements such as nickel-56 and cobalt-56. These are the most stable of all.’ Liquid iron, remember, is the core of the earth. This narrative of the early universe was brilliant science but also a great work of the imagination, the second evolutionary synthesis of the century.24 It was more even than that, for although imagination of a high order was required, it also needed to conform to the evidence (such evidence as there was, anyway). As an intellectual exercise it was on a par with the ideas of Copernicus, Galileo, and Darwin.25
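To put a rough number on the mass difference mentioned in that quotation (the figures below are standard atomic masses, not ones given in the original text): three helium-4 nuclei together are slightly heavier than the carbon-12 nucleus they fuse into, and the difference emerges as energy via E=mc².

\[
3 \times 4.0026\,\mathrm{u} \approx 12.0078\,\mathrm{u}, \qquad \Delta m \approx 12.0078 - 12.0000 = 0.0078\,\mathrm{u},
\]
\[
E = \Delta m\, c^{2} \approx 0.0078 \times 931.5\,\mathrm{MeV} \approx 7.3\,\mathrm{MeV}
\]

– well under one percent of the mass involved, yet enough, repeated countless times inside stars, to drive the building of the heavier elements.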

But background radiation was not the only form of radio waves from deep space discovered in the 1960s. Astronomers had observed many other kinds of radio activity unconnected with optical stars or galaxies. Then, in 1963, the Moon passed in front of one of those sources, number 273 in the Third Cambridge Catalogue of the Heavens and therefore known as 3C 273. Astronomers carefully tracked the exact moment when the edge of the Moon cut off the radio noise from 3C 273 – pinpointing the source in this way enabled them to identify the object as ‘star-like,’ but they also found that it had a very large redshift, meaning it was well outside our Milky Way galaxy. It was subsequently shown that these ‘quasi-stellar’ objects, or quasars, form the heart of distant galaxies that are so far away that such light as reaches us left them when the universe was very young, more than 10 billion years ago. The rapid variations in their brightness, however, suggest that their energy emanates from a region roughly one light-day across, more or less the dimensions of the solar system. Calculations show that quasars must therefore radiate ‘about 1,000 times as much energy as all the stars in the Milky Way put together.’ In 1967 John Wheeler, an American physicist who had studied in Copenhagen and worked on the Manhattan Project, revived the eighteenth-century theory of black holes as the best explanation for quasars. Black holes had been regarded as mathematical curiosities until relativity theory suggested they must actually exist. A black hole is an area where matter is so dense, and gravity so strong, that nothing, not even light, can escape: ‘The energy we hear as radio noise comes from masses of material being swallowed at a fantastic rate.’26
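The reasoning behind that size estimate is the standard light-travel-time argument: a source cannot vary coherently faster than light can cross it, so variations on a timescale of about a day (the illustrative figure used here) confine the emitting region to roughly a light-day:

\[
d \lesssim c\,\Delta t \approx (3 \times 10^{8}\,\mathrm{m/s}) \times (8.64 \times 10^{4}\,\mathrm{s}) \approx 2.6 \times 10^{13}\,\mathrm{m} \approx 170\ \text{astronomical units},
\]

comparable, as the text says, to the dimensions of the solar system – an astonishingly small region for an output exceeding that of an entire galaxy.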

Pulsars were another form of astronomical object detected by radio waves. They were discovered – accidentally, like background radiation – in 1967 by Jocelyn Bell Burnell, a radio astronomer in Cambridge. She was using a radio telescope to study quasars when she stumbled on a completely unknown radio source. The pulses were extremely precise – so precise that at first the Cambridge astronomers thought they might be signals from a distant civilisation. But the discovery of many more showed they must be a natural phenomenon. The pulsing was so rapid that two things suggested themselves: the sources were small, and they were spinning. Only a small object spinning fast could produce such pulses, rather like a very rapid lighthouse beam coming round every so often. The small size of the pulsars told astronomers that they must be either white dwarfs, stars with the mass of the sun packed into a body the size of the earth, or neutron stars, with the mass of the sun ‘packed into a sphere less than ten kilometres across.’27 When it was shown that white dwarfs could not rotate fast enough to produce such pulses without falling apart, scientists finally had to accept that neutron stars exist.28 These superdense stars, midway between white dwarfs and black holes, have a solid crust of iron above a fluid inner core made of neutrons and, possibly, quarks. The density of neutron stars has been calculated by physicist John Gribbin as 1 million billion times greater than water, meaning that each cubic centimetre of such a star would weigh 100 million tons.29 The significance of pulsars being identified as neutron stars was that it more or less completed the sequence of stellar evolution. Stars form as cooling gas; as they contract they get hotter, so hot eventually that nuclear reactions take place; this is known as the ‘main sequence’ of stars. After that, depending on their size and when a crucial temperature is reached, quantum processes trigger a slight expansion that is also fairly stable – and the star is now a red giant. Toward the end of its life, a star sheds its outer layers, leaving a dense core in which all nuclear reactions have stopped – it is now a white dwarf and will cool for millions of years, eventually becoming a black dwarf, unless it is very large, in which case it ends as a dramatic supernova explosion, when it shines very brightly, very briefly, scattering heavy elements into space, out of which other heavenly bodies form and without which life could not exist.30 It is these supernova explosions that give rise to neutron stars and, in some cases, black holes. And so the marriage of physics and astronomy – quasars and quarks, pulsars and particles, relativity, the formation of the elements, the lives of stars – was all synthesised into one consistent, coherent story.31

Once one gets over the breathtaking numbers involved in anything to do with the universe, and accepts the sheer weirdness not only of particles but of heavenly bodies, one cannot escape the fact of how inhospitable much of the universe is – very hot, very cold, very radioactive, unimaginably dense. No life as we can conceive it could ever exist in these vast reaches of space. The heavens were as awesome as they had ever been, ever since man’s observation of the sun and the stars began. But heaven was no longer heaven, if by that was meant the same thing as paradise.

When the crew of Apollo 8 returned from their dangerous mission around the Moon, at the end of 1968, they gave a broadcast in which they treated earthlings to readings from the Bible. ‘And the Earth was without form, and void,’ read Frank Borman, quoting from Genesis.32 ‘And darkness was upon the face of the deep,’ continued Bill Anders. This did not please everyone, and the American television networks were swamped with calls from viewers complaining about the intrusion of religion at such a time. But you didn’t have to be much of a philosopher to see that the revolution in the study of the heavens, and the theories being propounded as a result of so many observations, both before the advent of satellites and since, could not be easily reconciled with many traditional religious ideas. Not only had man evolved; so had the very heavens themselves. The modern sciences of astrophysics and cosmology were not the only aspects of the modern world to bring about changes in religious belief, not by a long way. But they were not irrelevant either.

So far as the major religions of the world were concerned, there were three important developments after the end of World War II. Two of these concerned Christianity, and the third involved the religions of the East, especially India. (So far as Judaism and Islam were concerned, their problems were mainly political, arising from the creation of the state of Israel in 1948.) The upsurge of interest on the part of Westerners in the religions of the East is considered in the next chapter. Here we shall examine the two main areas of thought that taxed Christianity.

These may be simply put: the continuing discoveries of science, in particular archaeological discoveries in the Middle East, what some people called the Holy Land, and existentialism. In 1947, a year before Israel was founded, there occurred the most spectacular excavation of archaeological material since the unearthing of Tutankhamen’s tomb in 1922. This was the discovery of the so-called Dead Sea Scrolls at Qumran, which were first found in a cave by an Arab boy, Muhammad Adh-Dhib, chasing a wayward goat as it scampered up a rock face overlooking the inland sea. There were fewer parallels with the boys who discovered Lascaux than might at first appear, because events in the wake of Muhammad’s find involved far darker dealings. The area was highly unstable politically, and local traders and even religious leaders held back on the truth, hiding documents by burying them in soil so unsuitable that many were destroyed. It took months for the full picture to emerge, so that by the time trained archaeologists were able to visit the cave where Muhammad had first stumbled across the jars containing the scrolls, much of the context had been destroyed.33

Even so, the significance of the scrolls could not be minimised. Until that point, the last word on biblical archaeology had been F. G. Kenyon’s The Bible and Archaeology, published in 1940. Broadly speaking, this argued that the thrust of science had been to confirm the biblical account, in particular that Jericho, as the Bible said, had existed from around 2000 BC to 1400 BC and then been destroyed. The significance of the scrolls was more profound. They had belonged to an early sect that had existed in Palestine from perhaps 135 BC to shortly before the destruction of Jerusalem in AD 70.34 The scrolls contained early texts from parts of the Bible, including Isaiah. At that stage, scholars were divided on how the Bible had been put together, and many thought there had been a fight in the early centuries as to what should be included and what left out. In other words, in this scenario the Bible too had evolved. But the Qumran texts showed that the Old Testament at least was already, in the first century AD, more or less written as we know it. A second and even more incendiary significance of the Qumran texts was that, as research showed, they had belonged to a very ascetic sect known as the Essenes, who had a Teacher of Righteousness and called themselves either the Sons of Zadok or the Children of Light.35 Jesus wasn’t referred to in the Qumran texts, and there were some marked differences between their lifestyle and his. But the existence of this extremist sect, at the very time Jesus is supposed to have lived, threw a great deal of light on the emergence of Christianity. Many of the events referred to in the Qumran documents are either exactly as described in the Bible, or thinly disguised allegories. The prospect was held out, therefore, that Jesus was a similar figure, beginning his career as the leader of just such a Jewish sect.36

The very authority and plausibility of this overall historical context, as expanded by recent scholarship, was most threatening to Christianity. On 12 August 1950 Pope Pius XII issued Humani Generis, an encyclical designed specifically to counter ‘extreme non-Christian philosophies of evolutionism, existentialism, and historicism as contributing to the spread of error.’37 Not that the encyclical was entirely defensive: the document called on Catholic philosophers and theologians to study these other philosophies ‘for the purpose of combating them,’ conceding that ‘each of these philosophies contains a certain amount of truth.’38 The encyclical condemned all attempts to ‘empty the Genesis accounts in the Old Testament,’ took the view that evolution was not as yet a proven fact, and insisted that polygenism (the idea that man evolved more than once, in several places across the earth) could not be taught (i.e., accepted), ‘for it is not yet apparent how polygenism is to be reconciled with the traditional teaching of the Church on original sin.’39 The encyclical turned existential thinking on its head, blaming Heidegger, Sartre, and the others for the gloom and anxiety that many people felt.

More lively, more original – and certainly more readable – resistance to existentialism, evolutionism, and historicism came not from the Vatican but from independent theologians who, in some cases, were themselves at loggerheads with Rome. Paul Tillich, for example, was a pre-eminent religious existentialist. Born in August 1886 in a small village near Brandenburg, he studied theology in Berlin, Tübingen, and Halle and was ordained in 1912. He was a chaplain in the German army in World War I and afterward, in the mid-1920s, was professor of theology at Marburg, where he came under the influence of Heidegger. In 1929 he moved to Frankfurt, where he became professor of philosophy and came into contact with the Frankfurt School.40 His books, especially Systematic Theology (2 volumes, 1953 and 1957) and The Courage to Be (1952), had an enormous impact. A great believer in the aims of socialism, including many aspects of Marxism, Tillich was instantly dismissed when the Nazis came to power. Fortunately, Reinhold Niebuhr happened to be in Germany that summer and invited him to the Union Theological Seminary in New York.

Tillich mounted a complete rethink of Christian theology, starting from commonsense propositions – at its most basic, the fact that there is something, rather than nothing; that many people sense the existence of God; that there is sin (he thought Freud’s libido was the modern manifestation of the driving force of sin); and that atonement for our sins is a way of approaching God.41 Tillich thought that these feelings or thoughts were so natural that they needed no complicated explanation; in fact, he thought they were forms of reason just as much as scientific or analytic reason – he spoke of ‘ecstatic reason’ and ‘depth of reason’: ‘The depth of reason is the expression of something that is not reason, but which precedes reason and is manifest through it.’ He appears to be saying, in other words, that intuition is a form of reason, and evidence of the divine. Ecstatic reason was like revelation, ‘numinous astonishment,’ which conveyed the feeling of being ‘in the grip of a mystery, yet elated with awe.’42 The Bible and the church had existed for centuries; this needed no explanation either; it merely reflected the reality of God. Tillich followed Heidegger in believing that one had to create one’s life, to create something out of nothing, as God had done, using the unique phenomenon of Christ as a guide, showing the difference between the self that existed, and the self in essence, and in doing so remove man from ‘the anxiety of non-being,’ which he thought was the central predicament.

When he revisited Europe after World War II, Tillich summed up his impression of the theological scene in this way: ‘When you come to Europe today, it is not as it was before, with Karl Barth in the centre of discussion; it is now Rudolf Bultmann who is in the centre.’43 In the twenty years after the war, Bultmann’s ‘demythologising’ made a remarkable impact on theology, an impact comparable to that made by Barth after World War I. Barth’s view was that man’s nature does not change, that there is no moral progress, and that the central fact of life is sin, evil. He rebelled against the beliefs of modernity that man was improving. The calamity of World War I gave great credibility and popularity to Barth’s views, and in the grim years between the wars his approach became known as ‘Crisis Theology.’ Man was in perpetual crisis, according to Barth, on account of his sinful nature. The only way to salvation was to earn the love of God, part of which was a literal belief in the Holy Bible. This new orthodoxy proved very helpful for some people as an antidote to the pseudoreligions in Nazi Germany.

Bultmann took a crucially different attitude to the Bible. He was very aware that throughout the nineteenth century, and in the first decades of the twentieth, archaeologists and some theologians had sought evidence in the Holy Lands for the events recorded in the Old and New Testaments. (One high point in this campaign had been Albert Schweitzer’s Quest for the Historical Jesus, published in 1906.) Rather than express ‘caution’ about these matters, as Humani Generis had done, Bultmann argued that it was time to call a halt to this search. It had been futile from the start and could not hope to settle the matter one way or the other. He argued instead that the New Testament should be ‘demythologised,’ a term that became famous. Science had made much progress, he said, one effect of which was to suggest most strongly that the miracles of the Bible – the Resurrection, even the Crucifixion – may never have taken place as historical events. Bultmann knew that much of the information about Jesus in the Bible had been handed down from the Midrash, Jewish commentary and legend. He therefore concluded that the Bible could only be understood theologically. There may have been an historical Jesus, but the details of his life mattered less than that he was an example of kerygma, ‘the proclamation of the decisive act of God in Christ.’44 When people have faith, said Bultmann, they can enter a period of ‘grace,’ when they may receive ‘revelations’ from God. Bultmann also adapted several ideas from existentialism, but Heidegger’s variety, not Sartre’s (Bultmann was German). According to Heidegger, all understanding involves interpretation, and in order to be a Christian, one had to decide (an existential act) to follow that route (that’s what faith meant), using the Bible as a guide.45 Bultmann acknowledged that history posed a problem for this analysis: Why did the crucial events in Christianity take place where they did, so long ago? His answer was that history should be looked upon not in a scientific way, or even in the cyclical way of some Eastern religions, but existentially, with a meaning fashioned by each faithful individual for himself or herself. Bultmann was not advocating an ‘anything goes’ philosophy – a great deal of time and effort was spent with critics discussing what, in the New Testament, could and could not be demythologised.46 Faith, he was saying, cannot be achieved by studying the history of religion, or history per se, nor by scientific investigation. Religious experience was what counted, and kerygma could be achieved only by reading the Bible in the ‘demythologised’ way he suggested. His final contentious point was that Christianity was a special religion in the world. For him, Christianity, the existence of Christ as an act of God on earth, ‘has an inescapably definitive character.’ He thought that at the turn of the century, ‘when it seemed as if Western culture was on its way to becoming the first world-culture, it … seemed also that Christianity was on its way to attaining a definitive status for all men.’ But of course that didn’t happen, and by the 1950s it appeared ‘likely that for a long time yet different religions will need to live together on the earth.’47 This was close to saying that religions evolve, with Christianity being the most advanced.48

If Bultmann was the most original and uncompromising theologian in his response to existentialism and historicism, Teilhard de Chardin fulfilled an equivalent role in regard to evolution. Marie-Joseph-Pierre Teilhard de Chardin was born on 1 May 1881, the fourth of eleven children, seven of whom died. He went to a school run by Jesuits, where he proved himself very bright but besotted with rocks more than lessons. He entered the Jesuit novitiate at Aix in 1899 and took his first vows in 1901.49 But his obsession with rocks turned into a passion for geology, palaeontology – and evolution. In his one person Teilhard de Chardin combined the great battle between religion and science, between Genesis and Darwin. His religious duties took him to China in the 1920s, 1930s, and 1940s, where he excavated at Choukoutien. He met Davidson Black and Pei Wen-Chung, two of the discoverers of Peking Man and Peking Man culture. He became friendly with the Abbé Breuil, who introduced him to many of the caves and cave paintings of northern Spain, and with George Gaylord Simpson and Julian Huxley, two of the scholars who helped devise the evolutionary synthesis, and with Joseph Needham, whose seven-volume Science and Civilisation in China began publication in 1954. He knew and corresponded with Margaret Mead. This background was especially significant because Teilhard’s chosen field, the emergence of man, the birth of humanity, profoundly affected his theology. His gifts put him in the position of reconciling as no one else could the church and the sciences, especially the science of evolution.

For Teilhard, the ideas of Darwin showed that the world had moved out of the static cosmos that applied in the days of Plato and the other Greeks, into a dynamic universe that was evolving. In consequence, religions evolved too, and man’s very discovery of evolution showed that, in unearthing the roots of his own humanity, he was making spiritual progress. The supreme event in the universe was the incarnation of Christ, which Teilhard accepted as a fact. The event of Christ, he said, was self-evidently a nonevolutionary event – the only one in the history of the universe – and that in itself showed its importance; Christ’s true nature, as revealed in the Scriptures, therefore served the purpose of showing what man was evolving toward.50 Evolution, he believed, was a divine matter because it not only pointed backward but, allied with the event of Christ, showed us the path to come. Although Teilhard himself did not make a great deal out of it, and claimed indignantly that he was not a racist, he said clearly that ‘there are some races that act as the spearhead of evolution, and others that have reached a dead end.’51

All his life, Teilhard planned a major work of religious and scientific synthesis, to be called The Phenomenon of Man. This was completed in the early 1940s, but as a Jesuit and a priest, he had first to submit the book to the Vatican. The book was never actually refused publication, but he was asked several times to revise it, and it remained unpublished at his death in 1955.52 When it finally did appear, it became clear that for Teilhard evolution is the source of sin, ‘for there can be no evolution without groping, without the intervention of chance; consequently, checks and mistakes are always possible.’53 The very fact that the Incarnation of Christ took place was evidence, he said, that man had reached a certain stage in evolution, so that he could properly appreciate what the event meant. Teilhard believed that there would be further evolution, religious as well as biological, that there would be a higher form of consciousness, a sort of group consciousness, and in this he acknowledged an affinity for Jung’s views about the racial unconscious (at the same time deriding Freud’s theories). Teilhard was turned down for a professorship at the Collège de France (the Abbé Breuil’s old chair), but he was elected to the Institut de France.

But the church was not only concerned with theology; it was a pastoral organisation as well. It was rethinking the church’s pastoral work that most concerned the other influential postwar religious thinker, Reinhold Niebuhr. Significantly, since pastoral work is essentially more practical, more pragmatic, than theological matters, Niebuhr was American. He came from the Midwest of America and did his early pastoral work in the capital of the motor trade, Detroit. In The Godly and the Ungodly (1958), he set out to rescue postwar America from what he saw as a fruitless pietism, redefining Christianity in the process, and to reaffirm the areas of life that science could never touch.54 The chapters in his book reveal Niebuhr’s anxieties: ‘Pious and Secular America,’ ‘Frustration in Mid-Century,’ ‘Higher Education in America,’ ‘Liberty and Equality,’ plus chapters on the Negro and on anti-Semitism. Niebuhr thought that America was still, in some ways, a naive country, sentimental even. He acknowledged that naïveté had certain strengths, but on the downside he also felt that America’s many sectarian churches still had a frontier mentality, a form of pietism that took them away from the world rather than toward it. He saw it as his job to lead by example, to mix religion with the social and political life of America. This was how Christians showed love, he said, how they could find meaning in the world. He thought higher education was partly to blame, that the courses offered in American universities were too standardised, too inward-looking, to breed truly sophisticated students and were a cause of the intolerance that he explored in his chapters on blacks and Jews. He made it plain that pious Americans labelled everything they didn’t like ‘Godless,’ and this did no one any good.55

He identified ‘three mysteries,’ which, he said, remained, and would always remain. These were the mysteries of creation, of freedom, and of sin. Science might push back the moment of creation further and further, he said, but there would always be a mystery beyond any point science could reach. Freedom and sin were linked. ‘The mystery of the evil in man does not easily yield to rational explanations because the evil is the corruption of a good, namely, man’s freedom.’56 He did not hold out the hope of revelation in regard to any of these mysteries. He thought that America’s obsession with business was actually a curtailment of freedom, and that true freedom, the true triumph over evil, came from social and political engagement with one’s fellow men, in a religious spirit. Niebuhr’s analysis was an early sign of the greater engagement with sociopolitical matters that would overtake the church in the following decades, though Niebuhr, as his calm prose demonstrated, was no radical.57

Catholics were – in theory at least – moved by the same spirit. On 11 October 1962, 2,381 cardinals, bishops and abbots gathered in Rome for a huge conference designed to reinvigorate the Catholic Church, involve it in the great social issues of the day, and stimulate a religious revival. The conference, the Second Vatican Ecumenical Council, had been called back in 1959 by the then-new pope, Angelo Giuseppe Roncalli, who had taken the name John XXIII. Elected only on the eleventh ballot, when he was within a month of his seventy-seventh birthday, Roncalli was seen as a stopgap pope. But this short, dumpy man surprised everyone. His natural, down-to-earth manner was perfectly attuned to the mood of the times, and as the first pope of the television age, he quickly achieved a world-wide popularity no pope had had before.

Great things were expected from Vatican II, as it was called, though in more traditional quarters there was surprise that the council had been called in the first place: Vatican I had been held ninety-two years before, when its most important decision was that the pope was infallible on theological matters – for such purists there was no need of another council. Questionnaires were sent out to all the bishops and abbots of the church, inviting them to Rome and soliciting their early views on a number of matters that it was proposed to discuss. By the time the council began, one thousand aides had been added, at least a hundred official observers from other religions, and several hundred press. It was by far the largest gathering of its kind in the twentieth century.58

As part of the preparations, the pope’s staff in Rome drew up an agenda of sixty-nine items, later boiled down to nineteen, and then thirteen. For each of these a schema was drafted, a discussion document setting out the ideas of the pope and his immediate aides. Before the council began, on 15 May 1961, the pope issued an encyclical, Mater et Magistra, outlining how the church could become more involved in the social problems facing mankind. As more than one observer noted, neither the encyclical nor the council came too soon; as the French Dominican Yves Congar wrote, in 1961 ‘one man out of every four is Chinese, two men out of every three are starving, one man out of every three lives under Communism, and one Christian out of every two is not Catholic.’59 In practice, the council was far from being an unqualified success. The first session, which began on 11 October 1962, lasted until 8 December the same year, the bishops spending two to three hours in discussion every morning. The pope issued a second encyclical, Pacem in Terris, the following April, which specifically addressed issues of peace in the Cold War. Sadly, Pope John died on 3 June that year, but his successor, Giovanni Battista Montini, Paul VI, kept to the same schedule, and three more sessions of the council took place in the autumn of 1963, 1964, and 1965.

During that time, for close observers (and the world was watching), the Catholic Church attempted to modernise itself. But although Catholicism emerged stronger in many ways, Rome revealed itself as virtually incapable of change. Depending on the observer, the church had dragged itself out of the Middle Ages and moved ahead either to the seventeenth, eighteenth, or nineteenth century. But no one thought it had modernised itself. One problem was the style of debate.60 On most issues there was a ‘progressive’ wing and a ‘reactionary’ wing. This was only to be expected, but too often open discussion, and dissension, was cut short by papal fiat, with matters referred to a small papal commission that would meet later, behind closed doors. Teaching was kept firmly in the hands of the bishops, with the laity specifically excluded, and in discussions of ecumenism with Protestants and the Eastern Orthodox forms of Christianity, it was made clear that Catholicism was primary. The liturgy was allowed to shift from Latin to the vernacular, and some historical mistakes were admitted, but against that the church’s implacable opposition to birth control was, in the words of Paul Blanshard, who attended all four sessions of the council as an observer, ‘the greatest single defeat for intelligence.’61 On such matters as biblical scholarship, the status of Mary, and women within the church, Catholicism showed itself as unwilling to change and as driven by Rome. Perhaps expectations had been raised too high by calling a council in the first place: in itself that seemed to promise greater democracy. America was now a much greater power in the world, and in the church, and Rome’s way of conducting itself did not sit well with attitudes on the other side of the Atlantic.62 Quite what effect Vatican II had on the numbers of Catholics around the world is unclear; but in the years that followed the rates for divorce continued to rise, even in Catholic countries, and women took their own decisions, in private, so far as birth control was concerned. In that sense, Vatican II was a missed opportunity.

For many people, the most beautiful image of the twentieth century was not produced by Picasso, or Jackson Pollock, or the architects of the Bauhaus, or the cameramen of Hollywood. It was a photograph, a simple piece of reportage, but one that was nevertheless wholly original. It was a photograph of Earth itself, taken from space. This picture, showing the planet to be slightly blue, owing to the amount of water in the atmosphere, was affecting because it showed the world as others might see us – as one place, relatively small and, above all, finite. It was that latter fact that so many found moving. Our arrival on the Moon marked the point when we realised that the world’s population could not go on expanding for ever, that Earth’s resources are limited. It was no accident that the ecology movement developed in parallel with the space race, or that it culminated at the time when space travel became a fact.

The ecological movement began in the middle of the nineteenth century. The original word, oekologie, was coined by the German Ernst Haeckel, and was deliberately related to oekonomie, using as a root the Greek oikos, ‘household unit.’ There has always been a close link between ecology and economy, and much of the enthusiasm for ecology was shown by German economic thinkers in the early part of the century (it formed a plank of National Socialist thinking).63 But whether that thinking was in Germany, Britain, or the United States (the three countries where it received most attention), before the 1960s it was more a branch of thought that set the countryside – nature, peasant life – against urbanity. This was reflected in the writings of not only Haeckel but the British planners (Ebenezer Howard’s garden cities and the Fabians), the Woodcraft Folk, and such writers as D. H. Lawrence, Henry Williamson, and J. R. R. Tolkien.64 In Germany Heinrich Himmler experimented, grotesquely, with organic farms, but it was not until the 1960s that the modern worries came together, and when they did, they had three roots. One was the population boom stimulated by World War II and only now becoming visible; a second was the wasteful and inhuman planning processes created in many instances by the welfare state, which involved the wholesale destruction of towns and cities; and third, the space race, after which it became common to refer to the planet as ‘spaceship Earth.’

When President Johnson made his Great Society speech in Michigan in the spring of 1964, he referred to the impoverished environment as one of his reasons for acting. Partly, he had in mind the destruction of the cities, and the ‘Great Blight of Dullness’ that Jane Jacobs had railed against. But he was also provoked by the writings of another woman who did much to stir the world’s conscience with a passionate exposé of the pesticide industry and the damage commercial greed was doing to the countryside – plants, animals, and humans. The exposé was called Silent Spring, and its author was Rachel Carson.65

Rachel Carson was not unknown to the American public in 1962, when her book appeared. A biologist by training, she had worked for many years for the U.S. Fish and Wildlife Service, which had been created in 1940. As early as 1951 she had published The Sea Around Us, which was serialised in the New Yorker, chosen as a Book-of-the-Month Club alternate selection, and topped the New York Times best-seller list for months. But that book was not so much a polemic as a straightforward account of the oceans, showing how one form of life was dependent on others, to produce a balance in nature that was all-important to both its continued existence and its beauty.66

Silent Spring was very different. As Linda Lear, her biographer, reminds us, it was an angry book, though the anger was kept in check. As the 1950s passed, Carson, as a scientist, had gradually amassed evidence – from journals and from colleagues – about the damage pesticides were doing to the environment. The 1950s were years of economic expansion, when many of the scientific advances of wartime were put to peaceful use. It was also a period when the Cold War was growing in intensity, and would culminate at the very time Silent Spring appeared. There was a tragic personal dimension behind the book. At about the time The Sea Around Us appeared, Carson had been operated on for breast cancer. While she was researching and writing Silent Spring, she was suffering from a duodenal ulcer and rheumatoid arthritis (she was fifty-three in 1960), and her cancer had reappeared, requiring another operation and radiotherapy. Large chunks of the book were written in bed.67

By the late 1950s, it was clear to those who wished to hear the evidence that with the passage of time, many pollutants that formed part of daily life had toxic side effects. The most worrying, because it directly affects humans, was tobacco. Tobacco had been smoked in the West for three hundred years, but the link between cigarette smoking and lung cancer was not fully aired until 1950, when two reports, one in the British Medical Journal and the other in the Journal of the American Medical Association, both showed that ‘smoking is a factor, and an important factor, in the production of carcinoma of the lung.’68 This result was surprising: the British doctors doing the experiment thought that other environmental factors – automobile exhaust and/or the tarring of roads – were responsible for the rise in lung cancer cases that had been seen in the twentieth century. But no sooner had the British and American results appeared than they were confirmed in the same year in Germany and Holland.

From the evidence that Carson was collecting, it was becoming clear to her that some pesticides were far more toxic than tobacco. The most notorious was DDT, introduced to great effect in 1945 but now, after more than a decade, implicated not just in the deaths of birds, insects, and plants but also in cancerous deaths in humans. An especially vivid example explored by Carson was Clear Lake in California.69 Here DDD, a variant of DDT, had been introduced in 1949 to rid the lake of a certain species of gnat that plagued fishermen and holidaymakers. It was administered carefully, as was thought: the concentration was 1 part in 70 million. Five years later, however, the gnat was back, and the concentration was increased to 1 in 50 million. Birds began to die. The association wasn’t understood at first, however, and in 1957 more DDD was used on the lake. When more birds and then fish began to die, an investigation was begun – which showed that certain species of grebe had concentrations of 1,600 parts per million, and the fish as much as 2,500 parts per million. Only then was it realised that some animals accumulate concentrations of chemicals, to lethal limits, as the rough arithmetic below shows.70 But it wasn’t just the unanticipated build-up of chemicals that so alarmed Carson; each case was different, and often human agency was involved. In the case of aminotriazole, a herbicide, it had been officially sanctioned for use on cranberry bogs, but only after the berries had been harvested. This particular sequence mattered because laboratory studies had shown that aminotriazole caused cancer of the thyroid in rats. When it emerged that some growers sprayed the cranberries before they were harvested, the herbicide could not be blamed directly.71 This is why, when Silent Spring appeared in 1962, again serialised in the New Yorker, the book created such a furore. For Carson not only explored the science of pesticides, showing that they were far more toxic than most people realised, but revealed that industry guidelines, sometimes woefully inadequate in the first place, were often flouted indiscriminately. She revealed when and where specific individuals had died, and named companies whose pesticides were responsible, in some cases accusing them of greed, of putting profit before adequate care for wildlife and humans.72 Like The Sea Around Us, Silent Spring shot to the top of the best-seller lists, helped by the thalidomide scandal, which erupted at that time, when it was shown that certain chemicals taken (for sedation or sleeplessness) by mothers in the early stages of pregnancy could result in deformed offspring.73 Carson had the satisfaction of seeing President Kennedy call a special meeting of his scientific advisory committee to discuss the implications of her book before she died, in April 1964.74 But her true legacy came five years later. In 1969, the U.S. Congress passed the National Environmental Policy Act, which required an environmental impact statement for each governmental decision. In the same year the use of DDT as a pesticide was effectively banned, and in 1970 the Environmental Protection Agency was established in the United States, and the Clean Air Amendment Act was passed. In 1972 the United States passed the Water Pollution Control Act, the Coastal Zone Management Act, and the Noise Control Act, with the Endangered Species Act approved in 1973.
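The scale of the accumulation at Clear Lake is easy to miss in the prose. A rough calculation using only the figures quoted above (and ignoring the later, stronger applications) shows what concentration up the food chain meant in practice:

\[
\frac{1}{70{,}000{,}000} \approx 0.014\ \text{parts per million in the water}, \qquad \frac{1{,}600\ \text{ppm in the grebes}}{0.014\ \text{ppm}} \approx 100{,}000\text{-fold concentration}.
\]

In other words, the birds at the top of the food chain were carrying the chemical at roughly a hundred thousand times the level at which it had been ‘carefully’ applied to the lake.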

By then, thirty-nine nations had met in Rome in 1969 to discuss pollution. Their report, The Limits to Growth, published in 1972, concluded that ‘the hour was late,’ that at some stage in the next one hundred years the limit to growth would be reached, that the earth’s finite resources would be exhausted, and that there would be a ‘catastrophic’ decline in population and industrial capacity.75 Attempts to meet this looming problem, it argued, should begin immediately. In the same year Barbara Ward and René Dubos presented a report to the United Nations Conference on the Human Environment in Stockholm which, as its title, Only One Earth, showed, carried much the same message.76 Nineteen-seventy saw the founding of the ‘Bauernkongress’ in Germany, and in 1973 ecology candidates first stood for election in France and Britain. These events coincided with the Yom Kippur War of 1973, as a result of which the OPEC cartel of oil-producing nations raised oil prices sharply, causing an oil crisis that forced gasoline rationing in several countries, the first time such measures had been necessary since World War II. It was this, as much as anything, that highlighted not just the finite nature of the earth’s resources but also the fact that such limits to growth had political consequences.

Charles Reich, an academic who taught at both Yale and Berkeley, claimed that the environmental revolution was more than just that; it was a true turning point in history, a pivot when human nature changed. In The Greening of America (1970), he argued that there existed, in America at any rate, three types of consciousness: ‘Consciousness I is the traditional outlook of the American farmer, small businessman and worker who is trying to get ahead. Consciousness II represents the values of an organisational society. Consciousness III is the new generation…. One was formed in the nineteenth century, the second in the first half of this century, the third is just emerging.’77

Beyond this division, Reich’s idea was a very clever piece of synthesis: he related many works of popular culture to his arguments, explaining why particular songs or films or books had the power and popularity they did. He paid relatively little attention to Consciousness I but had great fun debunking Consciousness II, where his argument essentially followed on from Herbert Marcuse’s One-Dimensional Man and W. H. Whyte’s The Organisation Man. Since the mid-1950s, Reich said, that world had deteriorated; in addition to vast organisations, we now had the ‘corporate state,’ with widespread, anonymous, and in many cases seemingly arbitrary power. He argued that the works of Raymond Chandler, such as The Big Sleep or Farewell, My Lovely, owed their appeal to their picture of a world in which no one could be trusted, where one could only survive by living on one’s wits. James Jones’s From Here to Eternity pitted a young man against a vast, anonymous organisation (in this case the army), as did Philip Roth’s Portnoy’s Complaint. The appeal of Casablanca, he said, lay in the fact that ‘Humphrey Bogart plays a man who could still change fate by taking action. Perhaps Casablanca was the last moment when most Americans believed that.’78

Reich showed how a large number of popular works took aim at one or other aspect of Consciousness II society, and tried to move beyond it. In Stanley Kubrick’s 2001: A Space Odyssey, a space traveller is in what appears to be a hotel or motel room, expensive and plastic but entirely lacking anything he can do anything with, ‘no work, nothing that asks a reaction.’79 ‘Almost every portrayal of a man at work [in American films] shows him doing something that is clearly outside of modern industrial society [i.e., the corporate state]. He may be a cowboy, a pioneer settler, a private detective, a gangster, an adventure figure like James Bond, or a star reporter. But no films attempt to confer satisfaction and significance upon the ordinary man’s labour. By contrast, the novels of George Eliot, Hardy, Dickens, Howells, Garland and Melville deal with ordinary working lives, given larger meaning through art. Our artists, our advertisers and our leaders have not taught us how to work in our world.’80 He took Consciousness III to begin with J. D. Salinger’s The Catcher in the Rye (1951) but to gather force with the music and words of Bob Dylan, Cream, the Rolling Stones, and Crosby, Stills and Nash. Dylan’s ‘It’s Alright, Ma (I’m Only Bleeding),’ Reich said, was a far more powerful, and considerably earlier, social critique of police brutality than any number of sociological treatises. ‘Eleanor Rigby’ and ‘Strawberry Fields Forever’ said more about alienation, and said it more succinctly, than any psychologist’s offering. The same argument, he said, applied to such works as ‘Draft Morning’ by the Byrds, Tommy by the Who, or ‘I Feel Free’ by Cream. He thought that the drug culture, the mystical sounds of Procol Harum, and even bell-bottom trousers came together in a new idea of community (the bell-bottoms, he said, somewhat fancifully, left the ankles free, an invitation to dance). The works of authors like Ken Kesey, who wrote One Flew Over the Cuckoo’s Nest (1962) about a revolt in a mental hospital, embodied the new consciousness, Reich said, and even Tom Wolfe, who in The Kandy-Kolored Tangerine-Flake Streamline Baby (1965) was critical of many aspects of the new consciousness, at least conceded that subcultures like stock-car racing and surfing showed people choosing their own alternative lifestyles, rather than simply accepting what was given them, as their parents had.

This all came together, Reich said, in the ‘green’ movement. Opposition to the Vietnam War was an added factor, but even there the underlying force behind the war was corporate America and technology; napalm destroyed the environment and the enemy almost equally. And so, along with a fear for the environment, a realisation that resources were finite, and a rejection of the corporate state, went an avoidance, where possible, of the technology that represented Consciousness II. People, Reich said, were beginning to choose to bake their own bread, or to buy only bread that was baked in environmentally friendly ways, using organically grown ingredients. He was in fact describing what came to be called the counterculture, which is explored in more detail in the next chapter. He wasn’t naïve; he did not think Consciousness II, corporate America, would just roll over and surrender, but he did believe there would be a growth of environment-conscious communes, green political parties, and a return to ‘vocations,’ as opposed to careers, with people devoting their lives to preserving areas of the world from the depredations of Consciousness II corporations.

A related argument came from the economist Fritz Schumacher in two books, Small Is Beautiful (1973) and A Guide for the Perplexed (1977), the latter published in the year he died.81 Born in Bonn in 1911, into a family of diplomats and academics, Schumacher was given a very cosmopolitan education by his parents, who sent him to the LSE in London and to Oxford. A close friend of Adam von Trott, executed for his part in the attempt on Hitler’s life in July 1944, Schumacher was working in London in the late 1930s and spent the war in Britain, overcoming his enemy alien status. After the war, he became very friendly with Nicholas Kaldor and Thomas Balogh, economic advisers to Prime Minister Harold Wilson in the 1960s, and was appointed to a senior position on the National Coal Board (NCB). Very much his own man, Schumacher saw early on that the resources of the earth were finite and that something needed to be done. For many years, however, he was not taken seriously because, in being his own man, he took positions that others regarded as outlandish or even as evidence of instability. He was a convinced believer in unidentified flying objects, flirted with Buddhism, and, though he had rejected religion as a younger man, was received into the Catholic Church in 1971, at the age of sixty.82

Schumacher had spent his life travelling the world, especially to the poorer parts, such as Peru, Burma, and India. Gradually, as his religious feelings grew, as the environmental crisis around him deepened, and as he realised that the vast corporations of the West could not hope to offer solutions that would counter the poverty of so many third-world countries, he developed an alternative view. For him, 1971 was a turning point. He had recently become president of the Soil Association in Britain (he was an avid gardener), he had been received into the church, and he had resigned from the NCB. He set about writing the book he had always wanted to write, provisionally called ‘The Homecomers,’ because his argument was that the world was reaching a crisis point. The central reality, as he saw it, was that the affluence of the West was ‘an abnormality which “the signs of the times” showed was coming to an end.’ The inflation that had started to plague Western societies was one such sign. The party was over, said Schumacher, but ‘Whose party was it anyhow? That of a small minority of countries and, inside those countries, that of a minority of people.’83 This minority kept itself in power, as was to be expected, but the corporations did little to help the chronic poverty seen in the rest of the world. These countries could not go from their underdeveloped state to a sophisticated state overnight. What was needed, he said, was a series of small steps, manageable by the people on the ground – and here he introduced his concept of intermediate technology. There had been an Intermediate Technology Development Group in Britain since the mid-1960s, trying to develop technologies that were more efficient than traditional ones in India, say, or South America, but far less complex than their counterparts in the developed West. (A classic example is the clockwork radio, which works by being wound up rather than running on batteries, which may be unobtainable in remote areas or may weather badly.) By ‘The Homecomers’ he meant that people would in future return home from the factories, going back to simpler technologies simply because they were more human and humane. The publishers didn’t like the title, and Anthony Blond came up with Small Is Beautiful, keeping Schumacher’s subtitle: ‘Economics – as if People Mattered.’ The book was published to a scattering of reviews, but it soon took off as word of mouth spread, and it became a cult book from Germany to Japan.84 Schumacher had hit a nerve; his main focus was the third world, but it was clear that many people loathed the big corporations as much as he did and longed for a different way of life. Until his death in 1977, Schumacher was a world figure, feted by state governors in America, entertained at the White House by President Carter, and welcomed in India as a ‘practical Gandhi.’ His underlying argument was that there is room on earth for everyone, provided that world affairs are managed properly. That management, however, was not an economic question but a moral one, which is why for him economics and religion went together, and why they were the most important disciplines.85 Schumacher’s arguments illustrated Reich’s Consciousness III at its most practical level.

Anxieties about the human influence on our planet accelerated throughout the 1970s, aided by a scare in Italy in 1976, when a cloud of chemicals containing dioxin escaped from a chemical plant near Seveso, killing domestic and farm animals in the surrounding region. In 1978 the United States banned CFCs as spray propellants in order to reduce damage to the ozone layer, which normally filters out ultraviolet radiation from the sun; CFCs were also believed to contribute to global warming through the ‘greenhouse effect.’ In 1980 the World Climate Research Programme was launched, an international effort specifically intended to explore human influence on climate and to predict what changes could be expected.

No one has been to the Moon for more than a quarter of a century. We have lost the universal sense of optimism in science that the Apollo program represented.