1. The Human Planet

A: Mountain Pass Rare Earth Mine, California, USA. In the 1960s it provided europium for colour television sets. Today it supplies neodymium, key to many so-called ‘green’ techs, such as wind turbines and electric cars.

We do not just live on this planet; we have shaped it. Each year, the work of humans moves more soil, rock and sediment than is transported by all other natural processes combined. If you took all humans and their domesticated animals (pets and livestock), their combined mass would be greater than that of all wild terrestrial vertebrates taken together. Humans have left their mark in the chemistry of the air, the rocks and the oceans. For the first time in Earth’s 4.5-billion-year history, a single species is dictating the planet’s future.

The atmosphere is the layer of gases that surrounds the Earth and is kept in place by the planet’s gravity.

Welcome to the Anthropocene.

B: Aerial view of herdsmen and their cows waiting for buyers at Kara cattle market in Lagos, Nigeria. One of the largest markets in West Africa, it sees thousands of cows pass through each week.

What is the start date for this epoch? This is a hotly contested question within earth sciences, one that allows us to dig back through the ways in which we built our human planet. The most recent option is 1945, or, more specifically, 5:29 a.m., 16 July 1945, when the first detonation of a nuclear weapon – code name Trinity – was conducted by the US Army in the desert of New Mexico.

This is the timestamp favoured by the Anthropocene Working Group. It is far more than political symbolism. For earth scientists to recognize this as a geologically significant moment, they need to be able to see a record of it in the planet itself. It cannot be merely written in the history books; it has to be transcribed into the Earth.

The Anthropocene Working Group is tasked by the International Commission on Stratigraphy to explore whether and how scientists might officially add the Anthropocene to the geologic time scale. Its membership includes an international mix of earth scientists and archaeologists, as well as experts from history, law and journalism, but the group has been criticized for being too male and rich-world dominated (parodied online as the ‘manthropocene’).

This is one of the reasons why the Trinity test is so attractive: we can trace markers of its blast of radiation.

A: Cross sections of a silver fir trunk (left) and a black locust tree trunk (right). The study of tree rings can give scientists vital clues about the historical environments that these trees grew in.

To explore markers like these and help us put them in context, a specialism of earth sciences, paleoclimatology, uses information left in the Earth itself to build a picture of past environments. Rings in trees, for example, reflect the environment within which each layer grew, acting as natural time capsules. Corals have similar rings. Scientists can also look at the chemical make-up of shells left in sediments on the beds of lakes and oceans.

Paleoclimatology is the study of past climates using imprints found in organisms, ice cores, tree rings and sediment cores.

Polar researchers take cores of ice, drawn from deep inside ancient glaciers. Each slice of these cores is made up of snow that fell in a different year, and exploring their chemistry can offer crucial clues to what the Earth was like back then. Bubbles of air trapped inside the ice provide samples of past atmospheres, and from these scientists can measure the concentration of gases, including carbon dioxide. Beyond EPICA, a huge $12-million European project, is currently drilling in East Antarctica in search of ice that is more than 1.5 million years old.

From caves in northern Italy to lakes in China, tree rings in the Amazon, ice buried in the Arctic, or Bermudan corals, scientists can trace fingerprints of cold war nuclear testing, which peaked around the time of the Nuclear Test Ban Treaty in 1963. One of these markers, the radioactive isotope of carbon, Carbon-14, has a half-life of about 5,730 years. This means it will be around for scientists to find for tens of thousands of years.
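To get a feel for that persistence, here is a minimal back-of-envelope sketch in Python. The half-life is the accepted figure for Carbon-14; the sample times are purely illustrative:

```python
# Exponential decay: the fraction of Carbon-14 remaining after t years
# is 0.5 ** (t / half_life). The half-life is the accepted ~5,730 years;
# the time points below are illustrative only.

HALF_LIFE_C14 = 5_730  # years

def fraction_remaining(years: float, half_life: float = HALF_LIFE_C14) -> float:
    """Fraction of an initial quantity of a radioisotope left after `years`."""
    return 0.5 ** (years / half_life)

for t in (5_730, 10_000, 25_000, 50_000):
    print(f"after {t:>6,} years: {fraction_remaining(t):6.2%} remains")

# after  5,730 years: 50.00% remains
# after 10,000 years: 29.83% remains
# after 25,000 years:  4.86% remains
# after 50,000 years:  0.24% remains (still measurable with modern instruments)
```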

The 1945 date is attractive because of the swift rise in the human population after World War II. Planetary scientists are increasingly describing the past 70 or so years as the Great Acceleration, a period marked by a major expansion in human population. The world’s human population has been growing continuously since the famine and plague of mid-14th-century Europe. It reached 1 billion for the first time around 1804. It was another 123 years before it grew to 2 billion in 1927, but it took only 33 years to reach 3 billion in 1960. Thereafter, the global population increased to 4 billion in 1974, 5 billion in 1987, 6 billion in 1999 and 7 billion somewhere between October 2011 and March 2012 (depending on which data you use). Much of this can be seen as positive: we have more people because we are not dying of hunger and disease. But more people can cause more environmental damage.
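A quick sketch of the arithmetic behind those milestones makes the acceleration plain. This minimal Python illustration uses only the dates cited above, taking 2011 for the 7-billion mark:

```python
# Years elapsed between each additional billion people, using the
# milestone dates cited in the text (2011 taken for the 7-billion mark).

milestones = [(1, 1804), (2, 1927), (3, 1960), (4, 1974),
              (5, 1987), (6, 1999), (7, 2011)]

for (prev_bn, prev_yr), (next_bn, next_yr) in zip(milestones, milestones[1:]):
    print(f"{prev_bn} -> {next_bn} billion: {next_yr - prev_yr} years")

# 1 -> 2 billion: 123 years
# 2 -> 3 billion: 33 years
# 3 -> 4 billion: 14 years
# 4 -> 5 billion: 13 years
# 5 -> 6 billion: 12 years
# 6 -> 7 billion: 12 years
```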

B: Chinese scientist He Jianfeng collecting an ice-core sample at the North Pole. Examining ice cores is key to our modern scientific understanding of how the climate has changed over time.

C: Geoff Hargreaves, curator at the US National Ice Core Laboratory, surrounded by ice cores. Based in Denver, Colorado, the laboratory stores ice cores from multiple expeditions as a repository for current and future investigations.

Over the past few hundred years, we have increasingly arranged ourselves around technologies that have allowed us to live longer and often happier lives, but we have left a larger ecological footprint as a result. Plastic is a good example, perhaps the exemplar invention of the Anthropocene: a human attempt at making our own material rather than exploiting nature; first as a replacement for ivory in billiard balls and then instead of shellac, a resin secreted by the female lac bug and used as an insulating material in the electricity industry. Plastic is extremely useful and also immensely polluting. It was first developed in the late 19th century, given a boost by military applications in the 1940s and diffused via domestic consumer markets from the 1950s. Plastic has since piled up to become a major problem.

A: Women working at the Columbus Plastic Company, Ohio, 1948. The plastic industry was given a boost by military research during World War II and boomed afterwards as domestic markets opened up.

B: Springwell Colliery Engine No. 2, built by Robert Stephenson in 1826. It was used to move wagons of coal, in this case for a mine in the north east of England.

C: James Watt’s workshop. This is one of a series of photographs taken when the workshop was being moved to the Science Museum, London, UK, in December 1924, where you can still see it on display.

Plastic is only part of the puzzle though, and many scientists want to look much further back than the 20th century for the start of the Anthropocene. After all, we have been industrializing in various forms for thousands of years, polluting the planet along the way. The ghosts of Roman copper smelting can be found in Greenland ice cores, for example. During the Industrial Revolution of the late 18th century, things really heated up, literally, as the start of our addiction to fossil fuels fed more carbon dioxide into the atmosphere, fuelling global warming. And if we want a marker left in the Earth’s record to compete with the Trinity test as a start date for the Anthropocene, the 1830s must be a contender because the Arctic has been warming since that time.

The Industrial Revolution occurred roughly between 1760 and 1840 in Europe and the USA, characterized by a shift from hand-production methods to machines, including the rise of mechanized factories and the application of steam and water power.

Fossil fuels such as coal and oil have been used by humans in different parts of the world, and in various ways, for thousands of years. However, it was not until the 18th century that we burnt them on industrial levels. The first key invention was the steam engine by Thomas Newcomen (1664–1729) in 1712, which provided a way to pump water out of mines speedily. Several decades later, James Watt (1736–1819) patented steam engines suitable for driving factory machinery, and the technology spread, increasing the market for coal in the process. By the end of the 18th century, people were illuminating buildings using gas made from burning coal. By the mid-19th century, commercial coal gas works were supplying light and heat to homes and businesses in cities around the world.

Fossil fuels is an umbrella term referring to the remnants of decayed plants and animals which, after hundreds of millions of years of heat and pressure in the Earth’s crust, have become combustible fuels, such as oil, coal and natural gas.

A: Engravings taken from The Natural History of the Ordinary Cetacea or Whales by William Jardine (1837). The first shows fishermen harpooning a Greenland whale that has tossed one of the attacking boats and the second shows fishermen harpooning a sperm whale. By 1880, the sperm whale population had declined by nearly a third due to whaling.

Richard Trevithick (1771–1833) developed high-pressure engines that were small enough to be used on trains around the beginning of the 19th century, and by the start of the 20th, these had been joined by engines that ran on petrol/gasoline or diesel produced from oil.

The age of cars and aeroplanes had begun.

The world’s first coal-fired electricity station – the Edison Electric Light Station – opened in London, UK, in January 1882. By September of the same year, it had a cousin in New York, USA. As we found new and exciting ways to use electricity, whole new reasons to burn extensive quantities of fossil fuels opened up. The seeds for a renewable energy future were also planted, with the first homes lit by hydropower and wind turbines in the 1880s.

Renewable energy comes from sources that will be replenished within human timescales. Examples include energy from sunlight (either to produce electricity or heat), wind, rain, tides, waves and geothermal heat. It can also refer to biofuels, energy produced from plants.

Throughout this period, we were also hunting and killing whales for fuel, driving some species to the brink of extinction. Whale oil was used to light houses and streets, and as the Industrial Revolution shifted working culture, it came to be utilized to light factories so employees could work longer hours. In the middle of the 19th century, when Herman Melville (1819–91) was penning Moby Dick (1851), whaling was a booming global industry, the fifth largest sector in the US economy. Of the more than 700 whaling ships on the world’s oceans in the 1840s, more than half were based out of the Massachusetts port of New Bedford, USA, known as ‘the city that lit the world’. Whalers would haul the giant beasts onto their ships, cut off their heads and bail out thousands of litres of sperm oil. It has been estimated that, by 1880, the sperm whale population had declined by nearly a third due to whaling. But fossil fuels gradually edged out whale oil, and by the mid-20th century, following scientific advice on whale populations, many countries had banned whaling altogether. The rise and fall of the whaling industry makes for an interesting case study of humans changing their behaviour before it was entirely too late. Once upon a time, we mined whales for oil; now we do not. Perhaps one day soon we will look back on the days we mined coal and find that practice equally unacceptable.

Sperm oil is not technically an oil, more a liquid wax. It burns brightly and does not smell as much as other forms of whale oil, which are produced by boiling the blubber.

The Industrial Revolution was not only about energy.

B: Greenwich generating station, London, UK, 1906. Built in 1902 by London County Council to provide electricity for trams, it was originally coal-fired. It is still used as a backup for London’s underground rail system.

There was a host of other inventions, which not only opened up new markets and ways to live, but also found alternative ways to eat up and pollute the natural world in the process. In 1798 in France, Louis-Nicolas Robert (1761–1828) patented a machine for making a continuous sheet of paper on a loop of wire fabric, later to become the scourge of forests the world over. The greater availability of paper and cloth also paved the way for disposable products. The uptake of disposability was driven in part by the emerging advertising industry and consumer culture, but also by concerns over hygiene. Today, our love of disposable products remains extremely damaging, especially since it has been combined with the invention of plastics, creating new risks of ocean pollution even as it avoids more immediate health ones.

In 1824, a Leeds bricklayer, Joseph Aspdin (1778–1855), patented a chemical process for combining clay and limestone at high temperatures to produce Portland cement. Today, the cement industry has grown to be one of the world’s primary carbon dioxide polluters, producing about 8% of global emissions.

A: Advertisements from the 1890s, showing the rise of consumer culture. The products on display in these adverts also show how disposable packaging provided a handy canvas for the emerging advertising industry.

So far, we have looked at two contenders for the start of the Anthropocene.

First, the growth of the post-war era, marked by proxy through the radioactive fallout of the 1945 Trinity test. And second, the industrial developments of the 18th and 19th centuries, visible in signs of warming waters from the 1830s. Some scientists would like to look back a lot further, and consider the way the human practice of farming made its mark on our planet. As we will discuss more in Chapter 2, clearing land for farming contributes to both global warming and biodiversity loss. Cut down a forest, and it no longer absorbs carbon. Whole species can lose their homes, too.

Biodiversity loss can refer to the extinction of species (plant or animal) worldwide, and also to the local reduction or loss of species in a particular place.

B: More advertisements from the same era from the USA, France and Russia. By the start of the 20th century, advertising had become an international industry.

Farming emerged during the Neolithic Revolution around 12,000 years ago, with another agricultural revolution taking place in the Islamic Golden Age, some 1,200 to 800 years ago. The Romans had done a good job of developing agricultural techniques and spreading various crops around their empire, but the Arabs studied the topic scientifically, intensifying the results. Major works on agronomy were published, disseminating useful techniques on how to grow, for example, olive trees, wheat and barley, and encouraging people to find and experiment with new kinds of crops. Irrigation techniques, such as the sakia water wheel, and several types of fruit and vegetable were spread across the Islamic world.

The Neolithic Revolution is also known as the first agricultural revolution. Many human cultures moved from a lifestyle of hunting and gathering to one of agriculture and settlement.

Agronomy is the science and technology of producing and using plants for either food or materials.

Europe in the 18th to mid-19th centuries saw another flurry of agricultural development. The refinement of the seed drill by Jethro Tull (1674–1741) in 1701 built on technology that had originated in China and made its way to Europe via India in the middle of the 16th century. There were also improvements to crop rotation, new ploughs and new fertilizers, including the transportation of sodium nitrate deposits from Peru to Britain.

A: Guano (seabird and bat excrement) is an effective fertilizer. In the 19th century, the guano trade fuelled war and the expansion of colonialist power. It was superseded by artificial fertilizers after World War II.

B: Humans were changing the shape of the plants and animals around them long before genetic modification became an option, as these 19th-century depictions of prize-winningly large animals show.

People had been experimenting with the selective breeding of wheat, rice, horses and dogs for millennia, but it was around the middle of the 18th century that great advances were made, and a century later that, via Charles Darwin (1809–82), people started talking about the science behind it. Sheep were bred for wool or meat; cows were bred not only to pull a plough but also for their meat.

A 2018 study published by the Royal Society argues that the chicken offers an especially good example of how our approach to feeding ourselves has dominated the planet. Chicken meat consumption is growing faster than any other meat type; it could soon outpace pork. Since the Chicken-of-Tomorrow programme in the early 1950s, launched to encourage the development of higher meat-yielding birds, chicken farming has developed into a complex mechanized system integrating computer software, transportation vehicles, refrigeration, heating, feed processing factories and more. There are now 23 billion so-called ‘broiler’ chickens in the world, bred and reared for their ability to feed us. Compare chicken bones from archaeological sites of Roman Britain with those of a modern broiler, and some are triple the width and double the length. Broiler chickens have a life span of five to seven weeks (compared with 11 years for their wild ancestors, and a year for egg-laying hens) and we slaughter 66 billion a year.

A: This 1595 map shows the routes around the world of Sir Francis Drake, who led the second circumnavigation of the world in a single expedition between 1577 and 1580. The first was led by the Portuguese explorer Ferdinand Magellan between 1519 and 1522.

Another possible start date to the Anthropocene is 1610, proposed by earth systems scientists Mark Maslin and Simon Lewis. With this, they point towards the impact of global trade.

Why 1610? There is an observable dip in atmospheric carbon dioxide. As Maslin and Lewis explain, 50 million indigenous Americans died during the first few decades of the 16th century, following the landing of the Europeans in 1492. The land, previously farmed by these now decimated communities, started to shift back into forests. The trees had, at least temporarily, space to grow without humans cutting them down. This regrowth absorbed enough carbon dioxide to create an observable dip in ice core records around 1610. Some researchers even think it could have contributed to the so-called Little Ice Age.

The Little Ice Age was a period from around 1300 to 1850 that seems to have been particularly cold in many parts of the world, with especially cold intervals in 1650, 1770 and 1850. Not cold enough to be a true Ice Age, it was given the title in 1939. Various causes have been suggested, including heightened volcanic activity and decreases in the human population.

The colonization of the Americas also led to a growth of global trade networks that linked Europe to the Americas, and intensified travel to Asia and Africa. For this reason, Maslin and Lewis call their theory supporting the 1610 start date the Orbis Hypothesis, meaning ‘world hypothesis’, arguing that it marks the beginning of a modern whole-world system. These new trade routes shifted not only people, but other species, too. Plants were moved by humans from one continent to another: sugar cane, oranges, rice and, eventually, coffee were shipped to the Americas, and potatoes, tobacco, maize and tomatoes from the Americas to Europe. Animals moved, too: horses, cows, pigs and, more by accident than design, several types of earthworm. Chickens were brought to the New World by Spanish colonists in the 16th century. As with much of the Anthropocene, not all of this was necessarily harmful for the environment, even if it was very clearly bad for the many people who were exploited by the system. Some was just new.

The Orbis Hypothesis is named from the Latin word for world – ‘orbis’ – because it focuses on the environmental impact of globalized trade and colonialism following 1492.

The London plane tree offers an iconic example, a now widespread species that would not exist if it were not for this new movement of people around the globe. It is a hybrid of two trees: the American sycamore and the Oriental plane. It is not certain exactly where and when the hybridization took place, possibly in Spain, or in London itself, where the tree was first discovered in the mid-17th century by botanist John Tradescant the Younger (1608–62) in his Vauxhall gardens. The trees were planted in London parks in the 18th century and then in the 19th century, taking inspiration from a Parisian trend for leafy boulevards, they began to line London’s streets. In the 20th century, the species was planted throughout Central Park in New York City and even referenced in the park’s logo. Hay fever sufferers may curse the London plane, but it is most interesting for simply existing: a combination of two plants that started life on entirely different sides of the world, brought together by the way humans circumnavigated their planet.

A: Illustrations from French pharmacist Pierre Pomet’s 1737 book, A compleat history of drugs, showing, clockwise from top left, harvests of indigo (fabric dye), sugar, roucou (food colouring) and tobacco.

Many of the botanical stories of colonialism are less benign. Crops such as rice, cotton, breadfruit, tobacco and sugar not only crossed oceans, they also relied on slave labour and, especially when it came to rice, the expertise of enslaved Africans. In the process, slavery bankrolled large parts of the Industrial Revolution, as the white people who grew rich off the profits could plough their money into new technologies and research. This is another reason why the Orbis Hypothesis is a good candidate for the start of the Anthropocene: the technologies of the Industrial Revolution share a common, colonialist ancestor with that dip in carbon dioxide in 1610.

Perhaps most interesting, from the perspective of our modern environmental crisis, is the case of the oil palm. Native to west and south-west Africa, the trees were taken to the Indonesian island of Java by the Dutch in 1848, and then to Malaysia by the British in 1910. Today, Indonesia produces about half of the world’s palm oil, with Malaysia providing another third and the rest coming from countries across the tropics, including Thailand, Colombia and Nigeria. You have almost certainly eaten palm oil: in bread, chocolate, ice cream or biscuits. You might have used it in cosmetics or fuel, too.

Between 1980 and 2014, annual global palm oil production increased from 4.5 million tonnes to 70 million tonnes (4.9 million to 77 million tons), and palm oil demand is expected to grow at 1.7% per year until 2050. Industrial-scale oil palm plantations now occupy an area of 18.7 million hectares (46.2 million ac) worldwide (and that figure does not include smallholder plantations). The UN highlights 193 critically endangered, endangered and vulnerable species threatened by palm oil production, including chimpanzees in Nigeria, tigers in Thailand and Indonesia, orangutans in Malaysia and cassowaries in Papua New Guinea.
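Compounding that growth rate gives a rough sense of scale. This is a hedged back-of-envelope sketch, not a forecast; it simply extends the cited 1.7% per year from the 2014 figure:

```python
# Rough compound-growth projection from the figures cited in the text:
# 70 million tonnes in 2014, growing at 1.7% per year until 2050.
# An illustration of scale only, not a real forecast.

baseline_mt = 70.0      # million tonnes produced in 2014
growth_rate = 0.017     # 1.7% per year
years = 2050 - 2014     # 36 years

projected_mt = baseline_mt * (1 + growth_rate) ** years
print(f"Projected 2050 production: {projected_mt:.0f} million tonnes")
# -> roughly 128 million tonnes, nearly double the 2014 figure
```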

Whatever date is chosen to start the Anthropocene, there is a profound philosophical point contained in the Anthropocene thesis. As Maslin and Lewis note, it has become commonplace to talk about the history of science as shifting humanity’s view of itself further away from the centre of the universe, challenging our egotism. In 1543, Copernicus (1473–1543) published his astronomical model, putting the sun, not Earth, at the centre of the solar system. In 1859, Darwin established humans as simply part of a tree of life with no special origin. Anthropocene thinking turns this on its head, reasserting humans as active agents in Earth’s functioning. It should also invite us to consider the inequalities between humans.

B: Palm oil awaiting shipment in Nigeria, 1922. Until 1934, Nigeria was the world’s largest palm oil producer. Palm oil requires much less land than other vegetable oil crops, but the industry is often criticized for driving deforestation.

A/B: Cathedral Rocks (left) and Vernal Fall (right), both in Yosemite and captured by photographer Carleton Watkins. They reflect the way an aesthetic of the national sublime has continued to be re-packaged for tourism and national identity in the USA. Yosemite was one of Watkins’ favourite subjects, and his photographs of the valley are thought to have significantly influenced the United States Congress’ decision to preserve it as a National Park.

Whether you are disgusted or delighted by this image of a human-dominated planet is often down to personal philosophy or faith. Some developments have undoubtedly been for the good; others are harder to defend. Many sit in between: a mix of inspiring and enraging.

One way to get a grip on the Anthropocene can be found in 18th-century philosophies of the sublime. To philosophers such as Edmund Burke (1729–97), there was a strong distinction between the merely beautiful and the sublime, the latter being somehow terrifying.

For Immanuel Kant (1724–1804), to experience the sublime is more specifically to feel in awe. We can feel small, weak, insignificant or reverential in comparison, although crucially we also recover some sense of superior self-worth with the realization that the mind was able to conceive something so large and powerful in the first place. Such ideas influenced many European and American artists, but they also speak to broader relationships between humans and the rest of the natural world, in particular a changing understanding of who holds the power.

The classic study Mountain Gloom and Mountain Glory (1959) by Marjorie Hope Nicolson (1894–1981) describes how 17th- and 18th-century European explorers transferred a sense of awe once held for God onto the new and vast mountain ranges, waterfalls and rainforests they encountered. Her analysis helps unpick some of the hopes, fears and aesthetics embedded in our historical relationships with natural and built environments. It helps explain why so much science writing is still infused with religious language. Indeed, the texts these explorers inspired and produced were among early forms of popular science.

A: The tops of these skyscrapers peeking out above the clouds, as the sun rises in Dubai, United Arab Emirates, may evoke in some a sense of awe at the technologically sublime.

In his book American Technological Sublime (1994), historian of technology David E. Nye (b. 1946) develops this idea further. By Nye’s reading, this sense of awe has been transferred once again, this time to modern engineering projects such as bridges, electric lighting or skyscrapers. Because this sublime object is human-made, Nye posits, there is another layer of power involved. We look at it and think, ‘Wow, it is huge and we are tiny,’ as we might in front of a mountain, but we know that people made these things, not nature or God, so we can feel greater connection to them and, possibly, a greater control over the world, too.

Those of a humanist persuasion might find celebration in this. Still, the idea that we have a connection to something purely because it is human-made ignores the social inequalities wrapped up in any technology. Some of us can build skyscrapers; some of us have to live in their shade. Arguably, this is often part of the reason they are built: they are the castles of their day, designed to show power.

Humanism is a philosophical and ethical stance built around the value and agency of human beings, rather than supernatural forces.

The people who caused the environmental problems are not, most of the time, those who are worst affected by them. There is much that humans share in the Anthropocene – materials, histories, biologies, technologies, powers and dangers – but plenty that we do not share. As Chapter 2 illustrates, we have produced an alarming set of environmental crises.

As things hot up, it will be increasingly difficult for anyone to escape entirely, but the crises will hit some people faster and far harder than others.

B: Despite their proximity to the skyscrapers of Rio de Janeiro, Brazil, the children playing in this favela – a low-income area on the outskirts of the city suffering from government neglect – may well feel disconnected from the wealth and power they represent.