Gregory H. Maddox
The fields of African history and environmental history both emerged within the discipline of history in the Western academy in the late 1950s and early 1960s. The subjects had, of course, been around for much longer, but it was not till then that the discipline itself finally came to acknowledge that history was more than the deeds of dead great men and to explore sources beyond the documentary record. Both fields sought to uncover agency in subjects that had most often been treated as objects, and both emerged in contexts that seemed to require that scholars of the past participate in the struggles of the present. Despite the similar circumstances of their birth, African environmental history has brought its own distinctive approach to the concerns of environmental history. African environmental history shares with the broader field of environmental history its focus on the autonomy of the natural world and the place of humanity as but one species in that world, but it also explores the struggles of human communities to create the conditions for survival. African environmental history seeks to give agency to both nature and humanity as mutually constructing each other.
Environmental history, broadly, has focused on several themes. Perhaps most notably, historians have emphasized the ways in which human societies have changed and often degraded environments. Following the work of the Annales historians, many have also sought to explicate the deep interconnectedness between human societies and environment (Braudel 1972; Le Roy Ladurie 1971). Following Alfred Crosby, among others, many also focus on humans as biological entities in constant interaction with the rest of the biological world (Crosby 1986). Environmental historians also explore conceptions of the natural world within specific cultural and intellectual contexts. As such, environmental historians seek “a place for stories,” in William Cronon’s phrase, that integrates these factors (Cronon 1992).
As befits fields born in the 1960s, this first intersection between African and environmental history occurred around the issue of European expansion and domination over Africa as well as the rest of the world. In 1968 Philip Curtin, one of the earliest practitioners of African history in the United States, published an article arguing that Africans became the labor force of choice in tropical New World colonies because of their resistance to tropical diseases such as malaria and yellow fever, which European expansion had carried across the Atlantic, as well as to other Old World diseases such as smallpox (Curtin 1968). Alfred Crosby expanded this critical insight into his works on the remaking of environments across the globe as a result of what he termed the “Columbian Exchange” (Crosby 1973, 1986). Crosby’s works have become a touchstone for global environmental history,1 and, while he perhaps paid less attention to Africa than to other parts of the world, others have usefully mined his insights.
Curtin’s focus on the disease environments of Africa and the constant process of coevolution among disease‐causing organisms, hosts, and vectors points to one of the most important themes of African environmental history: the long struggle to survive in difficult and changing environments (Curtin 1985, 1998, 2001).2 James Webb has taken up Curtin’s perspective on the importance of disease environments and pushed it deeper into the past, showing how malaria shaped the distribution of human population in Africa and called forth genetic adaptations that improved the likelihood of survival in areas home to the mosquitoes carrying the disease‐causing parasite (Webb 2005, 2009). John Ford, in one of the pioneering works of the field, examined the relationship between tsetse‐fly‐borne trypanosomiasis, which causes sleeping sickness in humans and livestock, and settlement and land use patterns in East Africa (Ford 1971). James Giblin followed up with detailed examinations of how African societies sought to manage exposure to the vectors carrying the disease through environmental manipulation (Giblin 1990).
The emphasis on adaptation to the environment and on shaping of the environment seen in the histories of disease and environment carries through into longer‐term histories of humanity and the natural world. To some extent African environmental history stands environmental history on its head. As Cronon points out, environmental history has often taken the form of a declensionist narrative that chronicles the destruction of nature by the encroachment of human civilization (Cronon 1992: 1352). African environmental history often takes a more constructive approach, emphasizing the way in which humanity and environment mutually shape one another. In the longest perspective, this vantage point draws on the fact that African environments actually shaped humanity in its earliest forms. The evolution of species ancestral to humans and of modern humanity all took place within Africa (Stringer and McKie 1996).
This long‐term perspective leads to a deeply environmental view of the course of human history in Africa. The most important scholars of this deep history combine archaeology with historical linguistics to hypothesize about the ways in which, over the course of tens of thousands of years, human communities in Africa developed technologies to survive in its varied environments and cultures that allowed the transmission of this hard‐won knowledge across generations. They view long‐term climate change as an integral and even causal agent in this process.
The linguistic historian Christopher Ehret’s work presents the most comprehensive picture of this process (Ehret 1998, 2002). Ehret draws on the painstaking work of archaeologists in uncovering the evidence of human material cultures and of the gradual development of the technologies communities created to ensure their survival, and in several instances he has goaded archaeologists into examining their assumptions about African backwardness (Clark and Brandt 1984; Sutton 1996). Ehret’s most important insights concern the development of food production in Africa. He used linguistic evidence to link the scattered archaeological evidence on the development of food production into a more coherent picture. He emphasized not diffusion from the ancient Near East, as many had assumed given the earlier paucity of evidence, but the development of food production techniques within African environments, drawing on local plant and animal resources. In many cases his suggestive conclusions led to new research that substantiated his broad findings.
Using linguistic and archaeological evidence, Ehret suggested that cattle were domesticated in Africa before they arrived from the Near East. Work in southern Egypt has provided signs that domesticated cattle appeared 8,000 years ago and spread across the Sahel thereafter (Wendorf, Close, and Schild 1985). The disease environment, which seems already to have included trypanosomiasis, apparently blocked the further spread of cattle, confining them to the more arid savannas. Only about 3,000 years ago does evidence emerge of cattle keeping in eastern Africa south of northern Kenya (Gifford‐Gonzalez 1998, 2000).
Further, Ehret’s work has helped push back the timeline and diversify our understanding of plant domestication in Africa. Ehret sees at least three major areas of domestication in Africa, each giving rise to a different “tradition” of agriculture. In the forest–savanna borderlands, he argues, yam cultivation became a staple that expanded across Africa from west to east and gradually through the forested regions of West and Central Africa. In the central Sahel, communities domesticated sorghum as well as keeping cattle. In the Ethiopian highlands, millet and teff developed as the main crops. Plant domestication did not occur overnight; it was a long process spread over centuries at least. African cultivators also eventually supplemented their own crops with others from elsewhere in the Old World, especially Asia. The “Monsoon Exchange” brought rice, bananas, and domesticated fowl such as chickens to Africa while carrying sorghum, millets, and African cattle to Asia.
Ehret’s strength here is his willingness to draw a big picture over long timescales out of diverse evidence. In some respects his conclusions represent suggestions for further research; the exact modalities of the broad changes he describes remain subject to verification by other forms of evidence. In addition, while his broad picture adheres to environmental contours in locating regions for domestication and the potential spread of technologies, it presents a rather static picture of the interaction between human societies and the world around them. Others have developed a more dynamic view of the relationship between environment and human activity.
The archaeologist Roderick McIntosh has developed perhaps the most evocative approach to the dynamics of environmental change and human activity. Drawing on the work of many collaborators, McIntosh has integrated landscape and landscape change into a history of change and urbanization in the Niger Bend region of the Sahel in West Africa. With Susan Keech McIntosh, he has shown how the development of very early urban centers along the Inner Niger Delta both coincided with agricultural intensification based on African rice as well as millets, and responded to cycles of climate change (McIntosh 1997, 1998). Most importantly, he postulates that urban social structure in the region responded to wetter and drier periods in the Sahel as populations moved south during the dry phases and north during the wet phases (McIntosh 1993). He calls this dynamic a “pulse model,” and it has applicability outside the Inner Niger Delta.
McIntosh’s integration of environmental variability as a dynamic factor in human history has been taken up by other scholars (see the essays in McIntosh, Tainter, and McIntosh 2000). While scholars have generally assumed that the extensive nature of food production in sub‐Saharan Africa limited the long‐term effect of humanity on African landscapes until recent centuries, some have also begun to look at the interplay of human action and climate variability on landscapes. Sharon Nicholson pioneered the study of climate variability, using the annual flood level of the Nile Delta as an indicator of rains in East Africa (Nicholson 1976, 1981). Her work has led other scholars to attempt to relate historical phenomena to climate variability, such as George Brooks’s work on conflict in the Sahel (Brooks 1993).
The archaeologist Peter Schmidt has taken this one step further, postulating that a rapid expansion of iron production in the Great Lakes region of Africa in the centuries around the beginning of the Current Era resulted in widespread deforestation of a large area west of Lake Victoria Nyanza and, after about the seventh century CE, a sharp decline in population. He uses not only archaeological evidence showing the extensive use of particular wood types as fuel for smelting, but also pollen records derived from cores taken in lake beds and sedimentary evidence. After the decline in population in the area west of the lake, forests regrew, and in the fourteenth century populations in the region began to increase as farmers adopted new varieties of bananas as staple crops and iron smelting became more efficient in its fuel use. By the nineteenth century, however, forest cover had declined to the point where iron was imported into the region (Schmidt 1997).
David Schoenbrun has sought to reconstruct the history of landscape and cultural change in the Great Lakes region, focusing not just on ironworking but on changing food production technologies against a backdrop of climate variability. He argues that the spread of bananas and plantains as staple crops allowed for a dramatic remaking of land use in the region. Banana production allowed for denser populations in well‐watered areas along the lakes and in highland zones. Stock rearing then concentrated in the lowlands, where it competed with rain‐fed grain cultivation for access to the best land. He suggests that societies in the lakes region integrated these different ecological zones through social differentiation, perhaps leading to the stratified societies that existed in the region such as Rwanda, Bunyoro, and Buganda (Schoenbrun 1990, 1993a, 1993b, 1994, 1998).
Changing technology in food production over the long term is also the focus of James McCann in his work on Ethiopia (McCann 1995). McCann examines the development of plow agriculture in the Ethiopian highlands. In general, animal‐traction plowing did not exist in sub‐Saharan Africa outside the northeastern quadrant consisting of the Ethiopian highlands and Nilotic Sudan. The presence of trypanosomiasis and other diseases, along with the thinness of many African soils, inhibited the spread of plow agriculture. In the Ethiopian highlands, the combination of relatively regular rainfall with cooler temperatures meant that plowing on terraced land became established in the last centuries before the Current Era. McCann goes on to argue that plow agriculture came to define Ethiopian civilization, and that areas of the highlands suitable for the system became integrated into the highland social order. The broader system, though, was subject to shocks from climate variation, with McCann arguing that the region in the long run was caught in “Malthusian scissors” that limited population growth.
The intensification indicated by the use of plow agriculture was not the only form that existed in sub‐Saharan Africa. Esther Boserup famously argued, using primarily African evidence, that population growth, rather than locking societies into Malthus’s cycle of demographic boom and bust, provided the conditions for technological innovation to increase agricultural productivity (Boserup 1965). In many regions of Africa, agriculture remained extensive as population densities stayed low, and farmers frequently relied on shifting fields to maintain fertility (Jacobs 2003). However, intensification occurred in areas where productive land, or land suitable for specific crops, came to be in short supply.
In West Africa, rice cultivation led to complex systems designed to ensure access to floodplains for cultivation. Farmers also grew African rice using dry‐land techniques that required intensive labor inputs to ensure adequate supplies of moisture (Carney 2001; Richards 1985). Walter Hawthorne has argued that the era of the Atlantic slave trade saw intensification of production in seaside lagoons as people sought refuge from violence and increased their use of iron implements in production (Hawthorne 2003).
Intensification also occurred in highland regions, especially in eastern Africa. Mountains and highlands running from Ethiopia to South Africa generally receive more rainfall than the surrounding lowlands. In many cases, such as Mounts Kilimanjaro, Meru, and Kenya, soils on the mountains also proved more fertile than those of lowlands. These uplands quickly came to host dense populations, especially after the spread of banana cultivation as a staple crop sometime before the beginning of the second millennium of the Current Era. Intricate irrigation systems came into existence on many of these highlands, managed by local collaborative institutions, and farmers kept at least some livestock in pens at the home sites to ensure a supply of manure for the bananas. These well‐watered areas operated in a system that linked communities across ecological zones. Households often had fields in different areas to ensure supplies of grain as well as bananas, and either owned cattle herds that grazed in the lowlands or traded with cattle herders for access to cattle for their homesteads (Widgren and Sutton 2004).
Intensification also occurred where access to water in arid regions made agricultural production possible. The archaeological site of Engaruka in northern Tanzania reveals an intricate irrigation system that channeled water down the slope of the Rift Valley onto fields spread across the drier lowlands. The system operated for several centuries before being abandoned in the seventeenth century, possibly as a result of the expansion of Maasai pastoralists into the region (Sutton 1984, 2004). Around Lake Baringo in Kenya, likewise located in an arid region, demand for food for caravans in the eighteenth and nineteenth centuries sparked the development of an irrigation system drawing on the waters of the lake (Anderson 1988, 2002).
If African societies showed a long‐term ability to adapt to many different environments, they especially displayed it after the advent of the Columbian Exchange. With extensive trade links across the Sahara to North Africa and along the Indian Ocean coast to Southwest and South Asia, Africa had never been isolated from the rest of the world. The advent of direct contact with Europe along the Atlantic coast and then into the Indian Ocean marked as big a change for African societies as it did for the rest of the world. People, of course, made up Africa’s biggest contribution to the Columbian Exchange: upwards of 13 million Africans left the continent involuntarily between the fifteenth and nineteenth centuries. The advent of trade both in captives and in other commodities shifted wealth and power in western Africa toward the coast and away from the Sahara. Although the loss of population was spread out both over time and geographically, it dramatically reduced population growth in parts of the region (Manning 1990).
Africa’s participation in the Columbian Exchange was otherwise not as dramatic. Africans adopted New World crops with alacrity. In particular, maize and cassava spread rapidly after the sixteenth century. Both crops fitted into existing cultivation systems, giving African farmers more flexibility in crop choices. Maize, which generally requires more moisture than the African grains of sorghum and millet, also had a shorter growing season and often higher yields. African farmers experimented with maize and developed their own varieties, bred for their environments. Many areas, especially in the more arid regions, continued to rely on sorghums and millets (McCann 2004). Slave traders seem to have encouraged the production of cassava because the root stored well on the transatlantic voyage. It fitted into yam cropping systems, as did other types of yams introduced into Africa during this time (Manning 1990). One of the more interesting elements of the exchange is the possibility that African farmers, brought to the Carolinas as slaves, introduced rice cultivation there, or at least used their experience to make possible its expansion on plantations (Carney 2001).
The importance of African diseases in the Columbian Exchange has been noted above. Malaria and especially yellow fever played large roles in decimating indigenous American populations and limiting the growth of European populations in the tropics (McNeill 2010). These same diseases also limited the ability of Europeans to settle in tropical Africa. They did not, however, stop Europeans from colonizing the one neo‐Europe on the continent, the extreme southern tip at the Cape of Good Hope.
Beinart and Coates have compared the settlement of South Africa, beginning in the seventeenth century, to the settlement of the colonies that became the United States. As in the New World, the Dutch settlers at the Cape found a population relatively isolated from the rest of the world. The arid region surrounding the Cape had limited the expansion of agriculture into the area, along with the denser populations and Old World diseases that came with it. Dutch settlers set about creating their neo‐Europe by beginning wheat and grape cultivation, importing sheep, and driving local populations from the land, assisted by a demographic collapse of the Khoisan‐speaking peoples of the area. Settler expansion famously reached an early limit at the Fish River, which roughly marked the boundary between the region where Africans grew African crops and environments not suited to those crops. This rough, permeable boundary would hold until the nineteenth century (Beinart and Coates 1995; Elphick 1977).
The nineteenth century saw the coming of direct European rule over most of the African continent. This expansion had political, economic, and even, in a way, moral causes, but, as many have argued, it was very much shaped by environmental constraints and possibilities. Richard Grove has most famously argued that European expansion was driven not just by a desire to take control of people and resources but also by a conservationist ethic that sought to preserve the natural world as well as exploit it. Such perspectives meant overriding the claims and practices of African peoples and states in favor of what Europeans took to be the superior knowledge produced by Western science (Grove 1995, 1997).
The immediate transition to imperial rule, however, resulted not in conservation but in destruction. In the first half of the nineteenth century “legitimate commerce” expanded as European demand for captive labor declined. African societies across the continent began to export raw materials and agricultural commodities both to Europe and to Asian markets. In addition, exports of natural resources and animal products such as ivory expanded. Scholars who have studied these transitions have tended to focus on the disruptive nature of this commercial expansion. In one of the most influential studies of a region during this transition, Helge Kjekshus argues that African societies in what became first German East Africa and then British Tanganyika had generally developed agricultural and pastoral systems that “controlled” their environments and limited the risk of famine. He suggests that the expansion of the slave and ivory trade in the area in the nineteenth century, and then European conquest in the century’s last two decades, disrupted these control mechanisms and brought population decline (Kjekshus 1977). Mike Davis has elaborated this perspective, arguing that the late nineteenth century saw what he calls a “late Victorian holocaust” as European expansion caused the death and destruction not only of conquered states and societies but also of finely tuned local systems of environmental management (Davis 2001).
While the imperial project fundamentally sought to destroy the autonomy of African societies and to integrate them economically and politically into a global imperial order, colonial states, as they came to be established, also sought to restructure African landscapes radically. Much of this reimagining was based on an economic vision of African resources feeding metropolitan economies and African labor becoming both producers and consumers in the global economy. As such, colonial regimes followed paths toward those goals partly determined by environment. In parts of the continent where disease environments allowed Europeans to survive without extensive landscape modification, European settlers often controlled large estates and used the power of the colonial state to compel Africans to become cheap labor for their enterprises. Regions holding exploitable raw materials, especially minerals, also saw enclave development using cheap African labor, with the added complication of ensuring adequate supplies of food for the workforce. Most African colonies, though, saw colonial regimes attempt to encourage commodity production by African farmers and herders for market. Many colonies contained more than one of these types of areas, and, as a number of scholars have pointed out, colonial policies on resource management developed in a context of imperial institutions that sought to find best practices through extensive communication across imperial boundaries (Fairhead and Leach 1998; Hodge 2007; Tilley 2011).
Perhaps the most profound change for African landscapes in the late nineteenth and early twentieth centuries was the opening up of large parts of the continent to more rapid transportation, not just of people but of goods as well. Starting in southern Africa, colonial regimes invested heavily in transportation infrastructure, most notably in railways. Most lines sought to connect perceived areas of export production with ports. In South Africa, railways linked the gold and diamond regions with both export–import hubs and sources of labor and food for that labor. Railways, steam power on some rivers and lakes, and roads and motorized transport remade transportation networks that had previously relied on porterage and on human and sail power on water. Trade and human networks “snapped” to these new modes of transport. These networks also continued the extractive orientation of trade that had developed in the nineteenth century; they ran from the interior to the coast and hardly ever linked colonies ruled by different imperial powers, a pattern that remains true for the most part up to the present (Maddox 1998).
The development of transportation infrastructure had the intended effect of expanding production for markets, especially for export. In the very early years of colonial rule, such production often took the form of continuing precolonial practices of resource extraction at a grander and more destructive pace. Ivory exports led to the decimation of elephant herds, and rubber tapping in the Central African rainforest increased dramatically (Harms 1987). Such production occurred both because colonial states encouraged settler estates where possible and because African producers responded to the possibilities of new markets, often in ways not foreseen by colonial officials. Cocoa production expanded in the Gold Coast and elsewhere in West Africa (Hill 1970; Hart 1982), and coffee production in several parts of East Africa (Mbilinyi 1976; Weiss 2003), despite official indifference or opposition. Likewise, the production of food crops for growing colonial urban areas sometimes conflicted with colonial concerns about local food supplies in rural areas (Monson 1991).
The expansion of production led, by the early twentieth century, to a growing concern with sustainability in African agriculture as practiced by both African farmers and European settlers. Such concerns became part of regular discourse among colonial officials in the 1930s, informed in part by American experiences during the Dust Bowl era, but fears about “unscientific” African farmers and “greedy” European settlers “mining the soil” date back to at least the late nineteenth century (Anderson 1984; Beinart 1984; Grove 1989). In settler colonies such as Kenya and South Africa, colonially inspired conservation efforts focused first on supporting settler agriculture, often by trying to limit African access to land in the name of conservation. Elsewhere, efforts in the 1930s focused on soil conservation, often through the imposition of labor‐intensive changes to agricultural practices (Anderson and Grove 1987).
The end of World War II saw a major expansion in colonial efforts to promote what came to be known as development. These efforts included increased spending on research about African environments and investment in productive growth. The French constructed a massive irrigation project on the Niger River after the war to promote cotton production. While cotton failed to become a large export, the project found a function as African farmers began to produce rice, mostly for local markets (van Beusekom 2002). The British Groundnut Scheme sought to grow peanuts on supposedly empty land in Tanganyika and Nigeria using military surplus equipment. The project failed miserably because the land, though deemed underutilized by “primitive” African agriculture, in fact had no water supplies (Hogendorn and Scott 1983). Many other projects, great and small, failed; those that succeeded did so most often when African farmers and herders found their own ways to use the resources such projects brought. Such schemes commonly called forth resistance from communities asked to, in the words of a colonial official in Tanganyika, “undertake significant increases in labour for no apparent gain.”3
If control over the productive processes in agricultural and pastoral production often served as a flash point for conflict, control over land and access to its resources did so even more. Conflict over access to resources took three basic forms: alienation of land for settler estates or extractive industry; conservation of resources, especially forests, in ways that limited local communities’ access to them; and preservation of wilderness and wildlife. All three forms relied on violence by colonial states and their independent successors for their achievement.
Alienation of land for white settlement began, of course, in South Africa in the 1700s. The idea that “idle” or ill‐used land existed in large quantities in Africa, though, became part of the drive for colonial expansion in the second half of the nineteenth century, bolstered in part by white South Africans’ claims that their expansion in the first half of the century had been into primarily empty land. Such claims were not true in South Africa or anywhere else on the continent: European appropriation of land occurred against the resistance of African communities. Eventually settlers controlled much of the land in Southern Rhodesia (Zimbabwe) and Kenya, and smaller settler populations existed in Tanganyika, Northern Rhodesia (Zambia), the Belgian Congo, and a few of the French colonies. African resistance to land loss fueled uprisings in Zimbabwe (Alexander 2006; Alexander, McGregor, and Ranger 2000) and Kenya (Anderson 2005; Throup 1987). In Tanganyika, a post‐World War II effort to expand settler estates around Mount Meru became a catalyst for nationalist agitation (Spear 1997). The legacy of white settlement and resistance continues to define struggles over land in South Africa and Zimbabwe, and such issues have gained new salience as, in the last decade, African states have taken to selling or leasing land to foreign interests for food production and to collaborating in the removal of existing communities from their often long‐held territories.
Forest lands also became a major site of contention over the course of the colonial era and beyond. As Fairhead and Leach have shown, European conceptions of African forests often began with a serious misconception and proceeded on the basis of a need to ensure the profitable exploitation of forest resources (Fairhead and Leach 1998). European scientists and administrators often assumed that the forests they found in the rainforest belt and on mountain ranges had recently come under threat from the expansion of wasteful exploitation by African farmers. They set about “conserving” forest resources by restricting African access to forests and regulating exploitation (Conte 2004; Fairhead and Leach 1998; Sunseri 2009). Fairhead and Leach argue that African landscapes at the beginning of the colonial era represented the result of centuries of human residence and exploitation within particular environments. In many cases, human action had promoted the spread of tree cover in areas where naturally occurring fires during the dry season would otherwise have prevented tree growth (Fairhead and Leach 1996; McCann 1997). In other areas, African communities used fire to control bush growth that harbored disease‐carrying insect vectors, promoting both animal and human survival (Ford 1971; Giblin 1990). Efforts to separate human activity from forests often led both to threats to forest survival and to the spread of disease (Hoppe 2003). At the same time, protecting forested areas became the centerpiece of the largest and most influential environmental movement in postcolonial Africa, the Green Belt Movement in Kenya, led by the Nobel laureate Wangari Maathai. The movement focused not just on the loss of green space in a rapidly expanding Nairobi but also on the degradation of rural people’s lives due to loss of forests. It promoted planting trees to replace lost forests for use as well as to protect green space (Maathai 2006).
Perhaps no image is more evocative of Africa than that of its wildlife, often portrayed moving spectacularly through a virgin wilderness. It should come as no surprise that behind these images lies a great deal of struggle. As in America and the rest of the world, humans created wilderness (Cronon 1996). In eastern and southern Africa especially, the drive to create preserved areas that would become national parks came not just from a conservation ethic but also from a desire to preserve hunting grounds for Europeans in the early days of colonial rule. At first, these efforts often focused on managing elephant herds for ivory, but they expanded as knowledge of the necessity of habitat preservation developed (MacKenzie 1989; Steinhart 2005). In East Africa, the great rinderpest epidemic of the 1890s not only wiped out the herds of African pastoralists but decimated the game animal populations of the greater Serengeti ecosystem. Pastoral peoples were forced to move to agricultural areas and the new colonial cities, often as impoverished laborers, in an effort to garner resources to rebuild their herds. In the absence of their herds and their management of pasture through burning, bush that harbored tsetse fly expanded, making it impossible for pastoralists to return to many of the areas they had previously occupied. As a result, when the Germans and British began to create game reserves in the region, they took much of it to have been unoccupied since time immemorial. They framed attempts by pastoralists and farmers to return to their lands as encroachment on the newly created wildernesses. Creating these reserves became a constant struggle, with Africans trying to reclaim ancestral resources and Europeans decrying them as poachers and farmers bent on “deforestation” of the regions (Århem 1985; Neumann 1998).
The process in southern Africa was similar, with the great national parks developing in areas less hospitable to white settlement. South Africa itself had lost much of its biodiversity by the early twentieth century because of white hunting and the loss of habitat to mechanized agriculture (MacKenzie 1989). The South African state created the largest national park in the area, Kruger, in a region that hosted tsetse fly and malaria (Carruthers 1995).
In the second half of the twentieth century, international organizations replaced colonial states as the advocates for continuing preservation efforts. At the same time, postindependence governments have recognized the value of conservation both as a generator of foreign exchange, through tourism and, increasingly, payments for conservation, and as a means to preserve biodiversity. The ability of state or quasi‐state actors to engage successfully in conservation, however, has varied greatly. While states have often successfully defended the borders of preserved areas against encroachment by local communities, they have had less success in controlling poaching, especially of high‐value wildlife products such as ivory and rhino horn. Conflict with local communities that feel deprived of access to grazing, water, and farming land has occurred constantly. States and international organizations have developed a number of programs to enlist local communities in conservation and to give them benefits. These community conservation efforts have themselves been embroiled in conflict, as the goals of local communities and conservationists often differ radically (Adams and McShane 1996; Brockington 2002; Galvin et al. 2002; Perkin 1995).
As independence came to African nations from the 1950s onward, concerns over African environments grew. The postcolonial era has seen mostly the continuation of trends established during the colonial era. First, land use and agricultural productivity have generally expanded, growing at least enough to feed the vast majority of the population in most years. The expanded use of technology, including mechanization and the enhancement of soil fertility, has driven this growth. Population has continued to grow, putting pressure on land availability in some countries. Most of the population growth has occurred in urban areas, however, indicating the ability of most parts of the continent to produce enough food to feed this population. In some regions, wildlife conservation efforts have expanded as tourism has become a growing economic sector. In others, the lack of tourism has put extreme pressure on conservation efforts.
All of these trends face large challenges in the future. While population growth has moderated in recent decades, Africa continues to have the fastest‐growing population of any region in the world. Theoretically, continued adoption of improved technology means that African states could grow enough food to feed themselves; this growth, however, has put and will continue to put pressure on the environment, causing landscape change and loss of biodiversity. Climate change further complicates the picture. The warming of global temperatures will cause changes in weather patterns and most likely greater instability in those patterns. The ability of African communities to adapt to their environments, as they have done for thousands of years, will be put to the test.
Two particular foci, then, should animate the future of environmental history. One looks backward toward the biological adaptation of humanity and the rest of the natural world to changing environmental conditions. The mapping of the human genome holds the promise of new therapies to treat human disease, but it also allows a more sophisticated understanding of the origins, movements, and interactions of human populations on long timescales (Bryc et al. 2010; Campbell and Tishkoff 2008; Tishkoff and Kidd 2004; Tishkoff et al. 2009). The integration of this new type of evidence is still in its infancy, and for the most part it has tended to confirm the conclusions of historians working with linguistic and archaeological evidence, but it has the potential to greatly strengthen conclusions about early African history. The second focus looks forward to the fate of Africa’s environments in an age of rapidly changing climate. One of the great claims of environmental history is that it is firmly rooted in a longue durée approach to the existence of human societies on the earth. Nowhere is this type of history more necessary and more revealing than in Africa. Human societies, born in the natural world, recreating environments, and being shaped by them, have existed longer on the continent than anywhere else. Understanding the environmental history of Africa requires coming to terms not with Hegel’s antithesis of history, an Africa devoid of human influence, but with the very real ways humans made Africa.