11

CONFLICT IN THE ERA OF ECONOMIC DECLINE

Many of the readers of my writings, and those of my colleagues, have come to share a certain view of the world. It’s probably fair to say that, as a group, we see resource depletion, financial chaos, and environmental disasters (principally associated with global climate change) as looming storms converging on industrial civilization. We also tend to see the unprecedented level of complexity of our society today as resulting from the historically recent energy subsidies of fossil fuels, and to a certain extent the enabling factor of debt in various forms. Thus, as the quality and quantity of our energy sources inevitably decline and financial claims melt away with the ongoing burst of history’s greatest credit bubble, we see a simplification and decentralization of societal systems as inevitable.

In this essay, I hope to explore some of the broader social implications of simplification and decentralization. Will wars and revolutions break out with ever-greater frequency? Will democracy thrive, or will traumatized masses find themselves at the mercy of tyrants? Will nation states survive, or will they break apart? Will regional warlords rule over impoverished and enslaved survivors? Or will local food networks and Transition groups positively transform society from the ground up?

I don’t claim to have a functioning crystal ball. But tracing current trends and looking to historic analogies may help us understand our prospects better, and help us make the most of them.

The 21st Century Landscape of Conflict

Looking forward, four principal drivers of conflict are readily apparent. More may be lurking along the way.

First is the increasing prospect of conflict between rich and poor—i.e., between those who benefitted during history’s biggest growth bash and those who provided the labor, sat on the sidelines, or were pushed aside in resource grabs.

Economic growth produces inequality as a by-product. Not only do industrialists appropriate the surplus value of the labor of their workers, as Marx pointed out, but lenders accumulate wealth from the interest paid by borrowers. We see inequality being generated by economic growth in real time in China, where roughly six hundred million people have been lifted from poverty in the last 30 years as a result of average annual economic growth of about nine percent—but where economic inequality now surpasses levels in the United States.
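To get a feel for the scale of China’s expansion, consider a minimal compound-growth sketch (the nine percent rate and thirty-year span come from the figures above; the code itself is purely illustrative):

```python
# Minimal compound-growth sketch: what ~9% average annual growth,
# sustained for 30 years, implies for the overall size of an economy.
growth_rate = 0.09   # average annual growth rate cited above
years = 30

expansion = (1 + growth_rate) ** years
print(f"After {years} years the economy is ~{expansion:.1f}x its starting size")
# -> roughly 13x: growth on that scale can lift hundreds of millions out
#    of poverty while still concentrating most of the new wealth at the top.
```

An economy that grows roughly thirteenfold in a single generation creates enormous absolute gains even for those near the bottom, which helps explain why rising inequality is tolerated while the boom lasts.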

Just as economic growth produces winners and losers domestically, the level of wealth inequality between nations grows as the global economy expands. Today the disparity between average incomes in the world’s richest and poorest nations is higher than ever.

The primary forces working against inequality as economies grow are government spending on social programs of all sorts, and international aid projects.

As economic growth stops, those who have benefitted the most have both the incentive to maintain their relative advantage and, in many cases, the means to do so. This means that in a contracting economy, those who have the least tend to lose the most. There are exceptions, of course: billionaires can in theory go broke in a matter of hours or even seconds as a result of a market crash. But in the era of “too-big-to-fail” banks and corporations, government provides a safety net for the rich more readily than for the poor.

High and increasing inequality is usually bearable during boom times, as people at the bottom of the wealth pyramid are encouraged by the prospect of its overall expansion. Once growth ceases and slips into reverse, however, inequality becomes socially unsustainable. Declining expectations lead to unrest, while absolute misery (in the sense of not having enough to eat) often results in revolution.

We’ve seen plenty of examples of these trends in the past few years in Greece, Ireland, Spain, the United States, and the Middle East.

In many countries, including the US, government efforts to forestall uprisings appear to be taking the form of the criminalization of dissent, the militarization of police, and a massive expansion of surveillance using an array of new electronic spy technologies. At the same time, intelligence agencies are now able to employ up-to-date sociological and psychological research to infiltrate, co-opt, misdirect, and manipulate popular movements aimed at achieving economic redistribution.

However, these military, police, public relations, and intelligence efforts require massive funding as well as functioning grid, fuel, and transport infrastructures. Further, their effectiveness is limited if and when the nation’s level of economic pain becomes too intense, widespread, or prolonged.

A second source of conflict consists of increasing competition over access to depleting resources, including oil, water, and minerals. Among the wealthiest nations, oil is likely to be the object of the most intensive struggle, since oil is essential for nearly all transport and trade. The race for oil began in the early 20th century and has shaped the politics and geopolitics of the Middle East and Central Asia; now that race is expanding to include the Arctic and deep oceans, such as the South China Sea.

Resource conflicts occur not just between nations but also within societies: witness the ongoing insurgencies in the Niger Delta, where oil revenue fuels rampant political corruption while drilling leads to environmental ravages felt primarily by the Ogoni ethnic group; see also the political infighting in fracking country here in the United States, where ecological impacts put ever-greater strains on the social fabric. Neighbors who benefit from lease payments no longer speak to neighbors who have to put up with polluted water, a blighted landscape, and the noise of thousands of trucks carrying equipment, water, and chemicals. Eventually, however, boomtowns turn to ghost towns, and nearly everyone loses.

Third, climate change and other forms of ecological degradation are likely to lead to conflict over access to places of refuge from natural disasters. The responsible agencies—including the United Nations University Institute for Environment and Human Security—point out that there are already 12 million environmental refugees worldwide, and that this number is destined to soar as extreme weather events increase in frequency and severity. Typically, when bad weather strikes, people leave their homes only as a last resort; in the worst instances they have no other option. As America learned during the Dust Bowl of the 1930s, when hundreds of thousands were displaced from farms in the prairies, rapid shifts in population due to forced migration can create economic and social stresses, including competition for scarce jobs, land, and resources, leading to discrimination and sometimes violence.

Where do refugees go when the world is already full? Growing economies are usually able to absorb immigrants and governments may even encourage immigration in order to keep wages down. But when economic growth ceases, immigrants are often seen as taking jobs away from native-born workers.

For this reason as well, conflict will appear both within and between countries. Low-lying island nations may disappear completely, and cross-border, weather-driven migrations will increase dramatically. Inhabitants of coastal communities will move farther inland. Farmers in drought-plagued areas will pull up stakes. But can all of these people be absorbed into shantytowns in the world’s sprawling megacities? Or will at least some of these cities themselves see an exodus of population due to an inability to maintain basic life-support services?

Lastly, climate change, water scarcity, high oil prices, vanishing credit, and the leveling off of per-hectare productivity and the amount of arable land are all combining to create the conditions for a historic food crisis, which will impact the poor first and most forcibly. High food prices breed social instability—whether in 18th-century France or 21st-century Egypt. As today’s high prices rise further, social instability could spread, leading to demonstrations, riots, insurgencies, and revolutions.1

In summary, conflict in the decades ahead will likely center on the four factors of money, energy, land, and food. These sources of conflict will overlap in various ways. While economic inequality will not itself be at the root of all this conflict (one could argue that population growth is a deeper if often unacknowledged cause of strife), inequality does seem destined to play a role in most conflict, whether the immediate trigger is extreme weather, high food prices, or energy shortages.

This is not to say that all conflict will be over money, energy, land, or food. Undoubtedly religion will provide the ostensible banner for contention in many instances. However, as so often in history, this is likely to be a secondary rather than a primary driver of discord.

War and Peace in a Shrinking Economy

Will increasing conflict lead to expanding violence?

Not if psychologist Steven Pinker is right. In his expansive and widely praised book The Better Angels of Our Nature: Why Violence Has Declined, Pinker claims that, in general, violence has waned over the course of human history. He argues that this tendency has ancient roots in our shift from peripatetic hunting and gathering to settled farming; moreover, during the past couple of centuries the trend has greatly intensified. With the emergence of Enlightenment philosophy and its respect for the individual came what Pinker calls the Humanitarian Revolution. Much more recently, after World War II, violence was suppressed first by the “mutually assured destruction” policies of the two opposed nuclear-armed sides in the Cold War, and then by American global hegemony. Pinker calls this the Long Peace. Wars have become less frequent and less violent, and most societies have seen what might be called a decline of tolerance for intolerance—whether manifested in schoolyard fights, bullying, or picking on gays and minorities.

But there is a problem with Pinker’s implied conclusion that global violence will continue to decline. The Long Peace we have known since World War II may well turn out to be shorter than hoped as world economic growth stalls and American hegemony falters—in John Michael Greer’s words, as “the costs of maintaining a global imperial presence soar and the profits of the imperial wealth pump slump.”2 Books and articles predicting the end of the American empire are legion; while some merely point to the rise of China as a global rival, others describe the looming failure of the essential basis of the US imperial system—the global system of oil production and trade (with its petro-dollar recycling program) centered in the Middle East. There are any number of scenarios describing how the end of empire might come, but few credible narratives explaining why it won’t.

When empires crumble, as they always eventually do, the result is often a free-for-all among previous subject nations and potential rivals as they sort out power relations. The British Empire was a seeming exception to this rule: in that instance, the locus of military, political, and economic power simply migrated to an ally across the Atlantic. A similar graceful transfer seems unlikely in the case of the United States, as 21st-century economic decline will be global in scope. A better analogy to the current case might be the fall of Rome, which led to centuries of incursions by barbarians as well as uprisings in client states.

Disaster per se need not lead to violence, as Rebecca Solnit argues in her book A Paradise Built in Hell: The Extraordinary Communities that Arise in Disaster. She documents five disasters—the aftermath of Hurricane Katrina; earthquakes in San Francisco and Mexico City; a giant ship explosion in Halifax, Canada; and 9/11—and shows that rioting, looting, rape, and murder were not automatic results. Instead, for the most part, people pulled together, shared what resources they had, cared for the victims, and in many instances found new sources of joy in everyday life.

However, the kinds of social stresses we are discussing now may differ from the disasters Solnit surveys, in that they comprise a “long emergency,” to borrow James Kunstler’s durable phrase. For every heartwarming anecdote about the convergence of rescuers and caregivers on a disaster site, there is a grim historic tale of resource competition turning normal people into monsters.

In the current context, a continuing source of concern must be the large number of nuclear weapons now scattered among nine nations. While these weapons primarily exist as a deterrent to military aggression, and while the end of the Cold War has arguably reduced the likelihood of a massive release of them in an apocalyptic fury, it is still possible to imagine several scenarios in which a nuclear detonation could occur as a result of accident, aggression, preemption, or retaliation.3

We are in a race—but it’s not just an arms race; indeed, it may end up being an arms race in reverse. In many nations around the globe the means to pay for armaments and war are starting to disappear, while the incentive to engage in international conflict is increasing as a way of rechanneling the energies of jobless young males and distracting the general populace, which might otherwise be in a revolutionary mood. We can only hope that historical momentum can maintain the Long Peace until industrial nations are sufficiently bankrupt that they cannot afford to mount foreign wars on any substantial scale.

Post-Carbon Governance

Are we headed toward a more autocratic or democratic future? There’s no hard and fast answer; the outcome may vary by region. However, recent history does offer some useful clues.

In his recent and important book Carbon Democracy: Political Power in the Age of Oil, Timothy Mitchell argues that modern democracy owes a lot to coal. Not only did coal fuel the railroads, which knitted large regions together, but striking coal miners were able to bring nations to a standstill, so their demands for unions, pensions, and better working conditions played a significant role in the creation of the modern welfare state. It was no mere whim that led Margaret Thatcher to crush the coal industry in Britain; she saw its demise as the indispensable precondition to neoliberalism’s triumph.

Coal was replaced, as a primary energy source, by oil. Mitchell suggests that oil offered industrial countries a path to reducing internal political pressures. Its production relied less on working-class miners and more upon university-trained geologists and engineers. Also, oil is traded globally, so its production is influenced more by geopolitics and less by local labor strikes. “[P]oliticians saw the control of oil overseas as a means of weakening democratic forces at home,” according to Mitchell, and so it is no accident that by the late 20th century the welfare state was in retreat and oil wars in the Middle East had become almost routine. The problem of “excess democracy,” which reliance upon coal inevitably brought with it, was resolved, not surprisingly, by still more teams of university-trained experts—economists, public relations professionals, war planners, political consultants, marketers, and pollsters. We have organized our political life around a new organism—“the economy”—which is expected to grow in perpetuity or, more practically, for as long as the supply of oil continues to increase.

Andrew Nikiforuk also explores the suppression of democratic urges under an energy regime dominated by oil in his brilliant book The Energy of Slaves: Oil and the New Servitude. The energy in oil effectively replaces human labor; as a result, each North American enjoys the services of roughly 150 “energy slaves.” But, according to Nikiforuk, that means that burning oil makes us slave masters—and slave masters all tend to mimic the same attitudes and behaviors, including contempt, arrogance, and impunity. As power addicts, we become both less sociable and easier to manipulate.
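Nikiforuk’s “energy slave” figure is at bottom an arithmetic claim, and a back-of-the-envelope calculation shows how a number of that magnitude arises. Here is a minimal sketch; every input value is my own illustrative assumption, not Nikiforuk’s published method:

```python
# Back-of-the-envelope "energy slave" arithmetic. Every input below is an
# illustrative assumption, not Nikiforuk's actual methodology.
PRIMARY_ENERGY_PER_CAPITA_GJ = 300  # assumed North American primary energy use per person-year
USEFUL_WORK_FRACTION = 0.25         # assumed share of primary energy delivered as useful work
HUMAN_POWER_W = 75                  # assumed sustained mechanical output of a manual laborer
WORK_HOURS_PER_YEAR = 2000          # assumed work-year: 8 hours/day, 250 days/year

# Convert a laborer's annual output from watt-hours to gigajoules.
human_work_gj = HUMAN_POWER_W * WORK_HOURS_PER_YEAR * 3600 / 1e9

energy_slaves = PRIMARY_ENERGY_PER_CAPITA_GJ * USEFUL_WORK_FRACTION / human_work_gj
print(f"One laborer delivers ~{human_work_gj:.2f} GJ of useful work per year")
print(f"Implied 'energy slaves' per person: ~{energy_slaves:.0f}")  # ~140, near the ~150 cited
```

The result swings widely with the assumed efficiency and work-year, so the exact count matters less than the order of magnitude: fossil energy multiplies each person’s muscle power by a factor in the low hundreds.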

In the early 21st century, carbon democracy is still ebbing, but so is the global oil regime hatched in the late 20th century. Domestic US oil production based on hydraulic fracturing (“fracking”) reduces the relative dominance of the Middle East petro-states, but to the advantage of Wall Street—which supplies the creative financing for speculative and marginally profitable domestic drilling. America’s oil wars have largely failed to establish and maintain the kind of order in the Middle East and Central Asia that was sought. High oil prices send dollars cascading toward energy producers but starve the economy as a whole, and this eventually reduces petroleum demand. Governance systems appear incapable of solving or even seriously addressing looming financial, environmental, and resource issues, and “democracy” persists mainly in a highly diluted solution whose primary constituents are money, hype, and expert-driven opinion management.

In short, the 20th-century governance system is itself fracturing. So what comes next?

As the fracking boom unavoidably fails due to financial and geological constraints, a new energy regime will inevitably arise. It will almost surely be one mainly characterized by scarcity, but it will also eventually be dominated by renewable energy sources—whether solar panels or firewood. That effectively throws the door open to a range of governance possibilities. As mobility declines, smaller and more local governance systems will be more durable than empires and continent-spanning nation states. But will surviving regional and local governments end up looking like anarchist collectives or warlord compounds? Recent democratic innovations pioneered or implemented in the Arab Spring and the Occupy movement hold out more than a glimmer of hope for the former.

Anthropologist David Graeber argues that the failure of centralized governmental institutions can open the way for democratic self-organization; as evidence, he cites his own experience doing doctoral research in Madagascar villages where the state had ceased collecting taxes and providing police protection. Collecting revenues and enforcing laws are the most basic functions of government; thus these communities were effectively left to govern and provide for themselves. According to Graeber, they did surprisingly well. “[T]he people had come up with ingenious expedients of how to deal with the fact that there was still technically a government, it was just really far away. Part of the idea was never to put the authorities in a situation where they lost face, or where they had to prove that they were in charge. They were incredibly nice to [government officials] if they didn’t try to exercise power, and made things as difficult as possible if they did. The course of least resistance was [for the authorities] to go along with the charade.”4

Journalism professor Greg Downey, commenting on Graeber’s ideas, notes, “I saw something very similar in camps of the Movimento Sem Terra (the MST or ‘Landless Movement’) in Brazil. Roadside shanty camps attracted former sharecroppers, poor farmers whose small plots were drowned out by hydroelectric projects, and other refugees from severe restructuring in agriculture toward large-scale corporate farming.” These farmers were victims, but they were by no means helpless. “Activists and religious leaders were helping these communities to set up their own governments, make collective decisions, and eventually occupy sprawling ranches. . . . The MST leveraged the land occupations to demand that the Brazilian government adhere to the country’s constitution, which called for agrarian reform, especially of large holdings that were the fruits of fraud. . . . [C]ommunity-based groups, even cooperatives formed by people with very little education, developed greater and greater ability to run their own lives when the state was not around. They elected their own officials, held marathon community meetings in which every member voted (even children), and, when they eventually gained land, often became thriving, tight-knit communities.”5

A Theory of Change for a Century of Crisis

If groups seeking to make the post-carbon transition go more smoothly and equitably are to have much hope of success, they need a sound strategy grounded in a realistic theory of change. Here, briefly, is a theory that makes sense to me.

For the past four decades, since the release of Limits to Growth, there have been many scattered efforts to develop alternatives to our current fossil-fueled, growth-based industrial paradigm. These include renewable energy systems; local, organic, and permaculture food systems; urban design movements seeking to reduce the dominance of the automobile in our built environment; architectural programs with the goal of designing buildings that require no external energy input and that are constructed using renewable and recycled materials; alternative currencies not attached to interest-bearing debt, as well as alternative banking models; and alternative economic indicators that take account of social and environmental factors. While such efforts have achieved some small degree of implementation, varying significantly from place to place around the globe, they have generally failed to substantially reduce reliance on fossil fuels, blunt the overall momentum of society toward increased consumption, reduce financial instability, or curtail profound environmental impacts, including climate change and loss of biodiversity and topsoil.

What will it take for the conservers, localizers, and de-growthers to win? They have a lot stacked against them. The interests promoting a continuation of growth-as-usual are powerful and have spent decades honing advertising and public relations messages whose proliferation is subsidized by hundreds of billions of dollars annually. These interests have captured the allegiance of nearly every elected official in the world. Most ordinary folks are easily swept along because they want more and better jobs, cheaper gasoline, more flat-screen TVs, and all the other perks that come with fossil-fueled economic expansion.

The main downside to growth-as-usual is that it is unsustainable: it is destined to end in resource depletion, economic unraveling, and environmental catastrophe. The conservers, localizers, and de-growthers must therefore hope that if the growth-as-usual bandwagon cannot be turned back with persuasion, its inevitable crash will occur in increments, so that they can seize each step-down in industrial output as an opportunity to demonstrate and promote the need for alternatives.

Advocates of the post-carbon crisis theory of change can point to several useful historic examples. One is the transformation of Cuba’s food system during that country’s “Special Period” in the 1990s. The collapse of the Soviet Union and the resulting disappearance of subsidized Soviet oil shipments set the stage with a crisis. Several Cuban agronomists had previously advocated for more localized and organic agriculture, to no avail, but when the country was suddenly threatened with starvation, they were called upon to redesign the entire food system. The moral of the story: advocates of a post-carbon economy are likely to make limited headway during times of cheap energy and rapid economic growth, yet when push comes to shove obstacles may disappear. The Cuban example is encouraging, but it is often called into question on the grounds that what worked on an island with an authoritarian government might not work so well in a large, pluralistic democracy such as the United States.

Paul Gilding, in his book The Great Disruption, proposes World War II as an illustration of the crisis-led theory of change: “[O]n the objective facts, Hitler represented a clear and undeniable threat long before action was taken to defeat him,” he writes. “Famously, Churchill and others had long warned of this threat and been largely ignored or even ridiculed. Society remained in denial, preferring not to recognize the threat. This was because denial avoided full acceptance and what that meant—war and a strong change to the status quo. Yet once. . . denial ended, the response was swift and dramatic. Things changed almost overnight. Without the benefit of a retrospective view, it would be much harder to predict when exactly the denial of Hitler’s threat would end. So it’s also hard to predict when the moment will come [when the need for action on climate change is finally recognized], even though in hindsight it will be ‘obvious.’”

Post-Fukushima Japan offers yet another example. In the wake of catastrophic nuclear plant meltdowns, the Japanese people insisted that other reactors be idled; soon only two of the nation’s nuclear reactors were operating. That left Japan with substantially less electricity than normal—enough of a shortfall that economic collapse could have resulted. Instead, businesses and households slashed energy use, driven by a collective ethical imperative. Solar photovoltaic (PV) systems have appeared on rooftops across the nation.

The Kansas town of Greensburg was flattened by a tornado in May 2007, but the residents—rather than drifting away or merely trying to rebuild what they had—decided instead to use insurance and government disaster aid money to build what they are calling “America’s greenest community,” emphasizing energy efficiency and using 100 percent renewable energy.

Economist Milton Friedman may have laid down a manifesto for crisis-led theories of change when he wrote: “Only a crisis—actual or perceived—produces real change. When the crisis occurs, the actions that are taken depend upon the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes politically inevitable.” In this brief passage, Friedman not only sums up the theory nicely, but also forces us to contemplate its dark side. In her 2007 book The Shock Doctrine: The Rise of Disaster Capitalism, Naomi Klein describes how Friedman and other neoliberal economists used crisis after crisis, beginning in the 1970s, as opportunities to undermine democracy and privatize institutions and infrastructure across the world. Somehow, citizens and communities need to be the first to seize the opportunities presented by crisis, to build local, low-carbon production and support infrastructure.

The post-carbon theory of change doesn’t seek to expedite or exacerbate crisis; instead, it encourages building resilience into societal systems in order to minimize the trauma of rapid change. Resilience is often defined as “the ability to absorb shocks, reorganize, and continue functioning.” Shocks are clearly on the way, so we should be doing what we can now to build local inventories and disperse the control points for critical systems. We should neither simply wait around for crisis to hit nor hope for crisis as an opportunity to alter the status quo; rather, we should do as much as possible to conserve ecosystems and relocalize production and trade now, so as to minimize the crisis—which, after all, could potentially prove overwhelming for both humanity and nonhuman nature. If and when crisis arrives, such preparations will be crucial in guiding response efforts and providing a basis for resisting “disaster capitalism.”

What’s the likelihood of success? It depends partly on how we define the term in this context. Many people speak of “solving” problems like climate change, as though we could make a modest investment in new technology and then carry on living essentially as we are. Implicit in the post-carbon crisis theory of change is the understanding that the way we are living now is at the heart of our problem. Success could therefore be better defined in terms of minimizing human suffering and ecological disruption as we adapt toward a very different mode of existence characterized by greatly reduced energy and materials consumption.

Some self-proclaimed “doomers” have concluded that crisis will overwhelm society no matter what we do. Many have joined the “prepper” movement, stockpiling guns and canned goods in hopes of maintaining their own households as the rest of the world comes to resemble Cormac McCarthy’s novel The Road. Other doomers are convinced that human extinction is inevitable and that efforts to prevent that outcome are just so much wasted motion.

I do not share either outlook. Of course there is no guarantee that crisis will open opportunities for sensible adaptation and not simply wallop us, leaving humanity and nature wounded and reeling. But for those who understand what’s coming to simply give up efforts to protect nature and humanity before the going gets tough seems premature at best. There could hardly be more at stake; therefore extraordinary levels of effort and extreme persistence would appear justified if not morally mandatory. The post-carbon crisis theory of change may appear to be a strategy born of desperation. But we should hold open the possibility that it will prove surprisingly apt and effective—to the extent that we have invested our best efforts.

As we build resilience and prepare to make the most of the opportunities that come our way, it’s important that we celebrate the improvements in quality of life that come with reducing our dependency on consumption, advertising, automobiles, and all the other life-smothering accoutrements of our crumbling industrial existence. Let’s also celebrate our adaptability in times of crisis, and continually remind one another that small committed groups sometimes do make history—just as history makes them.

— DECEMBER 2012